BOYHOOD IN AMERICA An Encyclopedia
The American Family The six titles that make up The American Family offer a revitalizing new take on U.S. history, surveying current culture from the perspective of the family and incorporating insights from psychology, sociology, and medicine. Each two-volume, A-to-Z encyclopedia features its own advisory board, editorial slant, and apparatus, including illustrations, bibliography, and index.
Parenthood in America edited by Lawrence Balter, New York University
Adolescence in America edited by Jacqueline V. Lerner, Boston College, and Richard M. Lerner, Tufts University; Jordan W. Finkelstein, Pennsylvania State University, Advisory Editor
Girlhood in America edited by Miriam Forman-Brunell, University of Missouri, Kansas City
Boyhood in America edited by Priscilla Ferguson Clement, Pennsylvania State University, Delaware County, and Jacqueline S. Reinier, California State University, Sacramento
Infancy in America edited by Alice Sterling Honig, Emerita, Syracuse University; Hiram E. Fitzgerald, Michigan State University; and Holly Brophy-Herb, Michigan State University
The Family in America edited by Joseph M. Hawes, University of Memphis, and Elizabeth F. Shores, Little Rock, Arkansas
BOYHOOD IN AMERICA An Encyclopedia
Volume 1 A–K
Priscilla Ferguson Clement, editor Professor of History Pennsylvania State University–Delaware County Media, Pennsylvania
Jacqueline S. Reinier, editor Professor Emerita California State University–Sacramento Sacramento, California foreword by
Elliott West
University of Arkansas Fayetteville, Arkansas
Santa Barbara, California Denver, Colorado Oxford, England
© 2001 by Priscilla Ferguson Clement and Jacqueline S. Reinier
“Masculinities” (page 425) © 2001 by Michael Kimmel

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, except for the inclusion of brief quotations in a review, without prior permission in writing from the publishers.

Library of Congress Cataloging-in-Publication Data
Boyhood in America : an encyclopedia / edited by Priscilla Ferguson Clement, Jacqueline S. Reinier ; foreword by Elliott West.
p. cm. — (The American family)
Includes bibliographical references and index.
ISBN 1-57607-215-0 (hardcover : alk. paper); 1-57607-540-0 (e-book)
1. Boys—United States—Encyclopedias. I. Clement, Priscilla Ferguson, 1942– II. Reinier, Jacqueline S. III. American family (Santa Barbara, Calif.)
HQ775 .B635 2001
305.23—dc21
07 06 05 04 03 02 01
10 9 8 7 6 5 4 3 2 1 (cloth)
ABC-CLIO, Inc. 130 Cremona Drive, P.O. Box 1911 Santa Barbara, California 93116-1911
This book is also available on the World Wide Web as an e-book. Visit www.abc-clio.com for details.
This book is printed on acid-free paper. Manufactured in the United States of America.
To our grandsons: Jackson Jennings Clement, Nataniel Joseph Bael, and Joshua Tasmin Reinier
Advisory Board

Joseph M. Hawes, University of Memphis, Memphis, Tennessee
Michael Kimmel, State University of New York, Stony Brook, New York
Lynn Spigel, University of Southern California, Los Angeles, California
Barrie Thorne, University of California, Berkeley, Berkeley, California
Elliott West, University of Arkansas, Fayetteville, Arkansas
Contents
A-to-Z List of Entries xi
Contributors and Their Entries xv
Foreword xxiii
Preface xxvii
Introduction xxxi
Volume 1: Entries A to K 1
Volume 2: Entries L to Z 413
Bibliography 769
Index 837
About the Editors 847
A-to-Z List of Entries
VOLUME 1, A–K
A
Abuse
Accidents
Adams, John
Adolescence
Adoption
African American Boys
Alger, Horatio
Allowances
Amusement and Theme Parks
Apprenticeship
Artists
Asian American Boys

B
Bar Mitzvah
Baseball
Baseball Cards
Basketball
Bicycles
Big Brothers
Bodies
Books and Reading, 1600s and 1700s
Books and Reading, 1800s
Books and Reading, 1900–1960
Books since 1960
Boxing
Boy Scouts
Boys’ Choruses
Boys Town
Bullying

C
California Missions
Camping
Cars
Chinese American Boys
Circumcision
Civil War
Clothing
Clubs
Comic Books
Competition
Computers
Cowboys

D
Discipline
Disease and Death
Divorce
Douglass, Frederick
Drag Racing

E
Early Republic
Emerson, Ralph Waldo
Emotions

F
Farm Boys
Fathers
Fathers, Adolescent
Films
Fire Companies
Fishing
Football
Foster Care
4-H in the Midwest
Franklin, Benjamin
Fraternities
Frontier Boyhood

G
Gambling
Games
Gangs
Gold Rush
Graffiti
Grandparents
Great Depression
Guns

H
Holidays
Horror Films
Hunting

I
Ice Hockey
Illegal Substances
Immigrants
Indentured Servants
Intelligence Testing

J
Jefferson, Thomas
Jobs in the Seventeenth and Eighteenth Centuries
Jobs in the Nineteenth Century
Jobs in the Twentieth Century
Jokes
Juvenile Courts
Juvenile Delinquency
VOLUME 2, L–Z

L
Learning Disabilities
Left-Wing Education

M
Manners and Gentility
Masculinities
Masturbation
Melodrama
Mexican American Boys
Military Schools
Mothers
Muscular Christianity
Music

N
Nationalism and Boyhood: The “Young America” Movement
Native American Boys
Newsboys

O
Orphanages
Orthodontics

P
Parachurch Ministry
Performers and Actors
Pets
Photographs by Lewis Hine
Placing Out
Plantations
Poliomyelitis
Pornography
Portraiture
Poverty
Preachers in the Early Republic
Prostitution

R
Reformatories, Nineteenth-Century
Reformatories, Twentieth-Century
Revolutionary War
Rock Bands
Roosevelt, Theodore
Runaway Boys

S
Same-Sex Relationships
Schoolbooks
Schools for Boys
Schools, Public
Scientific Reasoning
Sexuality
Sexually Transmitted Diseases
Siblings
Skateboarding
Skiing
Slave Trade
Slavery
Smoking and Drinking
Sports, Colonial Era to 1920
Sports, 1921 to the Present
Suicide
Sunday Schools
Superheroes

T
Teams
Television: Cartoons
Television: Domestic Comedy and Family Drama
Television: Race and Ethnicity
Television: Westerns
Tennis
Theatre
Toys
Transitions (through Adolescence)

V
Vaudeville
Video Games
Violence, History of
Violence, Theories of
Vocational Education

W
Washington, Booker T., and W. E. B. Du Bois
World War II

Y
Young Men’s Christian Association
Young Men’s Hebrew Association
Contributors and Their Entries
Christine Acham University of California–Davis Davis, California Television: Race and Ethnicity
Harry M. Benshoff University of North Texas Denton, Texas Horror Films Television: Westerns
Eric Amsel Weber State University Ogden, Utah Scientific Reasoning
John Bloom Research Consultant Carlisle, Pennsylvania Baseball Cards
Joyce Appleby University of California–Los Angeles Los Angeles, California Early Republic
Linda J. Borish Western Michigan University Kalamazoo, Michigan Sports, Colonial Era to 1920 Sports, 1921 to the Present Young Men’s Hebrew Association
Lonnie Athens Seton Hall University South Orange, New Jersey Violence, Theories of
William L. Borror Media Presbyterian Church Media, Pennsylvania Parachurch Ministry
Joe Austin Bowling Green State University Bowling Green, Ohio Gangs Graffiti
Kay P. Bradford Brigham Young University Provo, Utah Fathers
Peter W. Bardaglio Goucher College Baltimore, Maryland Civil War
Silvia Sara Canetto Colorado State University Fort Collins, Colorado Suicide
E. Wayne Carp Pacific Lutheran University Tacoma, Washington Adoption
Ronald D. Cohen Indiana University–Northwest Gary, Indiana Left-Wing Education
Gregg Lee Carter Bryant College Smithfield, Rhode Island Guns
Phyllis Cole Pennsylvania State University–Delaware County Media, Pennsylvania Emerson, Ralph Waldo
John Chapin Pennsylvania State University–Beaver Monaca, Pennsylvania Illegal Substances Smoking and Drinking
Alice A. Christie Arizona State University–West Phoenix, Arizona Computers
Caroline Cox University of the Pacific Stockton, California Revolutionary War
Gary Cross Pennsylvania State University–University Park University Park, Pennsylvania Toys
Cindy Dell Clark Pennsylvania State University–Delaware County Media, Pennsylvania Holidays
Roger Daniels University of Cincinnati Cincinnati, Ohio Immigrants
Jennifer Clement Foundations, Inc. Moorestown, New Jersey Books since 1960
Richard G. Davies Culver Academies Culver, Indiana Military Schools
Priscilla Ferguson Clement Pennsylvania State University–Delaware County Media, Pennsylvania Big Brothers Boys Town Foster Care Jobs in the Nineteenth Century Orphanages Placing Out Poverty Reformatories, Nineteenth-Century Schools, Public
Steven Deyle University of California–Davis Davis, California Slave Trade
Vincent DiGirolamo Princeton University Princeton, New Jersey Newsboys
Andrew Doyle Winthrop University Rock Hill, South Carolina Competition
Julia A. Graber Teachers College, Columbia University New York, New York Transitions (through Adolescence)
Mark Dyreson Pennsylvania State University–University Park University Park, Pennsylvania Teams
Amy S. Greenberg Pennsylvania State University–University Park University Park, Pennsylvania Fire Companies
Judith Erickson Indiana Youth Institute Indianapolis, Indiana Clubs
Sean Griffin Southern Methodist University Fort Lauderdale, Florida Television: Cartoons Television: Westerns
Janice I. Farkas Pennsylvania State University–Delaware County Media, Pennsylvania Grandparents
Barbara Finkelstein University of Maryland–College Park College Park, Maryland Abuse
Timothy P. Fong California State University–Sacramento Sacramento, California Asian American Boys
Gerald R. Gems North Central College Naperville, Illinois Boxing Football Sports, Colonial Era to 1920 Sports, 1921 to the Present
Steven P. Gietschier The Sporting News St. Louis, Missouri Baseball
Marya Viorst Gwadz National Development and Research Institutes, Inc. New York, New York Prostitution
Lisbeth Haas University of California–Santa Cruz Santa Cruz, California California Missions
Alan J. Hawkins Brigham Young University Provo, Utah Fathers
Richard Hawley University School Hunting Valley, Ohio Schools for Boys
Glenn Hendler University of Notre Dame Notre Dame, Indiana Alger, Horatio
Daniel J. Herman Central Washington University Ellensburg, Washington Hunting
Earnestine Jenkins University of Memphis Memphis, Tennessee African American Boys
Christine Leigh Heyrman University of Delaware Churchville, Maryland Preachers in the Early Republic
Lisa Jett University of Memphis Memphis, Tennessee Schools, Public
Mabel T. Himel University of Memphis Memphis, Tennessee Schools, Public
Erwin V. Johanningmeier University of South Florida Tampa, Florida Intelligence Testing
N. Ray Hiner University of Kansas Lawrence, Kansas Masturbation
Ava F. Kahn Visiting Scholar, California Studies Center, University of California–Berkeley Berkeley, California Bar Mitzvah
Peter C. Holloran Worcester State College Worcester, Massachusetts Runaway Boys
William Indick Cornell University Ithaca, New York Violence, History of
Kathy Merlock Jackson Virginia Wesleyan College Norfolk, Virginia Films
Christina S. Jarvis State University of New York–Fredonia Fredonia, New York Great Depression World War II
Tom Jelke Indiana University Bloomington, Indiana Fraternities
Debra Lynn Kaplan University of Nebraska Lincoln, Nebraska Divorce
Joseph F. Kett University of Virginia Charlottesville, Virginia Adolescence
Michael Kimmel State University of New York–Stony Brook Stony Brook, New York Masculinities
Mark S. Kiselica The College of New Jersey Ewing, New Jersey Fathers, Adolescent
Carolyn J. Lawes Old Dominion University Norfolk, Virginia Mothers
Richard A. Meckel Brown University Providence, Rhode Island Disease and Death
Theodore Lewis University of Minnesota–Twin Cities Twin Cities, Minnesota Vocational Education
Bernard Mergen George Washington University Washington, DC Bicycles Comic Books Games Skiing
Bruce Lindsay University of East Anglia Norwich, England Accidents Circumcision Orthodontics Poliomyelitis
Constance Milbrath University of California–San Francisco San Francisco, California Artists
Philip E. Linhares Oakland Museum of California Oakland, California Cars
Robin D. Mittelstaedt Ohio University Athens, Ohio Skateboarding
David I. Macleod Central Michigan University Mt. Pleasant, Michigan Boy Scouts Camping Young Men’s Christian Association
Douglas Monroy The Colorado College Colorado Springs, Colorado Mexican American Boys
Faith Markle Cornell University Ithaca, New York Violence, History of
Kevin Muller University of California–Berkeley Berkeley, California Portraiture
J. Kelly McCoy Brigham Young University Provo, Utah Siblings
Gail S. Murray Rhodes College Memphis, Tennessee Books and Reading, 1600s and 1700s Books and Reading, 1800s Books and Reading, 1900–1960
Jay Mechling University of California–Davis Davis, California Jokes
Gene Myers Western Washington University Bellingham, Washington Pets
Murry Nelson Pennsylvania State University–University Park University Park, Pennsylvania Basketball
Douglas Neslund Independent Scholar Sylmar, California Boys’ Choruses
Robert C. Post Massachusetts Institute of Technology Cambridge, Massachusetts Drag Racing
Heather Munro Prescott Central Connecticut State University New Britain, Connecticut Sexually Transmitted Diseases
Thomas Newkirk University of New Hampshire Durham, New Hampshire Superheroes
Roblyn Rawlins State University of New York–Stony Brook Stony Brook, New York Discipline
Steven Noll University of Florida Gainesville, Florida Learning Disabilities
Harold Ray, Emeritus Western Michigan University Kalamazoo, Michigan Tennis
Jamie O’Boyle Cultural Studies and Analysis Philadelphia, Pennsylvania Amusement and Theme Parks
Jacqueline S. Reinier, Emerita California State University–Sacramento Sacramento, California Apprenticeship Douglass, Frederick Franklin, Benjamin Manners and Gentility Plantations Schools, Public Slavery Sunday Schools
Sabrina Wulff Pabilonia University of Washington Redmond, Washington Allowances
Jo B. Paoletti University of Maryland–College Park College Park, Maryland Clothing
Leslie Pasternack The University of Texas–Austin Austin, Texas Vaudeville
Bruce Pegg Syracuse University Syracuse, New York Rock Bands
Theresa Richardson University of South Florida Tampa, Florida Intelligence Testing
Pamela Riney-Kehrberg Iowa State University Ames, Iowa Farm Boys
Paul Ringel Brandeis University Waltham, Massachusetts Reformatories, Twentieth-Century
Brian Roberts University of Northern Iowa Cedar Falls, Iowa Gold Rush Music
Monica Rodriguez Sexuality Information and Education Council of the United States (SIECUS) New York, New York Sexuality
Glenn Ian Roisman University of Minnesota Minneapolis, Minnesota Jobs in the Twentieth Century
Patrick J. Ryan University of Texas–Dallas Richardson, Texas Clubs
Nicholas Sammond University of California–Santa Cruz Santa Cruz, California Television: Domestic Comedy and Family Drama
Ritch C. Savin-Williams Cornell University Ithaca, New York Same-Sex Relationships
Carol Schafer Pennsylvania State University–Beaver Monaca, Pennsylvania Theatre
Constance B. Schulz The University of South Carolina Columbia, South Carolina Adams, John Jefferson, Thomas Nationalism and Boyhood: The “Young America” Movement Photographs by Lewis Hine
Dorothy A. Schwieder Iowa State University Ames, Iowa 4-H in the Midwest
David Setran Wheaton College Wheaton, Illinois Muscular Christianity
Nancy Shoemaker University of Connecticut Storrs, Connecticut Native American Boys
Carl-Petter Sjovold Sacramento City College Sacramento, California Fishing
James F. Smith Pennsylvania State University–Abington Abington, Pennsylvania Gambling
Mia M. Spangenberg ECPAT-USA New York, New York Pornography
Peter N. Stearns George Mason University Fairfax, Virginia Emotions
Nan D. Stein Wellesley College Center for Research on Women Wellesley, Massachusetts Bullying
Sharon Braslaw Sundue Harvard University Cambridge, Massachusetts Indentured Servants Jobs in the Seventeenth and Eighteenth Centuries
David S. Tanenhaus University of Nevada–Las Vegas Las Vegas, Nevada Juvenile Courts
Robert Wagner Retired Journalist Portage, Michigan Tennis
Barbara M. Waldinger Marymount Manhattan College New York, New York Hofstra University Hempstead, New York Melodrama
Kevin B. Wamsley The University of Western Ontario London, Ontario, Canada Ice Hockey
William N. Tilchin Boston University Boston, Massachusetts Roosevelt, Theodore
Elliott West University of Arkansas Fayetteville, Arkansas Cowboys Frontier Boyhood
James Tobias University of Southern California Los Angeles, California Video Games
Chris Wienke University of Pittsburgh Pittsburgh, Pennsylvania Bodies
Richard L. Venezky University of Delaware Newark, Delaware Organisation for Economic Co-operation and Development Paris, France Schoolbooks
Karl Tilman Winkler Independent Scholar Strasbourg, France Juvenile Delinquency
Shauna Vey City University of New York New York, New York Performers and Actors
Joseph A. Vorrasi Cornell University Ithaca, New York Violence, History of
Cary D. Wintz Texas Southern University Houston, Texas Washington, Booker T., and W. E. B. Du Bois
Alfred Yee California State University–Sacramento Sacramento, California Chinese American Boys
Foreword
Boys will be boys. But what is that? Adults who spend much time around children often face a contradiction. We have memories of our own early years, intimate and vivid. We mull them over and sometimes bestow on the rising generation the reflective wisdom we squeeze from them. Over time these memories become an important part of who we think we are; they give us a sense of footing in a shifty world. And then, while we’re eating dinner or sitting on the porch or driving to the store, a child or grandchild will say something so utterly strange, coming from such a bizarrely tilted angle, that it stops us cold. To the child it’s an offhand comment on the ordinary. To us it’s a whack in the face, sudden and irrefutable proof that the very young inhabit a place we cannot go and that our claims on our own childhoods are not nearly so firm as we think.

The challenge of historians is to imagine their way into a distant time. This involves re-creating not merely events and settings but states of mind. It’s not easy. A Hopi farmer of the American Southwest in the sixteenth century, a New England merchant in the age of clipper ships, a teacher in her classroom in the 1920s—they lived by assumptions and values much different from ours. And yet the gap between us and them sometimes seems narrow compared to the one that separates us from the two- and three-foot aliens running around our feet. If the past is another country, childhood is another planet.

That distance—or rather our adult recognition of it—has been the source of a lot of humor and poignancy. It’s at the heart of classics like Peter Pan. The appeal of Never-Never Land is not so much an escape from aging and from the responsibilities of maturity. I suspect it’s not even the lure of a life of perpetual play. We are drawn instead by the tantalizing paradox of remembering that as children we were fundamentally different and yet knowing that we cannot remember exactly how we were different. The movie Big reverses the journey to make the same point. A boy is transported from early puberty into the adult body of Tom Hanks and the grownup world of jobs, bureaucratic politics, and sexual relations. It is full of honest laughs: Hanks at a party trying to eat tiny pickled corn as he would a full-sized ear, unintentional double-entendres (when he asks a woman if she wants to be on top or on bottom, he’s talking about bunk beds), and what appears to be a man in his thirties running in hyperdrive around a toy store. At its heart, however, is that sad, sweet feeling of disconnection from our own beginnings. As a rising executive in a toy company, the boy/man continuously shows how out-of-reach his colleagues are from
their youthful customers and what moves them. They just don’t get it. Neither do we, and it is our awareness of that alienation, always in tension with what we think we were like and with our love and concern for our own children, that gives the study of boys and girls its special fascination and power. That, and the knowledge that children have had a far greater influence than we like to recognize. Part of our growing away from our early years often involves idealizing childhood as a time when the young live essentially apart, and so have little effect on life around them. Yet even a glance around us shows that is not the case now, and common sense tells us it never has been. If nothing else, boys and girls have played crucial roles in their families’ economic lives for most of history, and today their economic clout as consumers is enormous. Children have shaped their communities for good and ill and in many ways for most of our history, especially before the fairly recent trend of segregating children from the society of adults. Our adult fascination with them, or rather with what we make them into in our memories, has been the stuff of fiction, art, films, and so much more in our popular culture.

This remarkable work of reference is more than just a collection of well-researched, detailed, and thoughtful articles on various aspects of American boys and boyhood. It is certainly that. The editors have made every effort to recruit its contributors from the leading scholars of American childhood and the culture of youth. The entries reflect the most recent research, and each ends with a list of the best sources for pursuing its topic further. Compounding these virtues, however, is the exceptional range of the subjects between the covers of Boyhood in
America: An Encyclopedia. With such an imaginative range of topics, it can be read as a multifaceted exploration of children’s influence on American culture and life, of adult perceptions of childhood and of that mental and emotional world of the boy that we grownups recall but cannot truly remember.

In that last regard, undertaking that formidable task of bridging the gap between adulthood and boyhood, or at least learning something from close and careful observation, consider for instance the entry on what is for most of us one of the least pleasant memories of youth—bullying. It turns out there is considerable research with some surprises on the topic. Drop the image of the socially marginal lout. Bullies usually are well connected and popular. Pushing people around seems to enhance status, and indeed most bullying takes place before bystanders, although intervention is rare. What this suggests about youthful society, or for that matter human nature, is a bit sobering.

Both in our memories and in media portrayals, bullies flaunt their physical advantage, which raises the question of body types and their idealization. American boys have admired the mesomorphic (muscular) over the ectomorphic (skinny) or endomorphic (fat). No surprise there, nor in the studies showing that mesomorphs tend to get preferential treatment in school and higher pay when they get out. But this research does encourage us to consider the implications, and when dovetailed with broader patterns of social change, some interesting connections crop up. Changes in the late nineteenth century challenged the tight hold of the masculine ideal, among them the shift from relatively independent farmwork to the urban factory and the assertive push of the new feminism. Mesomorphs arise!
It was then we see the “muscular Christianity” movement, the organization of contact sports (notably football), and the appearance of groups like the Boy Scouts with its themes of fitness and cultivation of muscularity.

According to a recent study, grownups laugh an average of fifteen times a day. Children laugh about four hundred times. The statistic feeds our romanticized view of boyhood as a far happier time, but it also opens the issue of what has amused boys and why. Several entries try to tackle the question, and they all pull us back toward what is ultimately unreachable, the mind of the boy. But play is also a kind of middle terrain between the two worlds. Children are never more independent than when devising their own games and acting out their fantasies, yet parents always have tried to shape and direct their children by manipulating their playtime. The same is true of sports, an extension of play. On vacant lots and streets boys play their own pickup games. On well-groomed gridirons and diamonds and soccer fields the generations overlap. Boys compete, grownups coach.

Entries on aspects of play are part of another thrust of this encyclopedia. They consider the many interactions between boys and adults. The entry on toys traces the spectacular expansion of the industry, especially in the twentieth century, a continuing negotiation between what boys want and what adults think they should have. As the buying power of the young has increased, so has their influence on the shape and messages of playthings. In the 1960s parents may have steered their sons away from war toys and soldierish “action figures,” but the mesomorphic male fantasies still carried the day, from Mutant Ninja Turtles and Power Rangers to Masters of the Universe.
Team sports show us another angle on the same phenomenon. Always there have been team games among all of the hundreds of cultures that have called America home, and most have been passed from one generation of boyhood to the next. Conceived and played entirely by the young, they have been the means of forming youthful communities and establishing pecking orders. Then, as with toys, the transition to modern America brought a vigorous adult invasion of boyhood’s play. “Organized” sports came to mean contests organized by adults and at least partly for adult purposes. They reflected parental anxiety about rearing sons in an urban industrial culture and about America’s new status within the community of nations. Boys still played baseball and other team sports as an outlet for aggression, to test themselves against others, for the joy of physical contact and exertion, and to try to enhance their status. Now, however, their elders meant to use sports to usher boys into the life of young men in a new America. Suspicious of cities, fathers meant organized athletics to keep their sons healthy and instill old values of farm and town, to coach the cooperative virtues of business success, and to keep them busy and out of the city’s various moral sinkholes. How well this worked, of course, is another question. Other entries lay out in some detail boys’ enthusiastic pursuit of the pleasures of the streets, arcades, and seedier entertainments of the early twentieth century. Nowhere is this pull-and-tug better demonstrated than in comic books. Adults draw and produce them, but only if the books jibe with what boys want and daydream will they bring a profit. Their enormous popularity also makes them natural targets for nervous and protective parents. The resulting triangulation
among boys, publishers, and parents has made for an interesting story, with iconoclastic and rebellious comic characters expressing youthful rebellion and not infrequent movements to regulate content and to ban the most offensive sorts. And in time comics have become icons of time past, collected “like the relics of saints” by men reaching for irretrievable boyhood. Superheroes, born of comic books, both acted out boys’ fantasies and answered the feelings of powerlessness common to both sexes. Superman, Spiderman, Batman, and others invariably battled for the good, but remarkably they too were sometimes criticized as subversive. To a leading critic of the 1940s Superman encouraged racism (he was from a “super-race,” after all), Batman and Robin were chummier than two males should be, and Wonder Woman was a man-hating lesbian. One form of expression is literally in the hands of boys and uncensored by adults—graffiti. In its entry you will learn that this ancient and universal phenomenon, found from Mayan ruins to China’s Great Wall, has been studied by scholars for nearly two hundred years. There are even subfields; specialists in the scratchings in public toilets deal in “latrinalia.” Authorities carefully categorize boys’ graffiti: names and initials, amorous, obscene and erotic, bits of wisdom and rhymes, and statements of protest, with gang symbols a recent addition. Here is one of many entries that treat seriously what is usually dismissed as trivial. Still other entries take one step further the exploration into negotiated terrain between boys and adults. Those on boy actors and entertainers and youthful characters in films, for instance, concern an area of employment and young males’ impact on an important industry, the evolving preferences of an adult and
teenage public in how boys are portrayed, and finally one of the uglier instances of abuse and exploitation of the young. From the sale and virtual enslavement of boys by early circuses, to the blighted childhoods of boy stars in an expanding entertainment industry, to the well-publicized cases of recent film and television figures overworked and milked of fortunes by parents and managers, the story of boys in show business ironically illustrates a side of child life rarely portrayed in movies or on the stage. Reading those articles, one might easily be drawn toward others opening onto issues of boys in the workforce, of mistreatment in the least and most privileged families, of boys’ roles as consumers as well as purveyors of popular culture, and of the changing images of young males in art, and each of those might branch in its own directions. The encyclopedia is a fine browsing ground and an enticement into the collective origins of every American man. It is also, of course, a tool of more focused research. Students from the middle grades to university faculties can pursue particular interests and find their way to other works. Here the range of topics makes this reference work useful beyond American childhood to the study of American social, cultural, political, medical, and spiritual history, merely to begin the list. Whether opening Boyhood in America: An Encyclopedia out of scholarly purpose or simple curiosity, readers will be rewarded with a body of fascinating information, a wealth of insights, and at least several steps toward what continues to draw us but will always elude us, the world of the boy. —Elliott West Fayetteville, Arkansas
Preface
An encyclopedia that dares to cover a topic as comprehensive as American boyhood must first define its terms. Boyhood is a stage of the life cycle in terms of time, beginning at birth and continuing until the individual reaches some definition of self-sufficiency. Throughout American history and among the various groups that have participated in American culture, the length of boyhood has varied greatly. While some young males have been thrust into adult responsibilities and roles quite early in life, others have taken longer to assume cultural definitions of maturity. If we are to set an age group covered by the authors contributing to this encyclopedia, however, it would be from birth through the teenage years until about the age of twenty.

Gender is a social construct, or how various aspects of American culture have viewed maleness (or femaleness), which begins even before birth and continues through the life cycle. How the infant male child becomes a boy is a social process, which varies according to race, ethnicity, class, and region in the various time periods of American history. While articles in the encyclopedia focus on adult attitudes toward boys, they also discuss behavior, which in real life does not always reflect an individual’s attitudes. Parents and educators, for example, may proclaim theories of childrearing that they do not necessarily follow under the pressures and constraints of daily life. And essential to understanding boyhood are the experience and perceptions of boys themselves, as they work, play, interact with friends and family, and try to make sense of the world around them.

Although American popular culture has focused on male youth, particularly in times of rapid social change, it is surprising how little scholarly work has been done on boys. Both of the editors of these volumes have written books on the history of childhood, Jacqueline S. Reinier on the first half of the nineteenth century and Priscilla Ferguson Clement on the second half.1 But we both have spent our careers as teachers of women’s history, coming to understand the social construction of gender through the perceptions and experiences of women and girls. Academic programs in women’s studies have flourished since the early 1970s; a recent Web search reveals more than five hundred such programs in various colleges and universities throughout the United States. Women’s studies has made gender visible, has uncovered the experiences of ordinary as well as exceptional women, and has come to portray women and girls as agents in their own lives. Yet the body of this work tends to view men generically. Not until much
more recently, in the late 1980s and 1990s, have courses and programs in men’s studies appeared, which seek to make masculinity visible, and to study men, not primarily as historical actors, but in their lives as men. Masculinities studies focus on the social construction of the ideology of masculinity and how it has changed over time. Some scholars who look at masculinity have been interested in its early development, while others have been concerned with the effect of the presence of fathers on their sons. But surprisingly little research has been conducted on the actual experience of boys in the social contexts that have been important to them, for example, playing with toys, participating in organizations such as the Boy Scouts, competing in sports, or, recently, learning to negotiate in cyberspace.2 This encyclopedia, then, is an effort to focus on boys. Because we are both historians, we have selected topics that reflect the scope of American history, from the early seventeenth century to the present, with careful attention to the differences in race, ethnicity, class, and region that have comprised our multicultural society. Certain time periods in the history of boyhood, such as the experience of boys in the Civil War, the Gold Rush, or World War II, have been explored. Other articles deal with individuals whose boyhoods are well documented and who are typical in one way or another of growing up in a particular time period. Boys at work and other economic issues are discussed, as well as social welfare and juvenile delinquency. Some of the topics deal with the intersection of biology and culture. While biological factors in the development of boys have remained fairly consistent over time, significant social and cultural change has occurred concerning
health, mortality, and physical and emotional issues of particular importance to boys, such as masturbation, sexuality, or the way that boys have viewed and experienced their bodies. Other topics reveal attitudes and behavior of adults toward boys, for example, schools and schoolbooks, religion, films and television programs, competitive sports, and the increasing organization by adults of boys’ free time. Contemporary issues that adults worry about have been addressed, for example, drinking and smoking, guns, and substance abuse. But special efforts have been made to probe the experience of boys themselves, not only in their family life but also through exploration of worlds that boys inhabit in their leisure time. As artists, hunters, fishermen, drag racers, musicians, or gang members, American boys have been agents in actively shaping their own lives, and in turn, the nation’s culture. For each of over 150 topics we have been fortunate in finding distinguished and knowledgeable authors, all of whom are experts in their own fields of history or the social sciences. Occasionally an article has brought up the fascinating topic of boyhood as an analogy for the United States itself. Especially in the nineteenth century, Americans celebrated the expressive exuberance and physicality of boyhood. The chance of youth to begin anew and not be what one’s parents were symbolized the hope for democratic opportunity. Boys could be admired for seizing the initiative and engaging in enterprising activity. But the destructive potential of male youth has elicited fear in American culture, especially in uncertain times when the nation’s values seemed at stake. Boys have been chastised for their constant competition and their measuring of masculinity
with and against each other. Because they have been viewed as the nation’s future, their actions have been given consequences sometimes out of proportion to their actual impact. Not only in recent years, but also throughout the nation’s history, their behavior has symbolized a dangerous force that needs to be contained, and often has been identified as what is wrong with the United States.

In editing this encyclopedia we have thoroughly enjoyed working with our many authors, some of whom are old friends or colleagues and others whom we have met for the first time. We have approved all of the articles and have edited some of them; however, we are fully aware that the responsibility for the final product is our own. We are grateful for the aid of our advisory board in clarifying concepts and locating participants. We have benefited greatly from the fine work of Marie Ellen Larcada, editor of the six encyclopedias in the American Family series. We could not have completed the project without the cheerful and meticulous work of our managing editors, Jennifer Loehr and Karna Hughes. We are especially grateful to Sara Whildin and Susan Ware, librarians at Penn State Delaware County Campus, who tirelessly tracked down articles and books on
boys and uncovered the addresses of many potential contributors to the encyclopedia. And we appreciate the help of our assistants, Lynne Goodman and Richard Farra, who contacted authors for corrections in the final stages of production. Finally, because we have learned more about American boys than we ever thought possible, we would like to dedicate this encyclopedia to the next generation—our grandsons, Jackson Clement, Joe Bael, and Joshua Reinier. —Priscilla Ferguson Clement Media, Pennsylvania —Jacqueline S. Reinier Berkeley, California
Notes

1. Jacqueline S. Reinier, From Virtue to Character: American Childhood, 1775–1850 (New York: Twayne Publishers, 1996); Priscilla Ferguson Clement, Growing Pains: Children in the Industrial Age, 1850–1890 (New York: Twayne Publishers, 1997).
2. For this discussion of masculinities, the editors are indebted to Michael S. Kimmel, author of Manhood in America: A Cultural History (New York: The Free Press, 1996) and The Gendered Society (New York: Oxford University Press, 2000).
Introduction
The entries in this encyclopedia cover a wide range of topics, different periods in American history, and various groups of boys. Yet it is possible to establish certain patterns in the history of boyhood by time period and by topic. The experience of white boys will be examined first and then that of Native American, African American, Asian American, and Hispanic American boys.

In the seventeenth and eighteenth centuries adults viewed boys as a part of the general child population until they reached the age of four or five years. Before that time parents dressed boys and girls alike, in long gowns and caps, and then gowns rather similar to those of adult women. With few toys available for purchase, boys made their own riding sticks or balls, and kept small animals and birds as pets. Those adults who could purchase toys were more likely to do so for their male than female children. Boys and girls both read books that taught moral values; not until the 1740s did John Newbery begin to write books that were designed to amuse as well as to instruct children. By the eighteenth century, sensitivity was an admired trait, and boys were not condemned for crying. And, throughout the colonial period, competition was a contested value. New England Puritans frowned on competitive behavior, which contradicted their strong sense of community. Dutch and English settlers in New York, however, who early established a capitalistic society, valued competition more. Colonial boys began to work at early ages as members of farming families in the North and in the South. About the age of fourteen, some were apprenticed to a master to learn a craft. And by the age of eighteen, most male youths were beginning to assume adult roles.

In the early nineteenth century, parents continued to dress very young boys and girls alike, in loose gowns and frocks, soft shoes, and short hair with bangs. Distinct gender roles, however, took shape after the American Revolution. Portraits of little boys in frocks make a point to depict them as masculine, by including such toys as rocking horses, hammers, and pony whips. Girls of the same era, if posed with any toy, held dolls. Boys of the age of five or six no longer wore clothes resembling those of male adults, but were dressed in short jackets or soldiers’ or sailors’ uniforms. In predominantly rural America, most boys still lived and worked on farms. As capitalism transformed the northeastern economy, however, some found new opportunities as clerks, teachers, or lawyers. Those who moved west with their families learned independence and self-reliance, herding cattle on their own on horseback or traveling long distances
on errands for their parents. Reading material available to boys emphasized character traits adults expected—hard work, thrift, honesty, and responsibility—and rationality rather than fantasy. Competition in boys became a more universally accepted virtue. Not only did an industrializing economy place a premium on competition, but also many venues opened to boys, such as membership in local fire companies, where contests with competing groups were standard practice. The anger generated by competition was an acceptable emotion if exhibited in a public arena, but was not tolerated in middle-class families or when directed against women and girls. Parents expected boys to master fear and demonstrate courage, and competitive sports became increasingly popular. By 1861, boys confronted their fears directly during the Civil War when some served as soldiers or drummer boys. More coped with the loss of loved ones or, if they lived in the South, with fear of invasion and forced movement from their homes.

By the late nineteenth century the experience of boys from different social classes diverged more widely than before. Boys from middle- and upper-class families enjoyed a prolonged boyhood. Dressed, as were girls, in frocks and skirts until the age of three, they graduated to short pants called knickerbockers, which became the standard male dress until the teen years. Not until they neared maturity and entered the workforce did male youths wear long pants. Boys played with toy tools, which prepared them for adult work. Yet they also enjoyed plenty of free time to play on their own outdoors. They formed their own sports teams, and read books that emphasized pure pleasure, adventure, or boyhood pranks. Working-class boys stayed in school a shorter period of
time and went to work sooner than did their peers from affluent families. Nevertheless, they could escape work routines by using some of their wages to participate in an expanding popular culture, attending melodramas in city theaters and visiting amusement parks. They also bought dime novels with lurid plots of crime and violence. By the end of the century, adults expressed their concern about both working- and middle-class youth. Some reformers established playgrounds and clubs to entice working-class boys away from what they considered dangerous and immoral popular amusements. Others worried that the influence of mothers and female public school and Sunday school teachers made middleclass boys passive and feminized. Adult males pressed for organized activities through groups such as the YMCA, churches, the movement known as muscular Christianity, or sports teams. Increasingly the time of boys was managed more by adult males and less by boys themselves. Men and boys, perhaps in response to the drive for female suffrage and more educational and work opportunities for women, asserted themselves to make gender differences more pronounced. Boys worked hard to acquire physical strength, engaging in strenuous bodybuilding and sports programs. By the early twentieth century children were viewed as distinctly gendered from birth, and parents no longer dressed young boys and girls alike. The short pants worn until adolescence disappeared, and boys wore long pants regardless of their age. In other ways, however, the experience of boys and girls merged. Both sexes were expected to hide their anger, and parents helped boys as well as girls to manage their fears, no longer requiring that boys do so on their own.
With advances in technology, toy cars, airplanes, radios, and cameras were marketed directly to boys. Middle-class boys who received allowances and working-class boys who earned wages could become consumers and purchase small toys on their own. Books such as the Tom Swift and the Hardy Boys series encouraged this interest of boys in technology. Adults continued to manage boys’ time through organizations such as the YMCA and the Boy Scouts. Boys from all social backgrounds attended school longer, usually continuing through high school, where they participated in a new youth peer culture including sports and dating. With the exception of children from immigrant families who entered the workforce in their teens, fewer boys went to work full-time at a young age. Even during hard times in the 1930s, most boys remained enrolled in school, although many helped out their families with part-time after-school employment. Some older boys did leave home to ride freight trains in search of work. During World War II, boys collected scrap metal, planted victory gardens, bought war bonds, and prepared themselves physically for wartime service. They read comic books depicting superheroes using technology to overcome evil, and admired comic book transformations from weak and ordinary to strong and powerful, as many of them sought to transform themselves.

At the beginning of the twenty-first century, boys’ dress remains gendered from an early age, although since the 1960s the clothes of young boys have differed from those of teens. Boys continue to be consumers, buying many of their own toys. Products, such as baseball cards, which are marketed directly to them, encourage capitalistic behavior, such as deciding which to save and which
to trade. Most boys now attend school through high school and many complete their education in college. Their lives continue to be managed closely by parents and other adults, who supervise their play, clubs, and sports activities. Contemporary boys are encouraged to mask emotion in order to prepare for careers in the corporate or service worlds. Yet much of their experience is no longer connected to the work life of their fathers. Fantasy figures with no connection to real life have become increasingly popular as toys. Boys seek escape through playing violent video games, reading superhero comic books, and watching horror movies. Although counterculture and feminist values of the 1960s and 1970s have led parents to encourage boys to be cooperative, competition among them has not disappeared. And, as women have increased their workforce participation and have demanded recognition and rights, the emphasis on physical strength to distinguish masculinity has increased. Today adults worry about “problem” behaviors in boys—suicide, accidents, prostitution, and gang membership—but perhaps fail to celebrate positive qualities that have been admired in the past, such as exuberance, freedom, and hopefulness. Although minority youths have participated in this more general history of boyhood, their experience has also differed in significant ways. Boys from diverse racial backgrounds have developed their own boy cultures. Although members of mainstream culture have sometimes feared and often disparaged minority boys, they also have admired and even copied cultural patterns and qualities that are distinctive among these groups. Most Native American boys did not have any formal schooling until the end of the nineteenth century, but acquired
their education from parents and other relatives. They also passed through various age-graded rituals until they reached maturity, usually in their teen years. In the late nineteenth century, the United States government determined to bring Native American children into the mainstream by forcing them to break with their families and traditional communities. Native American boys and girls were removed to boarding schools and educated there for many years. Yet, even though they were separated from their families and forced to learn a new language and history, children in these schools developed their own distinct culture. By the early twentieth century, traits that Native Americans inculcated in boys, such as hunting and tracking skills and the ability to manage alone in the wilderness, were qualities admired by whites and incorporated into the activities of organizations such as the Boy Scouts. Most African American boys of the eighteenth and nineteenth centuries experienced life as slaves and never enjoyed the freedom and independence of their white contemporaries. Slave boys went to work early in life and had to control aggressiveness in order to survive. While righteous anger was acceptable among white youth, it was not among slave boys. Slave boys played their own games and made their own toys, but few of them had the opportunity for education offered white boys. While many authors wrote books for white readers, virtually none were published that featured young, admirable African American characters until the civil rights movement of the 1960s. The Civil War, which brought freedom to former slaves of all ages, was immensely important to African American boys. After emancipation, most of
them continued to live in the South where they went to work early and attended schools inferior to those of whites. Clubs and sports teams that welcomed whites did not admit blacks. As African American families migrated north during and after both world wars, black boys had more equal access to education and participated in popular culture. The Civil Rights Movement has brought more equal access to schools, clubs, and sporting activities in both the North and South. Yet African American boy culture has remained distinctive. Black youths helped initiate and lead the Civil Rights Movement, and some of them joined the Black Panthers. In the late twentieth and early twenty-first centuries, African American boys have been active in developing hip-hop culture, which, like the Civil Rights Movement, has been admired by the white majority.

In recent years the numbers of Asian American boys in the United States have increased greatly. Representing people whose ancestry originates in many countries, Asian American boys, like all young immigrants, have negotiated between the dominant mainstream culture and that of their parents. Stereotyped as good students of the model minority, Asian American boys have also adopted goals of the Civil Rights Movement and aspects of African American youth culture and slang. Representative of the new generation are multiracial individuals such as the golf superstar Tiger Woods, who claims Native American, African American, and white ancestry, as well as Thai and Chinese.

Hispanic boys, representing various Central and South American countries, now belong to the largest minority population in the United States. Those from Mexico, whether descendants of the Spanish and Mexican eras of the Southwest or children of more recent immigrants, experienced many decades of irregular schooling, poverty, and discrimination. In agricultural regions of Texas and California, boys as young as five contributed to family income, picking crops with their families. In urban areas they were more likely to attend school and pick up odd jobs, engaging in their favorite pastimes, movies and sports. Mexican American boys have been integrated into mainstream culture through Hollywood, baseball, and education. Yet in barrios and schools, car and sports clubs, cliques and gangs, they have also participated in the creation of a distinctive regional transborder culture, not only in the Southwest, but also in other areas of the United States to which their families travel.

To conclude, two broad trends in the history of boys of all backgrounds throughout American history have become apparent. First, boys have been encouraged to be both followers and breakers of rules. Parents have encouraged boys to follow family rules and to participate in games and sports where adhering to rules closely is required. Yet they also have recognized that boys, who will grow up, raise families, and participate in the nation’s political and economic life, will need to exercise independence, which may involve some rule breaking. Rules that parents tolerate breaking, however, have not necessarily been the rules that boys themselves have chosen to break. Boys have chafed under adult management and have rebelled through stealing,
drinking to excess, or taking drugs. More boys than girls have been caught up in the juvenile justice system. Second, boys have engaged in violent behavior, which adults have expected, sometimes admired, and often condemned. Few adults have sanctioned the extreme behavior to which some boys have been socialized by their authority figures, yet lower levels of violence have been approved as well as disapproved. Parents have expected boys to be physically tough, and to prove that toughness in contests with nature, fights with other boys, or sporting events. Violence by white boys against slaves or ethnic minorities in times of war has been accepted. Most parents do not approve, nevertheless, of violence that results in serious physical injury. As boys have tried to prove toughness and physical prowess in violent encounters, both on and off the playing field, they have not always been able to walk that fine line between acceptable and unacceptable levels of violent action. Yet, in all periods of American history boys of all backgrounds have proved to be amazingly creative and resilient. Clearing land and minding livestock on farms, mastering new technologies faster than their masters as apprentices, working in mines and factories, hawking newspapers, excelling in school, and negotiating in cyberspace, they have contributed enormously to the development of the nation. In many areas, but especially music, games, sports, cars, and computers, boy culture has indeed become American culture.
A

Abuse

[Illustration: A Philip de Bay illustration of a priest spanking a boy, from a 1902 publication (Historical Picture Archive/Corbis)]

Abuse against children and youth is nothing new in human history. Indeed, infanticide, child beating, economic exploitation, neglect, abandonment, prostitution, child barter, “baby farming,” and enslavement are all time-honored traditions in human history and in the lives of boys and girls alike. What is less time-honored is a vision of children as vulnerable and in need of protection, of families as protective enclaves and tutorial institutions, of government agencies as guarantors of children’s well-being, of workplaces as potentially exploitive sites, of relatives and neighborhoods as sources of danger, of schools as appropriate sites of socialization for the young, of systematic punishment as the just due of perpetrators of child abuse, and of boys and men as distinct classes of perpetrators and victims. Historically viewed, child abuse has befallen boys and girls alike, but the forms that it has taken have been deeply gendered. This entry will focus specifically on boy-centered features of child abuse as they have evolved over time in the United States.

The concept of child abuse, like the concept of childhood vulnerability, of men as keepers of communal bonds, and of boys as beings in need of self-restraint, is relatively recent. The appearance of the concept of child maltreatment in the middle decades of the seventeenth century reflected a growing awareness of boys as an endangered species in need of regulation and restraint. A vision of child maltreatment has inspired serial generations of social reformers to generate policies designed to exorcise violence, sexual assault, and physical brutality from the arsenal of permissible actions taken by boys and against boys and girls alike. Ironically, the discovery of abuse proceeded simultaneously with the discovery of boys and young men as perpetrators as well as victims. It has led reformers to establish principles of custody and oversight designed to protect and to incarcerate young men and often to impose punishments rather than attend systematically to a continuing reconstruction of their environments—a situation that has helped to sustain child abuse as a cultural practice for boys and girls alike. What follows is a minihistory of the discovery of child abuse in the United States, the evolution of approaches to prevention and punishment, and the conditions that sustain it as a cultural practice—most especially as it is revealed with regard to boys and young men.

The attempt to exorcise violence from the lives of boys and young men is expressed in a series of social discoveries that, over the course of time, have revealed an array of once-invisible forms of
1
2
Abuse
A Philip de Bay illustration of a priest spanking a boy, from a 1902 publication (Historical Picture Archive/Corbis)
assault against children. There was the discovery among seventeenth-century theologians of the need for systematic education, age-appropriate discipline for young men, restraints on sexual expression and physical violence, and a focus on what Anthony Rotundo (1993) has called a communal concept of masculinity. There was the discovery of boyhood innocence among Romantic poets and Transcendental philosophers who, in the early decades of the nineteenth century, projected childhood as a divine rather than a corrupt condition and called, among other things, for a physical and moral liberation of children’s minds and bodies and a modicum of self-discipline and self-government from its boys. There
was the discovery by moral reformers of the early nineteenth century of street children and beggars as a class of people in need of protection and benevolent "moral" tutors and of child neglect as a deplorable and even dangerous social condition, most especially when it left unruly boys on their own or rendered them vulnerable to bullying and exploitation. There was the discovery of child labor as a form of assault that fell with violent force on young boys. There was the invention of moral persuasion as a more humanitarian approach than physical coercion or corporal punishment to the regulation of otherwise ungovernable boys. In the late nineteenth century and early decades of the twentieth century, violence against boys and girls was revealed with the discovery of adolescence as a vulnerable and bombastic period of youth development, juvenile delinquency as a specialized category of curable criminality, boys as potential perpetrators as well as victims in need simultaneously of protection and regulation, and battered children as a special class of children in need of shelter. Over the course of the twentieth century, attempts to expand concepts of abuse and root out violence against children have taken new forms as scholars, theologians, journalists, and educators discovered and revealed the existence of long traditions of incest, sexual abuse, labor exploitation, and child beating in families across the social spectrum and across centuries. Among these reformers were boy workers who redefined concepts of manhood in the late nineteenth and early twentieth centuries and elevated combativeness, aggression, and physical prowess as important norms and standards of masculinity. This group of reformers, while deploring violence against women, nonetheless took patterns of assault by men on boys and boys on boys as natural consequences of their manliness. Recently, with the public outing of date rape, child pornography, prostitution, and pedophilia and the discovery of homophobia, new and once-invisible and silenced realities have become visible. So too has the historical specificity of the belief in inherent male aggressiveness. Through the efforts of contemporary child advocates, the omnipresence of institutional punishments such as physical beatings, sexual violence, and the harsh treatment of incarcerated juveniles has claimed the attention of the public, as have statistics documenting the existence of forced sex and incest practiced mostly by boys and men against women. The discovery of different forms of child abuse is not the only effect of an evolving perspective that children and youth are vulnerable and in need of protection. The identification and punishment of perpetrators constitute another. In an ironic turn of fate, the discovery of boys as perpetrators has proceeded simultaneously with a concern for their regulation and protection. The mistreatment of apprentices in the seventeenth and eighteenth centuries resulted not only in the discovery of boys as an ungovernable class of potentially disruptive citizens but revealed the existence of abusive masters vulnerable to fines. The perceived neglect of certain classes of urban children in the nineteenth century led to their removal from families to orphanages and "baby farms," where yet a different class of perpetrators were created—the keepers of the asylum. Typically these were young men who had no particular preparation, were relatively
unsupervised, and received almost unlimited authority to do as they would in the name of moral guidance. The entry of wage-earning boys into coal mines, cigar shops, and whiskey bottling works over the course of the nineteenth and twentieth centuries inspired the discovery of labor exploitation as a form of child abuse, factory owners as potential perpetrators, and inhumane conditions as seedbeds of criminal behavior, especially among young boys. The passage of compulsory education laws requiring children to spend long periods of time in school had the potential to transform teachers into potential perpetrators and, with the passage of stringent reporting laws, into victim identifiers as well. More recently, increasing levels of sexual freedom have revealed the existence of abuse practiced by youths on other youths in the form of gun assaults, gay bashing, date rape, and so on. The emergence of laws prohibiting sexual harassment and assigning punishments to perpetrators constitutes a new discovery that has both reflected and revealed the widespread practice of abuse in churches, households, and neighborhoods. The discovery of different forms of child abuse and the identification of boys and men as both perpetrators and victims are not the only effects of an evolving perspective that children and youth are vulnerable and in need of protection. Efforts to transform the status of children and root out abuse also show up in the emergence of child advocacy as an approach to social action and a strategy for political and organizational development as well. The emergence of the Children's Aid Society and religiously based protectories in the latter decades of the nineteenth century, the National Federation of Day Nurseries and the Children's Bureau during the Progressive era, the emergence of child-focused physicians and psychologists in the early twentieth century, the founding of the Young Men's Christian Association, followed by the Boy Scouts of America and an array of other regulatory settings for boys; the creation of the Universal Declaration of Human Rights at midcentury and institutions like the United Nations Children's Fund (UNICEF) and the Children's Defense Fund in the latter half of the twentieth century—all these changes reflect continuing efforts to define abuse, identify violence, and protect children from its more draconian expressions. Among the more important agendas that have emerged recently are those centering on the identification and financial accountability of fathers as well as mothers and on the assignment of responsibility and a measure of blame to boys and young men. The attempt to exorcise violence from the lives of children has found expression in a gradual, if limited, involvement of local government agencies in the business of child protection over the course of two centuries. There is a corpus of law creating alternative institutions for the rearing of children considered to be victims of abandonment, neglect, malnutrition, battering, and sexual abuse. Orphanages came in the nineteenth century, to be replaced by substitute families outside cities and ultimately by a limited welfare system. Specialized public institutions regulate the lives of previously unsupervised or abused young people: juvenile reformatories in the early twentieth century, foster care systems, and child abuse prevention centers among others at the twentieth century's end.
Public agencies for the protection of the young include Children’s Aid Societies in the nineteenth century, the Children’s Bureau in the early twentieth century, child protective services, family court, a National Center for Missing and Exploited Children, and child abuse hotlines in the latter decades of the twentieth century (Jenkins 1998). More recently, centers for the identification of recalcitrant fathers have emerged and constituted the rediscovery of young men as perpetrators of neglect. Legal traditions regulate the conditions of child labor, and laws in more than thirty states limit, if not prohibit, the use of corporal punishment in public schools. Other laws limit the power of patriarchal authority within families, define children’s rights, and assess penalties for spousal and child abuse. There are chinks in the armor of the traditional legal assumption that concepts of family reconciliation are necessarily in the best interest of children. What is more, new legal standards are expanding the effective definition of child abuse to go beyond battery and sexual assault and to include protection from violent environments. These new legal standards might threaten custody for those parents who fail to protect their children from witnessing violence and abuse in the household. Yet traditions of violence against children have persisted, notwithstanding a historically evolving concept of childhood and child abuse; the gradual discovery of child innocence, child neglect, child rights, baby battery, sexual assault, incest, and excessive corporal punishment as social conditions in need of remediation; a 200-year history of child advocacy; the expansion of government in the business of child protection and the
involvement of physicians and teachers in its identification; the emergence of institutional mechanisms for the reporting of abuse; and the presence of thousands of people who, over the course of two centuries, have condemned violence in all its forms. Abuse against children is still visible in the informal spaces of children's lives in neighborhoods, streets, playgrounds, and schools; in the grounded routines of legally constituted educational institutions; in the deep structures of gender, class, and race relations in the United States; in publicly and privately constituted institutions for adjudicated young people; and in the protected domestic enclaves of hundreds of thousands of families in the United States as well as those of religious institutions. It is hidden by the silences that result from the array of incest taboos that both shield and condemn assaults against young children, male and female. Boys have emerged as a newly discovered class who are being redefined as criminals rather than adolescents or delinquents, subjected to mandatory "zero tolerance" expulsion laws, incarcerated at increasing rates, bound over as adults at age thirteen, subject to capital punishment, and otherwise penalized as adults might be. The persistence of child abuse in the United States is a subject that has claimed the attention of an array of scholars, educators, policymakers, and child workers who, as they seek to discover its various manifestations, also seek to understand the social, economic, political, and cultural bedrock that sustains or protects the practices of abuse against children. Some of this research explains the persistence of child abuse as a reflection of historical definitions of manliness that elevate the status of combative behavior, physical aggression, toughness, and domination as idealized states. Other scholars characterize abuse against children—both inside and outside schools—as a natural condition in a society that protects gun ownership; rationalizes the use of force to settle disputes; makes heroes of gun-toting, physically dominating bullies; and otherwise tolerates or even celebrates public expressions of violence. The persistence of child abuse may also reflect historical commitments to political practices that protect family privacy and church autonomy and limit the capacity of government to regulate childrearing or to provide blankets of protection for children who are outside the reach of public authorities: children under five years of age; children who attend relatively unregulated child care centers in neighborhoods, churches, and homes; homeless children; children of working parents; young people who inhabit the streets and malls when school is out; boys who participate in the aggressive worlds of contact sports and locker room brawls; and gay boys who are victimized by their peers. Violence against children may have to do with economic commitments that privilege wage labor, limit support for the care of dependents, impose contradictory demands and pressures on already overburdened families, limit the quality and quantity of support they can expect, and otherwise do little to prevent outbreaks of violence. Child abuse may even persist because of powerful religious beliefs that support the use of corporal punishment as a legitimate instrument of moral education. Finally, the persistence of child abuse may be an outcome of social and legal policies that identify and punish perpetrators and remove battered
children from abusive family situations but fail to attend systematically to the improvement of children's environments. The good news is that child abuse has been discovered and condemned. The bad news is that the cultural, political, economic, or social wherewithal to prevent it or to root it out is not as yet fully conceived.

Barbara Finkelstein
References and further reading
Ariès, Philippe. 1962. Centuries of Childhood: A Social History of Family Life. Translated by R. Baldick. New York: Alfred A. Knopf.
Books, Sue, ed. 1998. Invisible Children in the Society and Its Schools. Mahwah, NJ: Erlbaum.
Connell, R. W. 1995. Masculinities. Berkeley: University of California Press.
Fass, Paula S., and Mary Ann Mason, eds. 2000. Childhood in America. New York: New York University Press.
Finkelstein, Barbara. 2000. "A Crucible of Contradictions: Historical Roots of Violence against Children in the United States." History of Education Quarterly 40, no. 1: 1–22.
Gordon, Linda. 1988. Heroes of Their Own Lives: The Politics and History of Family Violence, Boston, 1880–1960. New York: Viking.
Greven, Philip. 1990. Spare the Child: The Religious Roots of Punishment and the Psychological Impact of Physical Abuse. New York: Vintage.
Jenkins, Philip. 1998. Moral Panic: Changing Concepts of the Child Molester in Modern America. New Haven, CT: Yale University Press.
Polakow, Valerie, ed. 2001. The Public Assault on America's Children. New York: Teachers College Press.
Rotundo, E. Anthony. 1993. American Manhood: Transformations in Masculinity from the Revolution to the Modern Era. New York: Basic Books.
Schlossman, Steven L. 1977. Love and the American Delinquent: The Theory and Practice of "Progressive" Juvenile Justice, 1825–1920. Chicago: University of Chicago Press.
Accidents

Accidents are part of everyday life for all children. In the vast majority of cases the injuries suffered are minor: grazed knees, cut fingers, or bruised shins. A substantial number of accidental injuries are more serious and can result in major debility or death. In the United States, boys suffer serious accidental injuries in many different settings and from many different causes. Although developments in many fields have greatly reduced the number of serious accidents and accidental deaths, deaths from other causes, particularly infectious diseases, have declined even faster. Accidental injury is now the most important cause of death in American boys. The word accident suggests a random event, occurring by chance, unpredictable, and unavoidable. The history of health care shows that many accidents can be avoided or prevented, and the term unintentional injury is now preferred by the public health community. However, the term accident is still in general use and will be used throughout this entry. In the earliest years of the United States, almost every aspect of a boy's life carried a high risk of serious accident. For thousands of immigrant children the very act of traveling to North America carried a high risk of injury or death. Accidents on board the immigrant ships of the eighteenth and nineteenth centuries were common. Falls were especially frequent: falling on board ship could result in long-term crippling injury, and a fall overboard often led to death by drowning. Life onshore was also risky. At home, particularly in poor housing with its cramped and badly maintained conditions, boys were frequently injured as a result of burns or scalds. Busy and densely populated towns carried risks
from accidents on the streets, and the more sparsely populated rural areas carried their own risks from play around agricultural machinery or in rivers and lakes. Many boys began work at an early age, and the use of unprotected machinery in unsafe working environments meant that industrial or agricultural injuries caused the deaths or long-term injury of thousands of boys as well as men every year. In the nineteenth century statistical evidence about accidents was rarely collected, except at the local level. Hospitals, for example, kept records of the children they treated, and injured children figured largely in these. The 1895 Annual Report of the Children's Hospital of Philadelphia listed more than 500 cases of "Recent Accidents" among the 748 children it admitted that year, although it did not identify which injuries were suffered by boys and which by girls. The list included more than 220 wounds (three caused by gunshots), 16 burns and scalds, and 26 fractures. In addition, 16 children were admitted having been bitten, four of them by men. Work, play, and educational environments all held serious risks, and accidents in all of them could prove fatal. In December 1907 a coal mine explosion in Monongah, West Virginia, caused the deaths of 362 miners, many of whom were boys. Less than a year later a fire at the Lakeview School in Collinwood, Ohio, resulted in the deaths of 178 children. Major disasters such as these led to an increasing awareness of the need for greater safety and accident prevention strategies. The reduction in the employment of boys as factory, mine, or agricultural workers also helped to cut down work-related accidents in children, but many more children were dying from the effects of infectious disease.

Harry McShane, a child laborer who lost his arm in a spring factory accident, Cincinnati, Ohio, 1908 (Library of Congress)

However, by the middle of the twentieth century vaccination programs and other public health initiatives had changed the picture of illness and disease among American children: accidents had become the leading cause of long-term disability and death. By 1946, according to Herbert Stack, the director of the Center for Safety Education at New York University, accidents were causing more deaths in schoolchildren than the ten most common infectious diseases combined (Stack 1946).
A young boy has stitches removed from his face in the doctor’s office. (Shirley Zeiberg)
Stack was convinced that education was the major weapon in the fight to reduce accidents and called for the education of all children in fields such as automobile, fire, home, farm, and firearms safety. In 1952 the American Academy of Pediatrics (AAP) established its Committee on Accident and Poison Prevention. In succeeding years the U.S. government introduced a series of acts to promote safety, including the Poison Prevention Act and the Child Protection and Toy Safety Act. Over the next two decades, boys' health continued to improve and mortality rates continued to fall, but the impact of accidents remained high. For boys of
all racial backgrounds aged one to fifteen years, accidents were by far the major cause of death. The U.S. National Health Survey (USNHS) for 1959–1961 found that almost one in three children under seventeen years of age was reported as suffering an accident annually: an average of 18,983,000 in each year of the study, with boys more likely to be affected than girls. The USNHS found that falls were the most common type of accident reported, that five- to nine-year-olds were the most vulnerable age group, and that more than half of all reported accidents happened in the home. In 1960, accidents were the cause of 30 percent of
deaths in children from one to four years of age and 40 percent of deaths in children aged five to fourteen years. In the school-age group, motor vehicle accidents caused most of the deaths, whereas in the preschool group, fires and explosions caused two-thirds more deaths than motor accidents. Drowning was another major cause of death in childhood. Racial differences in accident rates were also reported, with nonwhite children being five times more likely to die in fires or explosions and four times more likely to die from accidental poisoning than white children. In the last half of the twentieth century, information on childhood accidents became much more detailed. Eighteenth- and nineteenth-century data were extremely limited and gave a poor picture of the effects of accidents. Early-twentieth-century data gave a clearer picture but rarely identified differentials between boys and girls, age groups, or races. More recent data enabled a more detailed assessment of accident statistics, which could be used to develop more effective programs aimed at reducing the numbers and effects of accidents. Contemporary American boys are far less likely than their forebears to suffer serious debility or death as a result of accidents. However, a risk of such serious outcomes does remain, and the picture of accident injury and mortality concerns policymakers and families alike. Preliminary data from the National Vital Statistics Report for 1998 showed that accidents killed far more children than any other single cause. In the one- to four-year-old age group, accidents killed 12.4 children per 100,000, whereas the next biggest killer, congenital anomalies, killed 3.5 per 100,000. In the five- to fourteen-year-old age group, accidents were
responsible for 8.0 deaths per 100,000 children, whereas malignant tumors resulted in 2.6 deaths per 100,000 children. In the five- to fourteen-year-old age group, motor vehicle accidents resulted in more deaths than all other accidents combined: 4.5 per 100,000 children. In 1995, accidents resulted in the deaths of 6,600 children, more than half of whom were boys. Five leading causes of accidental injury are identified by the Centers for Disease Control and Prevention (CDC): motor vehicles, drowning, fires/burns, firearms, and suffocation. Between them, these categories accounted for more than 80 percent of injury deaths in children during 1995. In almost every category of age or accident, boys were more likely than girls to suffer the accident and to die as a result. Death rates for boys and girls were similar in only two areas: motor vehicle accidents for infants and children aged one to four years and fire and burn deaths in children aged five to fourteen years. In no category were girls more likely to die than boys. The largest difference was in accidental deaths due to firearms in the ten- to fourteen-year-old age group, in which boys were six times more likely to be killed than girls. Nonfatal accidents are far more numerous than fatal ones, of course, but most go unreported unless they are serious enough to need professional medical attention. One important exception to the decline in childhood accidents is sports injuries. According to the National Safe Kids Campaign, injury rates for baseball fell marginally between 1987 and 1995, but those for football and basketball both rose, to 550 per 100,000 children in the case of basketball. The relatively new activity of inline skating first produced statistics in 1993: in only two
years the injury rate grew from 50 per 100,000 children to 150 per 100,000. The rise in sport-related injuries reflects an increase in the number of children playing sports. Much of this increase is the result of more girls becoming active in sports. Continued success in accident prevention, improved treatment for injuries, and additional participation by girls in traditionally male-dominated sports may well see an equaling-out of accident risks between boys and girls in the early years of the twenty-first century.

Bruce Lindsay

See also Disease and Death

References and further reading
Children's Hospital of Philadelphia. 1895. Annual Report.
Committee on Injury and Poison Prevention, American Academy of Pediatrics. 1997. Injury Prevention and Control for Children and Youth. Edited by Mark D. Widome. Elk Grove Village, IL: American Academy of Pediatrics.
Gillham, Bill, and James A. Thomson, eds. 1996. Child Safety: Problem and Prevention from Preschool to Adolescence: A Handbook for Professionals. New York: Routledge.
Goodman, Nan. 1998. Shifting the Blame: Literature, Law and the Theory of Accidents in Nineteenth-Century America. Princeton: Princeton University Press.
Stack, Herbert J. 1946. "Greater Safety for Our Youth: An American Opportunity." Journal of Educational Sociology 20, no. 2: 114–123.
Walker, Bonnie L., ed. 1996. Injury Prevention for Young Children: A Research Guide. Westport, CT: Greenwood Press.
Ward Platt, M. P., and R. A. Little. 1998. Injury in the Young. Cambridge, England: Cambridge University Press.
Actors

See Performers and Actors
Adams, John

John Adams (1735–1826) was a lawyer and revolutionary leader from Massachusetts, part of the committee (with Benjamin Franklin and Thomas Jefferson) that drafted the Declaration of Independence, one of the diplomats who negotiated the Treaty of Paris (1783) securing American independence, the first vice president (1789–1797) and second president (1797–1801) of the United States, and the founder of a dynasty of Adams statesmen and writers. The eldest son of a farmer and selectman of the town of Braintree, Massachusetts, John Adams spent his boyhood surrounded by extended family and the close-knit if sometimes contentious community in which he received his earliest education. Much of the contention centered on the church, within which his parents and two articulate ministers conscientiously raised him. Although close to the coast and less than a day's travel from Boston, Braintree was a predominantly agricultural community. There as a boy John Adams worked with his father in the fields and with his teachers on books and was formed by a New England culture that valued hard work, religion, and learning. He also roamed the fields and marshes with a gun, played at games with other boys from the town, and (according to memories in his old age) admired the girls. From these influences he developed into an adult with a sharply critical mind and a penchant for self-criticism. He became an optimistic conservative and a stubborn supporter of New England values of independent but well-regulated self-governance, values that he linked to the scientific, religious, and political principles of the eighteenth-century Enlightenment.

Home of John Adams in Braintree, Massachusetts (Library of Congress)

John Adams was born at the foot of Penn's Hill on October 30, 1735, a year after the 1734 marriage of forty-three-year-old "Deacon" John Adams (1691–1761) to twenty-five-year-old Susannah Boylston (1709–1797) of Brookline, Massachusetts. Deacon Adams's great-grandfather, Henry Adams, had emigrated from Somersetshire, England, in 1630 in the "Great Migration" of Puritans to New England. One of the founders of the town of Braintree, Henry established there a malt house where 100 years later his young great-great-grandson could visit his great-uncle Peter Adams, who was still at work in the family business. Deacon John Adams was one of eleven children, most of whom remained settled in Braintree. Susannah Boylston also came from a large extended family, which included a great-uncle, the physician Zabdiel Boylston, who first introduced inoculation for smallpox in Massachusetts. As an adult John Adams described to a cousin his vivid memories of visiting his grandmother's home and his mother's family in Brookline. Both his Adams grandparents had died before he was born, yet he recalled in his 1802 Autobiography that at the age of seven he had been shown a letter of advice his grandmother Hannah Bass Adams had written to her children before her death. That letter, he wrote, "appeared to me then wonderfully fine. From his Mother probably my Father received an Admiration of Learning as he called it, which remained with him, through Life, and which prompted him to his unchangeable determination to give his first son a liberal Education" (Butterfield 1961, vol. 3: 256; following quotes are from this source unless otherwise noted). As a boy,
Adams was surrounded and influenced by a network of kin whose status was "in the middle rank of People in Society: all sober, industrious, frugal and religious" (vol. 3: 254, note 3). Mostly farmers and small craftspeople or tradespeople, they served their communities as local elected officers, teachers, ministers, and members of the militia.

John Adams as an older man (Library of Congress)

Deacon John and Susannah Adams had only two more children, Peter (1738–1823), whom John Adams described in 1802 as "my Neighbor, my Friend, and beloved Brother" (vol. 3: 255), and Elihu (1744–1775), who died of disease while serving in the militia in the American Revolution. The family was small in comparison to many in the community and close-knit. Susannah would live out her long life in close contact with and affection from her oldest son. In 1758 when
he was twenty-three, Adams wrote an entry in his diary that dramatically illuminates the equality of relation between his mother and father despite the disparity of their ages and provides a rare window into the daily life of his boyhood. As selectman responsible for the town’s poor, his father had brought home a destitute girl to board at the town’s expense. “How a whole Family is put into a Broil sometimes by a Trifle,” recorded John Adams. “My P. and M. disagreed in Opinion about boarding Judah, that Difference occasioned passionate Expressions, those Expressions made Dolly and Judah snivell, Peter observed and mentioned it, I faulted him for it, which made him mad and all was breaking into a flame, when I quitted the Room.” His mother, he mused after describing the scene in detail, “seems to have no Scheme and Design in her Mind to persuade P. to resign his Trust of Selectman. But when she feels the Trouble and Difficulties that attend it she fretts, squibs, scolds, rages, raves. None of her Speeches seem the Effect of any Design to get rid of the Trouble, but only natural Expressions of the Pain and Uneasiness, which that Trouble occasions. Cool reasoning upon the Point with my Father, would soon bring her to his mind or him to hers” (vol. 1: 65–66). Adams had his first lessons from his mother, whom he remembered as a great reader. He then attended a dame school operated by Mrs. Moses Belcher, wife of a local deacon, who lived across the road from him. Elementary ciphering (arithmetic), mastery of the catechism, and basic literacy taught from one of the many editions of The New England Primer were the educational staples of such dame schools, which were supported throughout New England’s towns to ensure that all children learned to read
the Bible. For pupils whose parents had greater ambitions for them, many communities also had "grammar" schools, usually taught by young men who had completed their education at Harvard College and were keeping school while waiting to be "settled" into a church pastorate. Deacon John Adams's oldest brother Joseph, after his graduation from Harvard, had been the first teacher in the Braintree School in 1710. By the time Deacon Adams sent his son John there to learn Latin, hoping to prepare him for college and the ministry, the teacher was another Harvard graduate, Joseph Cleverly. Later John Adams remembered Cleverly as a "tolerable Schollar and a Gentleman" but so "indolent" that he neglected to teach his pupils arithmetic. Young Adams got his own copy of Cocker's Decimal Arithmetic (3rd ed., London, 1703), which still survives among his books, and proceeded to teach himself. Whether it was frustration with his teacher or a greater love of outdoor activity than of books, Adams's autobiography describes his frequent truancy in some detail. "I spent my time as idle Children do," he wrote, "in making and sailing boats and Ships upon the Ponds and Brooks, in making and flying Kites, in driving hoops, playing marbles, playing Quoits, Wrestling, Swimming, Skaiting, and above all in shooting, to which Diversion I was addicted" (vol. 3: 257). When Cleverly scolded him for bringing his gun to school, Adams hid it in an old woman's home in the neighborhood and went after school to kill crows and squirrels and, when he was older, to hunt wild fowl in the marshes. His father, who despaired of his inattention to books and Latin, tried to discourage him from a youthful ambition to be a farmer instead of a scholar by putting him to a demanding day of cutting thatch but then sent him back to school when he said he preferred the thatch. Not all of John's time was spent outdoors, however: "I was of an amorous disposition," he confided to his family in his autobiography, "and from the age of ten or eleven Years of Age, was very fond of the Society of females. I had my favorites among the young Women and spent many of my Evenings in their company" (vol. 3: 260). Finally, when he was fourteen, in the face of his father's determination to make him a scholar, Adams requested that he be enrolled in a private boarding school in Braintree run by Joseph Marsh, son of a former minister of the North Precinct Church. There he applied himself at last, under a teacher who was both "a good instructor and a man of learning" (vol. 3: 259, note 6). Eighteen months later Adams successfully translated the passage in Latin assigned him as the examination for entrance to Harvard College and just before his sixteenth birthday matriculated at Cambridge to study under the tutor Joseph Mayhew. There he "soon perceived a growing Curiosity, a Love of Books and a fondness for Study, which dissipated all my Inclination for Sports, and even for the Society of the Ladies. . . . Mathematics and natural Phylosophy attracted the most of my Attention, which I have since regretted, because I was destined to a Course of Life, in which these Sciences have been of little Use" (vol. 3: 261–262). Two pastors of the Braintree North Precinct Church in which his father served as deacon provided another set of influences upon the mature man John Adams became, a man whose faith was characterized more by enlightenment Unitarian beliefs than by the more traditional Calvinism of his Congregational
heritage. John Adams was baptized a week after his birth by the Reverend John Hancock (1702–1744, father of the Massachusetts merchant and signer of the Declaration of Independence), a Harvard graduate who served the congregation from 1726 until his death. Hancock valued toleration and rejected the evangelical fervor and strict Calvinism of the “Great Awakening” that swept through New England in the 1730s and 1740s. His successor, the Reverend Lemuel Briant (1722–1754), also a Harvard graduate, was both more radical and more contentious than his predecessor and published sermons that challenged some of the central tenets of Calvinist orthodoxy. At an ecclesiastical council held at Deacon John Adams’s home in March 1753, Briant was accused by John Adams’s uncle Ebenezer Adams and others in the congregation of rejecting the original sin of infants and of teaching from an unauthorized “Scripture catechism” of his own devising rather than the “Westminster catechism.” Although John Adams had left home by then to become a student at Harvard, he wrote of this ecclesiastical council much later: “I saw such a Spirit of Dogmatism and bigotry in Clergy and Laity, that if I should be a Priest I must take my side, and pronounce as positively as any of them, or never get a Parish. . . . I thought that the Study of Theology . . . would involve me in endless Altercations and make my Life miserable” (vol. 3: 262). Adams became instead a lawyer and a politician, both of which involved him in endless altercations. Yet the experiences of his boyhood and the New England heritage he regarded throughout his life with pride gave him a set of values through which he sought to identify and support what was right in those altercations with
a stubborn tenacity not unexpected from the boy who had insisted at the age of ten on being master of his own free time.

Constance B. Schulz

References and further reading
Butterfield, Lyman H., ed. 1961. The Adams Papers: Diary and Autobiography of John Adams. Cambridge, MA: Belknap Press of Harvard University Press.
———. 1966. The Earliest Diary of John Adams: June 1753–April 1754, September 1758–January 1759. Cambridge, MA: Belknap Press of Harvard University Press.
Cappon, Lester J., ed. 1959. The Adams-Jefferson Letters. Chapel Hill: University of North Carolina Press.
Ellis, Joseph J. 1993. Passionate Sage: The Character and Legacy of John Adams. New York: W. W. Norton.
Ferling, John E. 1992. John Adams: A Life. Knoxville: University of Tennessee Press.
———. 1994. John Adams: A Bibliography. Westport, CT: Greenwood Press.
Nagel, Paul C. 1999. Descent from Glory: Four Generations of the John Adams Family. Paperback ed. Cambridge, MA: Harvard University Press.
Peabody, James B., ed. 1973. John Adams: A Biography in His Own Words. New York: Newsweek, distributed by Harper and Row.
Schutz, John A., and Douglass Adair, eds. 1966. The Spur of Fame: Dialogues of John Adams and Benjamin Rush, 1805–1813. San Marino, CA: Huntington Library.
Smith, Page. 1962. John Adams. Vol. 1. Garden City, NY: Doubleday.
Adolescence

Adolescence is the period of life from puberty to maturity. The word is sometimes used interchangeably with youth, but sociologists usually apply the term adolescence to the experience of youth after 1900, when education began to be prolonged and entry into the labor force delayed. Commencing with the pioneering work of G. Stanley Hall, psychologists often have portrayed adolescence as a time of unusual emotional stress, resulting in part from the widening gap between sexual maturity and the assumption of adult responsibility. No one doubts that the distance between sexual maturation and the end of schooling has widened, partly because of a decline during the twentieth century of the age of onset of puberty but mainly because of the prolongation of education for the great mass of American youth. Equally striking has been the narrowing age segmentation of the youth group. Before the late nineteenth century, youth groups encompassed a broad age range from late childhood to the midtwenties, and viewed as a stage of life, youth signified a long period of semidependence or independence before marriage. In contrast, since the early twentieth century, preteens, early teens, midteens, and late teens increasingly have passed time with their exact or near-age peers. This development has been especially notable in schools, where there has been a strong tendency toward age homogeneity by grade level, but it also has affected advertising and marketing, which have increasingly targeted subpopulations of adolescents. In combination with delayed workforce entry, the age segmentation of schools and of the adolescent peer culture has made adolescents extremely conspicuous and has encouraged the rise of a vast professional literature on them.

Adolescent boy (Skjold Photographs)

The modern discourse on adolescence originated with G. Stanley Hall's two-volume work Adolescence (1904). Borrowing an idea from nineteenth-century evolutionary thought, Hall speculated that "ontogeny" (the history of the individual) recapitulated "phylogeny" (the history of the "race"). Hall deployed this theory to support his contention that each stage of life required a kind of catharsis before the next stage could safely be entered. For Hall, puberty marked a "new birth" characterized by emotional sensitivity and a heightened awareness of beauty, and breaking with the conventional silence about sexual maturation, he attributed these qualities to puberty itself. Unlike his predecessors, who had frequently spoken of teenage girls in these terms, Hall argued that boys also experienced an emotional transformation at puberty and that they were entitled to a moratorium on the exercise of adult responsibilities during adolescence. In contrast to a tradition of advice books aimed at male youths, which typically had targeted young men
between the time when they left home and the time when they settled down by marrying, and which had anxiously urged their fast assumption of adult behavior, Hall believed it essential that teenagers of either sex be allowed to indulge their emotional yearnings. Yet Hall, who broke taboos by the bluntness of his treatment of sexual maturation, opposed premarital sex of any sort. Like most of his contemporaries, he believed that masturbation led to disease and insanity. So, adolescence in Hall's eyes was necessarily, almost tragically, a time of storm and stress because of the inevitable conflict between Eros and civilization. Born in 1844 and raised in a devout Protestant family that prohibited discussions of sex ("the dirty place"), Hall led a deeply conflicted life. He gradually drifted from orthodox Christianity, but in Adolescence he drew heavily on the evangelical Protestant tradition of religious conversion during the teen years to "prove" that puberty coincided with the birth of high ideals, and he then concluded that these high ideals should be allowed to flower during adolescence without accompanying demands that young people actually experience conversion or join churches. In addition, he was torn between the romantic idealism that he had acquired in his youth from listening to Ralph Waldo Emerson and the demands of empirical laboratory psychology, the field that he eventually chose to follow. Yet, despite his nineteenth-century roots, his ideas met with a remarkable reception in the early 1900s, the more so because Adolescence was an immensely long and difficult book. Hall's leading idea, that anything smacking of precocity, finesse, or sophistication in teenagers represented a premature incursion of adulthood, was widely echoed by educators, youth workers, and social scientists in the 1900–1920 period. The favorable reception of Hall's work resulted from the intersection of several factors. First, the publication of Adolescence coincided with the first notable spurt of high school enrollments, which doubled in the 1890s and continued their exponential increase between 1900 and 1930. In contrast to educational institutions of earlier times, which usually had contained a broad range of ages, early twentieth-century high schools were primarily institutions for teenagers. As the large corporation replaced the family firm, the growing scale of business enterprises posed problems for middle-class parents, who could no longer rely on friends and relatives to see that their children were either placed or moved up in jobs compatible with their social status. Increasingly, middle-class parents and by the 1920s the upper tier of working-class parents chose to delay the entry of their children into the workplace by prolonging their education. For their part, public school educators found themselves under increasing criticism from taxpayers, who complained that high rates of high school "dropouts" (a neologism of the early 1900s) signified a waste of taxpayer dollars, and in response educators introduced vocational tracks to complement the traditional college-preparatory curriculum of high schools and to facilitate the emergence of the high school as a universal institution of adolescents. Not all teenagers were viewed as adolescents, as young people in need of protection and prolonged dependency. Young people who dropped out of high school and slipped into the world of casual labor became young men and women rather than adolescents. Expectations about their behavior differed fundamentally from those for teenagers in school, who increasingly occupied a self-contained world of school studies and activities. In their landmark study Middletown (Muncie, Indiana), Robert S. and Helen M. Lynd described a huge change in the experience of growing up that took place between the 1890s and the 1920s. In the 1890s high schoolers in Muncie had participated in a range of adult activities, but by the 1920s Muncie's adults formed the audience for juvenile activities by crowding the school gymnasium every Friday night to watch the school basketball team play. Set within a longer time frame, the change becomes even more striking. Before the 1870s, young people aged from roughly ten to twelve to the midtwenties had formed an identifiable social grouping in many American towns. Boys and girls had mixed with young men and women during the antebellum era in self-improvement clubs and dramatic societies. Boys and young men had joined volunteer fire companies and volunteer military companies, and they had engaged in mummery and charivaris ("shivarees"). After 1870, the decline of self-improvement clubs, the professionalization of fire companies, and the organization of the National Guard eroded this broad age grouping, and the expansion of public secondary education after 1890 accelerated the erosion. One sign of this increasing age segmentation was the establishment of adult-sponsored organizations to serve young people in narrow spans of years. During the 1890s, for example, the Young Men's Christian Association (YMCA), which traditionally had aimed at the religious and ethical growth of males between the ages of fifteen or sixteen and twenty-five, started to recruit prepubescent and pubescent boys into its program. Most of these boys were lower-class and neither in school nor at work. The quick evaporation of the YMCA's initial assumption that its older members would guide these younger ones led to a new focus on planned activities for preteens. In particular, basketball, invented in 1891 by an instructor at the YMCA's training college in Massachusetts, became popular with the boys. The Boy Scouts of America, established in 1910 and modeled on a similar organization started two years earlier in Britain, primarily targeted middle-class teenagers who attended or planned to attend high schools. But scouting enjoyed most of its enrollment success with boys aged twelve and thirteen, and most of the boys who left scouting by eighteen had abandoned it before the age of fifteen. One likely reason for scouting's lack of success with older teenagers was that its leaders shared Hall's abhorrence of casual contacts between the sexes at a time when coeducational high schools were increasing these contacts. By the 1910s, magazines were projecting images of the "new woman," one more interested in self-fulfillment than self-sacrifice and inclined to engage in drinking, smoking, dancing, and sports. In the 1920s, Hollywood painted glamorous images of "flaming youth," and a national debate over practices like dating and "petting" broke out in the press. For the most part, this debate focused on collegians, but it spilled over into controversies about dating among high schoolers, which was fast replacing more structured customs like parlor visiting, "keeping company," and chaperonage. Still an ideal in 1920, the notion of the high school as a universal institution for adolescents increasingly became a reality over the ensuing decades. Demand for
the labor of juveniles fell victim to several factors, including the long-term decline of the proportion of the American population engaged in agriculture, where demand for juvenile labor had been keen, and the Great Depression, which undercut the market for juvenile labor and thereby contributed to universalizing high school attendance, especially among boys. World War II temporarily reversed this tendency, but postwar peace and prosperity restored it. Yet even as the ideal of universal attendance at high school became a reality, social anxieties about teenagers mounted in the 1950s and took the form of a major scare about rampant juvenile delinquency that resembled the scare in the 1920s about adolescents’ sexual transgressions. Interpreting these episodes of public concern about adolescents presents some difficulties. In neither the concern over sexuality in the 1920s nor that over delinquency in the 1950s was there much evidence of a real change in the values of teenagers. Polls in the 1920s indicated that high schoolers and collegians held conservative opinions about the value of monogamous marriage and the evils of premarital coitus. Similarly, James Gilbert’s 1986 study of the delinquency scare of the 1950s demonstrates that there is little evidence that delinquency rates actually were rising. In each episode, adults appear to have projected their apprehensions about changes in the larger society onto teenagers. There is no question that high schoolers in the 1920s, especially girls, were making new rules for themselves, but this merely reflected the sexual revolution that was affecting the whole society. Similarly, adult anxieties about juvenile delinquency in the 1950s need to be set within the context of larger social and cultural
changes. A stress on consumption and independence marked the youth culture of the 1950s. Seventeen began publication in 1944, and in the years after World War II Eugene Gilbert, himself just out of high school, began his career as a consultant to corporations on the potential of adolescents as consumers. In the 1950s, advertisers, who in the 1920s primarily had targeted the “typical” American family, grew more sophisticated in pitching messages to subpopulations, especially early teens to midteens. The driver’s license, obtainable in most jurisdictions at sixteen, had become the real rite of passage. Yet new patterns of consumption among teenagers in the 1950s would never have been possible without allowances and gifts from parents themselves. It is likely that the apprehensions of parents and social commentators have been intensified by their ambivalence about changes in the larger society, especially the penetration of mass consumption into every corner of life, and by the media, which has tended to magnify everything new and threatening and which in the 1950s projected troubling images of youth in motion pictures like Rebel without a Cause and Blackboard Jungle. Episodes of public concern about the misbehavior of adolescents have continued to occur with regular frequency, and since the late 1960s these have increasingly become intertwined with initiatives in public policy. The effect of the growing link between public concern and calls for changes in youth policy has been simultaneously to bathe youth in perpetual controversy and to obscure the subtle relationship between shifts in the values of young people and those of adults. For example, during the early 1960s many collegians departed from the prevailing opposition of moralists to premarital sex
and constructed a new rule for themselves: as long as a "meaningful" relationship between two young people existed, premarital sex, and not just "petting," was permissible. Partly in response, a small group of educators sought to nudge the rather prim sex education movement, which had started in the early 1900s, toward the position that a man and a woman could achieve a nonexploitive sexual relationship outside marriage. By the late 1960s, however, this position had become linked in the minds of Christian fundamentalists and right-wingers to the counterculture, the antiwar movement, and national decay. Sex education in the public schools, which hitherto had inspired neither much controversy nor interest, came under ferocious attack on the grounds that its modestly liberal tilt in the 1960s was unraveling the social fabric. Conservatives were not the only ones to engage in scare tactics. In the 1970s, a campaign by political liberals to persuade the nation that it was menaced by an "epidemic" of teenage pregnancies led Congress to pass the Adolescent Health Services and Pregnancy Prevention and Care Act in 1978. Not surprisingly, scare tactics to secure policy changes distorted the problem that policies were supposed to solve. For example, the 1970s furor over the "epidemic" of teenage pregnancies ignored the fact that the birthrate for women aged fifteen to nineteen actually declined sharply between 1960 and 1975. Attacks on the sex education movement failed to take into account the facts that the first rumblings of the sexual revolution predated the incursion of faint liberalism into the sex education movement, and that even as teenagers have rewritten the rules, adults' views of premarital sex also have changed.
In the 1950s less than one-quarter of adults endorsed premarital sex, but that figure rose to more than one-half in the 1970s. To tease out the precise relationship between shifts in the values of youths and adults would be a difficult task, but the image of warring battalions seems a less appropriate description than that of a twisting snake dance.

Joseph F. Kett

See also Boy Scouts; Clubs; Fire Companies; Juvenile Delinquency; Transitions; Young Men's Christian Association

References and further reading
Bailey, Beth L. 1988. From Front Porch to Back Seat. Baltimore: Johns Hopkins University Press.
Brumberg, Joan J. 1997. The Body Project: An Intimate History of American Girls. New York: Random House.
Coleman, James S. 1961. The Adolescent Society: The Social Life of the Teenager and Its Impact on America. New York: Free Press.
Fass, Paula. 1977. The Damned and the Beautiful: American Youth in the 1920s. New York: Oxford University Press.
Gilbert, James B. 1986. A Cycle of Outrage: America's Reaction to the Juvenile Delinquent in the 1950s. New York: Oxford University Press.
Gillis, John R. 1974. Youth and History: Tradition and Change in European Age Relations, 1770–Present. New York: Academic Press.
Hall, G. Stanley. 1904. Adolescence: Its Psychology, and Its Relations to Physiology, Anthropology, Sociology, Sex, Crime, Religion, and Education. 2 vols. New York: D. Appleton.
Kett, Joseph F. 1977. Rites of Passage: Adolescence in America, 1790–Present. New York: Basic Books.
Licht, Walter. 1992. Getting Work: Philadelphia, 1840–1950. Cambridge, MA: Harvard University Press.
Lynd, Robert S., and Helen Merrell Lynd. 1929. Middletown: A Study in Contemporary American Culture. New York: Harcourt, Brace.
Macleod, David I. 1983. Building Character in the American Boy: The Boy Scouts, YMCA, and Their Forerunners, 1870–1920. Madison: University of Wisconsin Press.
Modell, John. 1989. Into One's Own: From Youth to Adulthood in the United States, 1920–1975. Berkeley: University of California Press.
Moran, Jeffrey P. 2000. Teaching Sex: The Shaping of Adolescence in the Twentieth Century. Cambridge, MA: Harvard University Press.
Ross, Dorothy. 1972. G. Stanley Hall: The Psychologist as Prophet. Chicago: University of Chicago Press.
Adoption

Adoption in Western culture is a social and legal process whereby a parent-child relationship is established between persons not related by birth. American adoption practices have changed radically over the past two and a half centuries. Originally an informal, spontaneous occurrence comparable to apprenticeship, adoption has become a formalized legal institution governed by statute in fifty separate state jurisdictions, with increasing federal involvement. During the twentieth century social work became professionalized, and the U.S. Children's Bureau and the Child Welfare League of America set uniform standards for regulating adoptions. Since World War II, adoption has changed from an elitist institution that restricted the adoptability of children to one that includes foreign, minority, older, and physically and mentally disabled children, as well as children who test positive for human immunodeficiency virus (HIV). Moreover, the period since 1950 has seen a movement away from secrecy to an embrace of open adoption and legislative mechanisms for uniting adult adopted persons with their biological parents. In spite of all these changes, however, Americans' cultural bias toward blood
ties remains pervasive, and adoption is still viewed by many as a form of second-rate kinship.
Although adoptions took place in the United States before the twentieth century, they were infrequent. Nor were the children who were adopted the adorable infants usually associated with adoption. Typically, the children who were adopted in the colonial period or, more frequently, in the nineteenth century ranged in age from six to sixteen years, and most of them were boys. These boys were mostly orphans or half-orphans who were adopted (made heirs) by relatives. By the mid-nineteenth century, state legislatures began enacting the first general adoption statutes, designed to ease the burden of the many private adoption acts legislatures had been forced to pass to change surnames and to clarify inheritance rights. The most important of these statutes, "An Act to Provide for the Adoption of Children," the first U.S. adoption statute, was enacted in 1851 by the Massachusetts legislature. Reflecting Americans' new conceptions of childhood and parenthood, the Massachusetts Adoption Act, as it was commonly called, emphasized the welfare of the child and established the principle (if not the practice) that judges were to determine whether prospective adoptive parents were "fit and proper." In addition, it made possible the severance of the legal bonds between biological parents and their children.
The enactment of the Massachusetts Adoption Act marked a watershed in the history of the American family and society. Instead of defining the parent-child relationship exclusively in terms of blood kinship, it had become legally possible to create a family by assuming the responsibility and emotional outlook of a biological parent.
An adopted boy with his mother (Skjold Photographs)
In the next quarter-century, the Massachusetts Adoption Act came to be regarded as a model statute, and twenty-five states enacted similar laws.
Child welfare reform in adoption began in earnest during the Progressive era (1900–1917), in response to the nation's high infant mortality rate, itself a product of unsanitary conditions in vastly overcrowded industrial cities and of scant medical knowledge of contagious diseases. With mortality rates near 98 percent in public infant hospitals, wealthy, socially prominent, progressive women reformers took it upon themselves to care for homeless infants. They soon found themselves running adoption agencies, initially supplying their friends' requests for babies and later expanding their operations to meet childless
couples' demands for infants. In this way, the first private adoption agencies, such as the Alice Chapin Adoption Nursery (1911) and the Child Adoption Committee (1916), which later became Louise Wise Services, sprang up in New York City. Similar institutions, such as the Cradle Society (1924), would soon follow.
Other measures were taken to safeguard children. In 1917, child welfare reformers secured passage of the Children's Code of Minnesota, which became the model for state adoption laws in the next two decades. It was the first state law that required an investigation to determine whether a proposed adoptive home was suitable for a child. The statute also provided for a six-month probationary period of residence by the child in the home of the adopting parents. Moreover, it
ordered that adoption records be closed to inspection by the public but not to the adoption triad: adopted persons, adoptive parents, and birth parents.
Three other reforms in adoption practice and law mark the Progressive era. Child welfare advocates successfully lobbied many states to remove the word illegitimate from birth certificates and "invented" the amended birth certificate to shield adopted children from public opprobrium. Child welfare reformers also successfully advocated that children should not be separated from the family of origin for light or transient reasons, such as poverty. All these reforms heralded the expanded role of the state in regulating adoptions.
Progressive-era social workers institutionalized their reform efforts in two national organizations, one public and one private. In 1912, the U.S. Children's Bureau was established; until World War II it was the leading institution for providing the public with information about adoption, and it was also instrumental in setting standards for adoption agencies and in guiding state legislatures, social workers, researchers, and the public on every aspect of adoption. In 1921, the Child Welfare League of America (CWLA), a private, nonprofit institution, was founded; it would become increasingly important in setting adoption standards for public and private agencies. By the 1950s, the Children's Bureau's power would be eviscerated by Congress, and the CWLA would emerge as the leading authority in the field of adoption.
Acting as a counterweight to the reform of adoption practices was Americans' cultural definition of kinship, based on blood, which stigmatized adoption as socially unacceptable. During the late nineteenth and early twentieth centuries,
a broad segment of the American public believed that adoption was an "unnatural" action that created ersatz or second-rate families. Medical science contributed to popular cultural prejudices against adopting a child by coupling the stigma of illegitimacy with adoption. After 1910, the rise of the eugenics movement and of psychometric testing linked adopted boys and girls in the public mind to inherited mental defects. Adopted children were thus doubly burdened: they were assumed to be illegitimate, and thus medically tainted, and they were adopted, and thus lacked the all-important blood link to their adoptive parents.
In the late 1930s, the CWLA began to address the issue of adoption standards. Responding to widespread deviations from sound adoption casework principles and to the increasing number of adoptions arranged by third parties such as doctors and lawyers, the CWLA in 1938 published its first set of adoption standards. These fit on a single page and were grouped under three separate headings providing "safeguards" for the child, the adoptive parents, and the state, respectively.
Although before World War II a slight majority of children admitted into adoption agencies were boys, a majority of adoptive parents preferred girls. Researchers have speculated that the reasons ranged from the belief that girls were easier to raise than boys to the idea that the ultimate decision-making power rested with adoptive mothers, who wanted girls for companionship.
The upheaval of World War II resulted in additional significant changes in the history of adoption. For a variety of reasons, social workers and state bureaucrats began for the first time to shroud adoption records in secrecy, preventing
adoption triad members from gaining access to family information about their own lives. Another sign of change was that prospective adoptive parents expressed a preference for boys rather than girls; perhaps they wished to replace symbolically the men who went off to war. But by the end of the war, the shortage of adoptable children led prospective adoptive parents to be more flexible, and many more would-be adoptive parents were willing to accept children of either sex.
The "demand" by childless couples for infants also led to radical changes in adoption practices. The baby boom, beginning in the mid-1940s and reaching its peak in the late 1950s, with its dramatic rise in marriages and births, led more infertile couples than ever before to seek to adopt. Adoption agencies were inundated with requests for children. Adoptions rose spectacularly: between 1937 and 1945, adoptions grew threefold from 16,000 to 50,000 annually; a decade later the number of adoptions had nearly doubled again to 93,000, and by 1965, 142,000 adoptions took place every year (Theis 1937, 23; Smith 1947, 24).
Although adoptions increased in number and popularity in the twenty years after World War II, the availability for adoption of white, born-out-of-wedlock infants declined radically in the following decades. A number of factors were responsible for the decline, including the 1960s sexual revolution, the Supreme Court's legalization of abortion in Roe v. Wade (1973), and unwed mothers' decisions not to relinquish their babies. These profound cultural, social, legal, and demographic changes in American society caused a substantial decline in the number of adoptions and precipitated important shifts in adoption policy. First, by
1975, adoption agencies across the nation stopped taking requests for healthy, white infants. Social workers often informed prospective adoptive parents that they would likely wait three to five years for such a child. Second, as early as 1948, in response to their inability to meet their clients' requests for infants to adopt, social workers abandoned the idea of the "unadoptable" child and broadened the definition of adoptability to include any child who needed a family and for whom a family could be found. With the enlarged definition of adoptability, social workers for the first time initiated serious efforts to place "special needs" children—disabled, minority, older, and foreign-born—in adoptive homes. Third, by 1965, the shortage of infants for adoption and the emphasis on minority adoption led social workers to practice transracial adoption. Agencies were surprised to discover that a white family would occasionally request a black infant for adoption or, when approached by caseworkers, agree to adopt a black baby. Transracial adoption soon became the "little revolution," as agencies all over the nation increasingly placed black babies with white families. Four years later, the CWLA revised its standards to reflect the new practice, unequivocally stating that agencies should not use race to determine the selection of an adoptive home. In 1971, transracial adoptions reached their peak when 468 agencies reported 2,574 such placements.
Transracial adoption was highly controversial. The first manifestation of discontent emerged in 1972, when the National Association of Black Social Workers successfully denounced transracial adoption as cultural genocide; by 1975 only 831 transracial adoptions occurred. In the following years, transracial
adoptions declined steeply, as child welfare workers preferred to keep African American children in foster care rather than place them with a white family, even though repeated studies demonstrated that transracial adoptions were successful. Concern over social workers' discriminatory practices prompted Congress to enact the Howard M. Metzenbaum Multiethnic Placement Act of 1994, which prohibited adoption agencies from denying any person the opportunity of becoming an adoptive parent based solely on the person's race, color, or national origin.
A fourth consequence of the demographic decline in babies available for adoption was a redefinition of the population of adoptable children: the process of adoption became more inclusive and less concerned with "matching" the physical, mental, racial, and religious characteristics of adopted children with those of adoptive parents. Increasingly, the population of adoptable children was composed of older children, members of minority groups, and children with special needs. In the 1990s, drug-exposed infants, children with acquired immunodeficiency syndrome (AIDS), and infants born HIV-positive were added to the special needs category. Because social workers were unable to find these children adoptive homes or to free them legally for adoption, such children, currently numbering 100,000, became fixtures in foster care, where they were shunted from one caretaker to another. This situation prompted Congress to pass the Adoption Assistance and Child Welfare Act of 1980, one of the first federal laws to address the problems of adopted children. Congress's landmark legislation mandated that child welfare agencies provide preplacement services, take steps to reunify children
with their biological parents, and periodically review cases of children in long-term foster care. "Permanency planning" legislation, as it was called, had as its goal either to return children to their family of origin or place them in an adoptive home. By 1993 the federal government was distributing an estimated $100 million to forty states to fund this program. Consequently, there was an increasing number of older child and special needs adoptions in the 1980s and 1990s.
A fifth effect of the decline in adoptable infants was open adoption, a major and controversial innovation in adoption practice that began in the mid-1980s. In an effort to encourage birth mothers to relinquish their babies for adoption, caseworkers began allowing some birth mothers to decide who would parent their child. The result was open adoption, in which the identities of birth and adoptive parents were exchanged, and in some cases, some degree of continuing contact between the parties was encouraged.
Accompanying the revolution in adoption practices during the last three decades of the twentieth century was the birth of the adoption rights movement (ARM). Though its roots lay in the 1950s, when twice-adopted former social worker Jean M. Paton began her lifelong crusade to provide adopted persons with a voice and a cause, the ARM did not become a major social issue until two decades later. In 1971 there emerged the movement's most vocal and visible leader, Florence Fisher, a New York City housewife. After twenty years of searching for and finally finding her birth mother, Fisher founded the Adoptees Liberty Movement Association (ALMA). ALMA's example sparked the creation of hundreds of other adoptee search groups
across the United States, Canada, and the United Kingdom. By 1978, the multiplicity of adoptee search groups led to the formation of a national umbrella organization, the American Adoption Congress. Research suggests that males are much less likely than females to be interested in contacting their birth parents. Women make up 67 percent of adult adopted persons who search for members of their birth families (Kadushin and Martin 1998, 587).
Adoption rights activists, composed mostly of adult adopted persons and birth mothers, contend they are entitled to identifying information in the adoption records. Through court challenges, reform of state legislation, and state initiatives, they have pursued their agenda of repealing laws that sealed adoption records. The results of their lobbying efforts have been mixed. Only four states, Kansas, Alaska, Tennessee, and Oregon, permit adopted persons to have access to their original birth certificates. Because the rights of some birth parents, who have been promised confidentiality by adoption agencies, clash with the rights of adult adopted persons, who want unrestricted access to the information in their adoption records, states have tried to accommodate both parties with formal and informal mutual consent adoption registries and state-appointed confidential intermediary systems.

E. Wayne Carp

References and further reading
Carp, E. Wayne. 1998. Family Matters: Secrecy and Disclosure in the History of Adoption. Cambridge, MA: Harvard University Press.
Carp, E. Wayne, ed. 2001. Historical Perspectives on American Adoption. Ann Arbor: University of Michigan Press.
Hollinger, Joan H., et al., eds. 1989. Adoption in Law and Practice. New York: Matthew Bender.
Kadushin, Alfred, and Judith A. Martin. 1998. Child Welfare Services. 4th ed. New York: Macmillan.
Smith, I. Evelyn. 1947. "Adoption." Pp. 22–27 in Social Work Year Book 9. New York: Russell Sage Foundation.
Theis, Sophie van Senden. 1937. "Adoption." Pp. 23–25 in Social Work Year Book 4. New York: Russell Sage Foundation.
Zainaldin, Jamil S. 1979. "The Emergence of a Modern American Family Law: Child Custody, Adoption and the Courts." Northwestern University Law Review 73 (February): 1038–1089.
African American Boys
The history of African American boys is a story of struggle and triumph against the odds. For much of their historical experience in the United States, young black Americans have been children without a childhood. For several centuries the majority of young blacks lived as enslaved people in the American South, an experience that forced them to grow up early. Throughout American history, black parents have strived to teach the children in their care essential survival skills, and they succeeded to a remarkable degree. But they could not protect enslaved children from the reality of white authority and punishment, separation from families and loved ones, physical and psychological abuse, or hard work at an early age.
During the colonial era, the North American mainland received the bulk of its African population. Most were imported from along the African west coast but came from villages located in the interior. Generally, the slave cargoes that made the horrifying trip across the Atlantic Ocean known as the Middle Passage were composed of twice as many African males as females.
Photograph by Gordon Parks (Library of Congress)
Africans preferred to hold back women and children, who were more easily assimilated into new societies than older males. But North American slaveholders valued men and women almost equally, since their crops could be worked by either sex.
In the early seventeenth century, a young African male might expect to be freed during his lifetime. The first Africans brought to the Virginia colony in 1619 were slaves, but slavery was not yet firmly established, and they were treated like English indentured servants. Significant numbers of young men worked their way out of slavery. By the late seventeenth century, however, most of the colonies had passed legislation making slavery a permanent institution
dependent upon the exploitation of African slaves. Published narratives, such as that of a kidnapped boy named Venture Smith (1798), relate the experiences of capture on the African continent and slavery in North America. Smith, whose African name was Broateer, was captured in 1735 at the age of six by enemy Africans raiding his Mende village. Broateer retained memories of his father and the existence of a strong, elite male culture among his people. His father, Prince Saungm Furro, had taught his son to value freedom, integrity, honor, resolve, and the advice of old men and to abuse no one’s rights. He was to value these qualities of good character above all worldly things, including material wealth and land. As Broateer, who renamed himself Venture Smith, grew to manhood in the colonies, these lessons learned in his African boyhood stood him in good stead. Smith survived the rigors of the Middle Passage and a succession of physically abusive masters. He even took up arms in his own defense when one of his young white male owners attacked him with a pitchfork, and a second owner struck him with a bat. In the latter incident, Smith did not return the violence but divested his owner of the weapon and then appealed to a higher white authority for justice. Although he did not receive justice, he handled the situation in a judicious manner. Smith conducted himself in the same manner in another abusive situation concerning his wife Meg. When his second owner’s wife beat Meg with a whip, Smith stepped between the two women. When his mistress began to strike him, he snatched the whip and threw it into the fire. Smith met and judged each confrontational episode with what he considered
to be honorable and appropriate conduct, based on the values and principles learned in his African boyhood. His self-discipline, determination, and thrift enabled him to purchase his freedom in 1765, when he was thirty-six years old. During the next ten years, he worked and saved enough money to free his wife from bondage, in addition to three of their children and three other adult male slaves. Venture Smith's experience in colonial America suggests that memories of African childhood, lessons well learned, and values and beliefs kept close to the heart would form the core of a new world African American culture that would enable black youth to endure slavery in the century to come (Desrochers 1999).
By the year 1800, African Americans had been in North America some 200 years. They were native-born second-, third-, and even fourth-generation Americans. Following the American Revolution, slavery was soon abolished in the North, and large numbers of slaves were freed in the South: approximately 10 percent of the African American population became free. Although bondage remained the reality for most black Americans, nineteenth-century slave life was on the brink of a major transformation. During the first half of the nineteenth century, some 1 million slaves were moved from the eastern seaboard states across the continent to the American interior. Some scholars refer to this overland movement as the second great enforced migration of blacks, after the Middle Passage. It was driven by the movement of settlers west to a frontier that required a steadily growing labor force. As slaves were forced to accompany white settlers moving into western territories, the 200-year-old civilization
they had established along the eastern seaboard was torn apart. Hundreds of thousands of black Americans were taken from what they now considered their place of home and birth. Young blacks were the first forced to leave their homelands. As plantation society and culture were transplanted west, the new frontiers demanded the strength of young enslaved males and females to clear land and reproduce as the South's major labor force. Many of these slave pioneers were young children accompanying older relatives or were sold alone. More than one-third of enslaved U.S. youth were separated from their families and sold to work on plantations and farms in the interior. Seaboard slave society was left largely bereft of its children as spouses and siblings were separated and communities were increasingly composed of the elderly. In contrast, the frontier African American population was very youthful. The majority of black youth were now growing to adulthood in the lower South. As a result, slave youth became extremely important in the reconstruction and growth of African American culture across the continent during the nineteenth century.
According to Wilma King in Stolen Childhood: Slave Youth in Nineteenth-Century America (1995), rearing children within the firmly established institution of slavery was the greatest challenge that confronted African American parents in the nineteenth century. They struggled to give their children the survival skills that would enable them to endure oppression and maintain their self-worth as human beings. King describes how parents accomplished this goal, teaching values, beliefs, and culture creatively fashioned from their memories of Africa, their 200 years of experience in North America, and influences borrowed from Native American and European cultures.
African American boy, ca. 2000 (Skjold Photographs)
For example, the manner in which they named children reflected African cultural practices. A child was usually named after close relatives, and a firstborn male child was frequently named after his father or grandfather.
From birth, life for an African American child was precarious. Lack of prenatal care and adequate nutrition ensured that slave children, both boys and girls, were small in stature and suffered frequent illness and a short life span. If a slave boy survived his early years to enter the workforce at about ten years of age, his health improved because he was allotted more food. Until then, however, black parents endured the pain of the
sickness and high mortality rates of their sons and daughters.
More than anything else, work defined the lives of African American children in the nineteenth century. Throughout history children have worked to help support their families. Enslaved children's work, however, was of no benefit to their parents, who had no control over it. Children performed adult jobs and were forced to take on responsible roles quite early. Slave children attended babies, brought wood for the household, pulled weeds in fields, swept yards, ran errands, and hunted game for food. Work assigned to young children was usually not gender-specific, and both boys and girls performed agricultural work and such domestic work as cleaning house, making clothes, and serving meals. Older children were more likely to perform work defined by gender. For a young boy, pushing a plow signaled manhood. Male youth also had opportunities to acquire craft skills and were trained as apprentices to be blacksmiths, carpenters, wheelwrights, painters, tanners, coach drivers, shoemakers, harness makers, and furniture makers. Girls could become weavers, basket makers, seamstresses, housekeepers, cooks, and laundresses. Young boys learned trades from older male slave craftsmen, who often were family members. Their owners also apprenticed them out to white artisans. A well-trained blacksmith or carpenter was an invaluable asset to his owner. In addition, such skills made it possible for male slaves to purchase their freedom as adults. These enslaved children, like their parents, were significant participants in U.S. economic growth.
Even in urban settings among free blacks, children's lives were defined by
work. In the North, free black men, as heads of families, faced stiff competition for jobs from native white and immigrant men. The lack of steady employment for men forced black women and children into the labor force. As soon as black children were old enough, they sought work at boardinghouses or hotels, running errands, assisting in food preparation, or peddling food. One of the most dangerous jobs carried out by young black males was that of chimney sweep. Boys between the ages of four and eight were apprenticed to black and white chimney sweepers, who hired these small young boys to crawl down chimneys and scrub them clean. Falls and broken bones were frequent. Breathing soot also made the boys subject to lung disease and cancer. Yet working as chimney sweeps appealed to black boys because of the considerable independence associated with the job. They preferred it to work in domestic settings or public establishments, where they would have been closely watched by their employers.
Although work was a dominant part of their lives, African American children, enslaved and free, carved out important space for play and leisure time. During play and recreational time, children had opportunities to form social relationships with their peers. They also participated in such adult family and community activities as religious services and dances. When children played among themselves, they engaged in gender-specific play. Boys enjoyed play that gave them the freedom to demonstrate competition and strength. Games such as marbles were popular with boys. Enslaved children were inventive in making their own toys, and objects like marbles could be molded from clay and dried in
the sun until hard. Horseshoes, ball games, foot races, and jumping contests also were popular male-oriented games that tested strength and skill. Playing the dozens was especially prevalent among boys. This oral contest encouraged verbal skills, sharpened the wits, and channeled aggression into a nonphysical form of playful combat. Black children also enjoyed storytelling, through which they learned about their African heritage, how they came to the United States, their family history, and moral lessons.
After the Civil War ended in 1865, passage of the Thirteenth Amendment abolished slavery. African American youth had played significant roles in the Civil War. Some boys traveled and worked with the Union forces as musicians. They marched in uniforms and shoes, raising their own self-esteem and earning the admiration of their peers. Some young males worked as body servants to their masters in the Confederate Army. Other teenagers still on southern plantations resisted by refusing to work for mistresses left in charge. Knowing they would not be whipped, they ran away in droves as the Union Army marched through the South. Large numbers of male youth worked with the support services of the army, building roads and bridges, delivering supplies, and burying the dead.
Once African Americans gained their freedom, they faced severe political, social, and economic challenges. Control of their labor and economic independence was a major concern: the attempt to control working children was one of the most important ways in which former slave owners tried to reinstate slavery. The agrarian South, still dependent on black labor, feared the independence freed people would gain if they owned
their own labor and land. Opposition to black economic independence lay behind the passage of apprenticeship laws directed at children, and African American boys and girls became pawns in the struggle to control black labor. Many southern states passed legislation allowing white employers to bind black children to labor contracts until they were eighteen or twenty years of age. Ideally, this paternalistic system would provide minors with food, clothing, medical care, and training in husbandry, housekeeping, or industry in exchange for their labor. Black children were to be taught reading, writing, and arithmetic. Apprenticeship laws also gave employers the same right a father had to use force to compel obedience from a child. In reality, apprenticeship agreements were habitually violated by white employers. Some falsified documents in order to keep children in the system longer. Numerous boys and girls classified as orphans after emancipation were bound over to the apprenticeship system before their parents could claim them. More than 2,500 black youngsters were apprenticed throughout the South in this manner during the first month of emancipation. Often white employers did not retain apprentices in their own households but hired them out to other employers. Children were frequently abused and denied pay by their employers, and many ran away. The need to protect African American children from exploitive economic practices was one of the major reasons black parents withdrew them from agricultural work under white authority.
The new labor system that evolved throughout the South was sharecropping. Under this system, black workers agreed to farm land and share what was produced with the
landowner in exchange for using the land. At first, sharecropping seemed to offer blacks some degree of the autonomy from whites they so avidly desired. But laborers remained subject to white authority, as decisions concerning cultivation, choice of crops, and marketing remained with the landowners. Sharecropping brought a vicious cycle of debt peonage that black farmers could not escape. By the 1890s most black Americans were tied to the rural South and the sharecropping system. Among black Americans who reached the age of ten in 1890, half the boys and one-sixth of the girls worked as agricultural laborers. African Americans who became independent farmers by purchasing and working their own land were under constant threat from night riders ready to drive black families from the land. Sharecropping broke up black families. Although it had been accepted as a system that would keep parents and children together, older children were forced to hire out and migrated to towns and cities looking for work to help support the family. Black male youth found work as laborers in industry or in the commercial sector, and girls worked as domestic servants. By the late nineteenth century, sharecropping families often comprised adults with babies and small children, whereas teenage sons and daughters were absentee workers. The twentieth century brought rapid change to African Americans as their lives were affected by migration, World War I, the Great Depression, and World War II. Migration, the physical movement of black families looking for a better life, particularly affected youth. Children, like their parents, had to acclimate to new regions of the country and adapt to the lifestyle of the city. Black south-
erners saw migration as opportunity. The urban North was not a perfect place, but it offered freedom from sharecropping, higher wages, the chance to attend school, and the possibility of participating in the political process, none of which was possible in the South. As many as 500,000 blacks went north between 1916 and 1919. Nearly 1 million migrated to northern and southern cities between 1920 and 1930. Black men were able to find work in manufacturing in such cities as Detroit, Milwaukee, Cleveland, Pittsburgh, and Chicago. They worked in the steel, iron, rubber, automobile, and meatpacking industries. The Great Migration continued until the 1970s. Although most black Americans continued to live in the South, the Great Migration transformed African Americans from a rural southern people into an industrialized urban people who, though still rooted in the South, were now spread out across the country.
When African Americans moved to the urban North and to southern cities, they survived by bringing with them cultural practices, values, and beliefs that had been established centuries before in the South. Restructuring the family and maintaining close ties with kin were basic to urban survival. Migration was neither a random nor a linear relocation of people. People tended to migrate in steps or stages, with children accompanying adults or sometimes left behind with relatives while their parents established themselves in the city.
Recent writings by black men such as Joseph W. Scott in Black Men Speaking (Johnson and McCluskey 1997) reveal what these eventful years must have been like for black boys. Scott migrated with his family to Detroit, Michigan, from rural Georgia during the Great De-
pression when he was ten years old. Having moved into a Polish ghetto, they were the first black family on the block. Polish boys chased Scott and his brothers home from school the first few days after they arrived in the neighborhood. His mother put a stop to these childhood rites of initiation, however, when she met the boys on the porch as they “skidded” home one afternoon and told them they could either take a beating from her or whip those boys “real good.” To Scott, migration meant watching his father work until his hands were callused and bleeding from pounding out axles and gears for Chevrolet trucks and cars. His father’s skin and hair were scorched from the heat. Gloves barely protected his hands against the giant tongs he wielded to twist the steel into shape. The gloves were always threadbare because Scott’s father could seldom afford new ones when he was pushed to the limit supporting a wife and raising ten children. In Scott’s mind, his father’s gloves and bruised hands came to symbolize the ultimate sacrifice a parent makes for a child. His most vivid memories of his adolescent years were of each daybreak when his father knelt on his knees, his callused hands clasped in prayer, and in a fervent voice asked God to help him make it through just one more day. In Scott’s parents’ household, work was considered a major rite of passage for young boys. His father always managed to put food on the table during the Great Depression, even though he earned only a poverty-level income and refused to apply for welfare. Only honest work was acceptable: no stealing, gambling, or other forms of questionable income were permissible. Scott began to work at odd jobs such as cleaning garages when he
was nine years old. He could not keep his earnings but turned them over to his mother, who decided how the money should be spent to help the family. Scott's family measured manhood by financial contributions to the family's survival. As each child began to work, the parents began to treat them like adults, entrusting them with important family matters and responsibilities.
Another important lesson Scott learned as the child of recent migrants from the South was the value of a formal education. Scott's parents had only a sixth-grade education between the two of them, but they instilled in their children the desire to learn. His father read aloud a passage from the Bible each evening, struggling with every word. But he persevered, and from his efforts, his children saw that learning required extraordinary determination and that intelligence was an acquired skill, not a gift. Formal learning was valued in Scott's household for a very practical reason. It helped prepare an individual for solving life's problems; without education, one was at the mercy of others.
Last but not least, Scott learned valuable lessons about male-female relationships from his parents. One of the most important lessons concerning manhood was being what his father called "personally competent" or self-sufficient. Scott's father approved of his wife's teaching the boys how to perform housework and take care of babies. His father did not like doing women's work but was proud that he could iron, cook, wash, sew, and care for young children. Scott's parents believed that in the husband-wife partnership, neither should be totally dependent upon the other for gender-specific services. For instance, Scott's mother "could do anything a southern rural black man could do." She could handle a shotgun, hunt, fell trees,
lay bricks, do carpentry and plumbing, ride bareback, lift logs, and make whiskey. Scott learned that boys and girls should grow into self-sufficient, independent adults. Marriage sometimes required that men and women exchange traditional gender roles, and individuals should be capable of doing so when necessary. These progressive notions about marriage and gender roles were not unique to Scott's family but common among rural black southerners. Possibly, they date back to centuries of black men and women performing the same type of work.
By the second half of the twentieth century, African Americans were focused on the struggle for citizenship rights. In the 1950s the modern freedom struggle was initiated by local black activists as they employed grassroots strategies to attack segregation and other forms of racial discrimination. These strategies signaled a new direction in black political self-empowerment and resistance. Black leaders, supported by unified groups of blacks, refused to be intimidated by violent confrontations with whites and committed themselves to the practice of nonviolent resistance, which proved to be an extremely effective plan of political action.
During this period, black youth in the South watched and learned from the resistance work of their parents and local leadership. They played a major role in the new freedom movements. In many cases, elementary and high school youth were the first to integrate public institutions. For example, the students known as the Little Rock Nine, who braved violent mobs to desegregate Central High School in Little Rock, Arkansas, in 1957, were regarded as heroes by young blacks across the country. Black student youth also questioned the caution and patience
African American Boys practiced by many of the older civil rights organizations. As a result, the Student Nonviolent Coordinating Committee (SNCC) was formed to develop leadership among young people. Black student activists engaged in sit-in demonstrations in public places such as restaurants and participated in freedom rides in order to expose the segregated transportation facilities in the South. Both female and male black youth participated extensively in student organizations such as SNCC. During the late 1960s and 1970s, black urban youth became increasingly militant and frustrated with traditional civil rights activism. Although the civil rights movement had been successful in the South, it was largely ineffective in addressing the social ills of large urban areas, particularly entrenched poverty and high unemployment. Black male youth became attracted to such radical groups as the Black Panther Party, founded by Huey Newton and Bobby Seale in Oakland, California, in 1966. The Panthers, which attracted mainly young blacks, was the most widely recognized militant and political organization of the late 1960s. In 1968 the top ranks of leadership were age twenty-six to thirtyfour. Second-level leaders ranged in age from twenty-one to twenty-six. But those who made up the bulk of the organization’s support were between sixteen and twenty-four years of age. Female members in the organization were usually kept in the background. Black Panther ideology personified a decidedly aggressive, in-your-face, confrontational expression of political views that appealed to black male youth fed up with the old-style leadership. They openly carried arms, encouraged black people to defend themselves against
white violence with guns, and stood their ground when the right to bear arms was questioned by government authorities. In what they called their 10-Point Platform and Program, the Black Panthers demanded “land, bread, housing, education, clothing, justice and peace” (Horton and Horton 1995, 174). Their appeal to black male youth soon began to cross class, regional, and ideological lines. The Panthers succeeded in recruiting Stokely Carmichael, a prominent member of SNCC, and were planning a coalition with the youth branch of that organization until the alliance was thwarted by the Federal Bureau of Investigation (FBI), which identified the Black Panthers as a major target of its counterintelligence program in 1967. By the end of the 1960s, more than twenty Panthers had been killed in confrontations with law enforcement, and many others faced long jail terms. The early 1970s witnessed the destruction of the Black Panther Party as an effective political organization on the national level. Black Power consciousness did not die out with the end of radical organizations in the 1960s but survived into the 1970s as an explosive cultural movement characterized by vigorous political debates concerning African American identity and culture. This cultural movement influenced the revival of black theater, dance, and soul music; encouraged the growth of poetry workshops; and stimulated interest in African culture, pride in racial heritage, and the emergence of black studies departments. Black American youth were prominent in all aspects of this cultural movement. Currently, African American male youth are in crisis, even though they occupy a dominant role in the nation’s psyche and popular culture. Their image is
overwhelmingly negative. Images in the media portray young black males as criminal, drug-addicted, uneducated, and prone to violence. Yet they also dominate American popular culture, particularly music, and specifically hip-hop, with tales of swaggering "baad" men, crime, and street life.
These images of black male youth culture are deeply rooted in the economic problems that surfaced in the mid-1970s, following the successes of the civil rights movement. Declining middle- and low-income wages, setbacks in the automobile industry, and the removal of low-skilled jobs from the inner city to the suburbs are just some of the forces that contributed to high rates of unemployment among blacks. Black men were particularly hard hit when their unemployment rate jumped from 9.5 percent in 1974 to 14.8 percent in 1975. Since then the figures have not moved below double digits, and large numbers of black males have remained detached from the labor force. Those hardest hit by the loss of low-skilled jobs lived in the inner cities, and their situation was worsened by the intractable poverty that already existed. During the mid-1970s the United States began to see the high rates of unemployment, homelessness, crime, drug use, and single-parent families headed by women that are now so readily identified with the inner city. This phenomenon created intense public debate about the large numbers of blacks who appeared to be mired in poverty. As the devastating economic problems continued, policymakers and sociologists of the 1980s and 1990s blamed the urban black poor, labeled the "underclass." Black people's own "nature" and aberrant social behavior, they argued, made them poor. Liberal
initiatives of the 1960s, such as President Lyndon B. Johnson's Great Society, had not only been ineffective but had failed outright. In short, they contended, the black poor have economic problems because of defects in their own character.
African American male youth responded to the social and economic crisis in which they found themselves through cultural expression. In the 1970s, black Americans, Puerto Ricans, and West Indians in New York City launched hip-hop, a significant cultural movement of the late twentieth century. Rooted primarily in black and Latino youth culture, hip-hop is an inclusive mix of such ingredients as graffiti art, break dancing, distinctive language and dress styles, and rap music. Rap music is hip-hop culture's most distinctive and profitable expression. Contemporary youth culture's 1990s version of the blues, it reaches young people on an international level more effectively than families, teachers, or current political leaders can. Like other forms of oral expression, rap has a long history in African American culture dating back to African oral traditions, preaching, work songs, and blues, in addition to playing the dozens or toasting (oral stories performed in rhyme). Pioneering disc jockeys and such technology as turntables, drum machines, mixers, and digital samplers have allowed rappers to sample music from old recordings. Rappers who have captured the attention of young black men include Grandmaster Flash and the Furious Five, Afrika Bambaataa, KRS-One, Public Enemy, 2 Live Crew, LL Cool J, Ice-T, and 2Pac. Although Queen Latifah, MC Lyte, and Salt-N-Pepa are women who rap with attitude, rap remains a musical expression that reveals primarily the feelings of black male youth.
African American Boys Some rappers continue the tradition of critique begun by Grandmaster Flash and such groups as Public Enemy. Their radical political messages are influenced by black nationalism, the teachings of Malcolm X, and Afrocentric themes. “Gangsta” rap describes the category of rap featuring profane and sexually explicit lyrics that maintain the music’s controversial position in American popular culture. These violent, misogynist lyrics regularly incite the enforcement of obscenity laws, the wrath of women’s groups, and the condemnation of black community organizations seeking more positive role models. Yet, in spite of this resistance, rap is a multibillion-dollar industry. Young black males who have experienced the inner-city life that gave birth to rap and hip-hop culture see it as the only platform from which they can publicly express their frustration with their social and economic circumstances. In rap they express their views about the realities of early death; sex and relationships with women; the gang and drug culture that controls their communities; their confrontations with the police, the judicial system, and prison; acceptance of violence and the criminal life as their only means of survival; faith in God or the absence of faith; the need to create babies to keep their memories alive; the anti-intellectual facade they have developed to protect their fragile egos; and the profound sense of abandonment and disrespect they have for older black men who were never there to guide them along the path to manhood. These are the obstinate issues that confront too many black boys. Disheartening economic and social statistics also tell the story. About one-third of the African American population, or 10 mil-
lion people, living mostly in the inner cities, is stuck at the bottom of the economic ladder. Their unemployment rate is twice that of whites, hovering around 56 percent. More than 70.3 percent of black women are unmarried when their first child is born and will raise their children without a father. In 1995, there were 827,440 black men in their twenties serving time in prison, on probation, or on parole. Yet most African American boys will never enter the criminal justice system. By the mid-1990s more than 2 million blacks had acquired four or more years of college education. More black people hold local, state, and national offices than ever before. In 1992 blacks owned and operated 620,912 businesses. And in the realm of culture and the arts, African Americans continue to amass significant awards (Johnson and McCluskey 1997, ix–xx). Nevertheless, a critical mass of black boys requires the attention of black men. A generation has been lost without responsible grandfathers, fathers, uncles, older brothers, male cousins, or friends to pass down important survival strategies and rituals. In Black Men Speaking, John McCluskey Jr. and Charles Johnson write that African American men have always lacked social and economic power. Yet they managed to raise families well, love wives, find work, worship their gods, contribute to the building of the nation, and create great works of the imagination. Thirty years ago black manhood stood for hard work, thrift, strong faith, and family. What sustained black Americans was passing their culture along to the next generation. Somewhere along the way, the weak links in the chain were allowed to predominate. Although black boys have been resilient throughout American history, they need African
American adults, both male and female, to instruct them in the old ways and create new rituals of manhood worthy of their ancestral legacy.

Earnestine Jenkins

See also Apprenticeship; Civil War; Douglass, Frederick; Great Depression; Indentured Servants; Jobs in the Nineteenth Century; Jobs in the Seventeenth and Eighteenth Centuries; Plantations; Poverty; Schools; Sexuality; Slave Trade; Slavery; Washington, Booker T., and W. E. B. Du Bois; World War II

References and further reading
Desrochers, Robert E., Jr. 1999. "Not Fade Away: The Narrative of Venture Smith, an African American in the Early Republic." In A Question of Manhood: A Reader in U.S. Black Men's History and Masculinity. Vol. 1. Edited by Darlene Clark Hine and Earnestine Jenkins. Bloomington: Indiana University Press.
Hine, Darlene Clark, and Earnestine Jenkins, eds. 1999. A Question of Manhood: A Reader in U.S. Black Men's History and Masculinity. Vol. 1. Bloomington: Indiana University Press.
Horton, James, and Lois E. Horton, consulting eds. 1995. A History of the African American People. New York: Smithmark Publishers.
Johnson, Charles, and John McCluskey Jr., eds. 1997. Black Men Speaking. Bloomington: Indiana University Press.
King, Wilma. 1995. Stolen Childhood: Slave Youth in Nineteenth-Century America. Bloomington: Indiana University Press.
Alger, Horatio Horatio Alger, Jr. (1832–1899), wrote more than 100 novels as well as biographies of public figures, short stories, and poetry. Nearly all his fiction and nonfiction were aimed at boys and young men, and each of these books told the story of a young male protagonist, down on his luck but climbing upward in social sta-
tus. Alger's novels were later described as "rags-to-riches" stories extolling the rewards of hard work, but in fact most of his heroes rise through "luck and pluck," and except in the rare cases where they come into an inheritance, they seldom attain more than middle-class respectability and comfort. Ragged Dick was one of the best-sellers of 1867, but none of Alger's other books attained such popularity during his lifetime. Though Alger was a nineteenth-century writer, the "Horatio Alger tale" now referred to by self-made people and presidential candidates is a twentieth-century invention.
Alger was born in Revere, Massachusetts, where his father, Horatio, Sr., was a Unitarian pastor. The son seemed destined to follow in his father's footsteps and entered Harvard at age sixteen. By his graduation in 1852, he had decided to become a writer, having already published several poems and essays, some of which were collected in his first book, Bertha's Christmas Vision: An Autumn Sheaf (1856). Finding writing an unreliable source of income, however, Alger enrolled in Harvard Divinity School in 1857, graduating in 1860. After a grand tour of Europe, he returned home and was declared exempt from military service because he was nearsighted and too short. He contributed to the Union cause, however, by penning his first book for boys, Frank's Campaign (1864), the story of a teenage boy who keeps up the family farm while his father fights in the war. The novel sold well and was reviewed positively, but still doubting his ability to make a living as a writer, Alger agreed to become pastor of the First Unitarian Church in Brewster, Massachusetts. While occupying this position he continued writing, publishing another book for boys and working on a new girls'
Alger, Horatio book. Then early in 1866 a parishioner charged Alger with “a crime of no less magnitude than the abominable and revolting crime of unnatural familiarity with boys, which is too revolting to think of in the most brutal of our race” (quoted in Scharnhorst and Bales 1985, 67). Alger resigned in disgrace and moved to New York, where it seems he was followed by no news of his pederastic past. Despite the outrage of his former parishioners, who took steps to ensure he could never take up the ministry again, there is no evidence the incident ever materially affected his later career, though several years later he did discuss it with members of the James family, Henry, Sr., and William. From 1867 until his death, Alger devoted himself to juvenile fiction. Living in the city exposed him for the first time to the street children who became the models for his most famous protagonists, and he frequented the Children’s Aid Society and the Newsboys’ Lodging House, both founded by the reformer Charles Loring Brace. Drawing on this material, in January 1867 he published the first installment of Ragged Dick; or, Street Life in New York in the monthly magazine Student and Schoolmate. A year later, Loring published the story in book form, and it quickly reached a best-selling status that Alger would never again match in his lifetime. Ragged Dick set the pattern for most of Alger’s subsequent novels. It opens with its title character living on the street and surviving by selling newspapers and shining shoes. Dick is not a “model boy,” for he is given to the “extravagance” of going to the Old Bowery Theatre, smoking good cigars, gambling, and treating his friends to oyster stew. Like all Alger’s protagonists, though, Dick is honest by
nature, “frank and straight-forward, manly and self-reliant” (7). In the opening scene, he is hired by a wealthy businessman to give his son a tour of New York. Dick demonstrates his comprehensive knowledge of the city, outwits a swindler and returns the man’s ill-gotten gains to their rightful owner, is accused and proven innocent of theft himself, and displays a propensity to pun. His encounter with the wealthier boy, along with the kindness of the boy’s father, inspires Dick to renounce his extravagant activities, use the money he has saved to rent a room, and aspire to a morally and economically better life. Quickly realizing that literacy is crucial to success, he offers to share his bed with his friend Fosdick if the younger boy will teach him to read. After a number of adventures, including capturing another thief, Dick bravely dives off the side of a ferry to save a rich man’s daughter from drowning. In gratitude, the man sends him a new suit of clothes and hires Dick as a clerk. Having earlier dropped the epithet “Ragged” from his name, on the last page Dick declares himself to be “Richard Hunter, Esq.,” signaling his transformation into “a young gentleman on the way to fame and fortune” (132). Although elements of street life and the route out of it are undoubtedly sentimentalized in tales like this one, Alger brought to his stories some qualities of realism and material specificity that had been lacking in boys’ fiction like the Oliver Optic series by William Taylor Adams, a previous editor of Student and Schoolmate. Alger’s familiarity with the city enabled him to describe its geography and physical details with great accuracy; indeed, Dick’s tour of New York ranges from City Hall to the still-unfinished Central Park, including public
buildings, department stores, and mansions of the rich and famous. In this and other books, readers were taken into communal lodging houses for the poor, cheap boardinghouses, and luxurious suites in the Astor Place Hotel. They were shown something of what it was like to sleep on the street in a box, to shine rich men’s shoes, and to be hungry and cold without a penny in one’s pocket. And they were provided with details of how boys in these spaces and situations were able to cope, ranging from places to find cheap meals and cheaper beds to itemized budgets explaining how to live on a minuscule income, still setting aside a few cents a week for a rainy day. Some of Alger’s later novels lack such specificity—for instance, those set in the West rather than in the urban East—but even in those he aspired to a degree of accuracy, taking a journey to the West in 1876–1877 to gather material and experiences for future novels. And although they may seem didactic to us today, Alger’s morals were conveyed somewhat more subtly than those in antebellum children’s fiction. Where Alger’s realism verges farthest into fantasy is in his account of the human and economic interactions that lead boys from rags to respectability. In Alger’s world—whether the scene takes place in the city, on a train, in the West, or in the country—the boy protagonist is always on public display. There is always a wealthy patron waiting in the wings, able to recognize the virtue and capacities of the hero and ready to reward him for rescuing the tycoon’s daughter from drowning, for foiling a confidence man or thief, or for aiding a younger boy in distress. The older, richer man is able to see the street boy’s qualities beneath the rags and dirt that cover them: as one such
benefactor describes the protagonist of Silas Snobden’s Office Boy: “He is a very good looking boy, and he looks good, which is still better” (91). Surprisingly for those who imagine the Horatio Alger tale as one in which a virtuous, hardworking boy pulls himself up by his own bootstraps, more often than not what sets the boy on the road to the middle class is a wealthy man’s benevolence, based on a similar snap judgment of a boy’s appearance. With Ragged Dick, Alger began a pattern that dozens of his later books would follow. The novels end with their protagonists comfortably ensconced in their new respectability: clean; literate; owning a set of new clothes; renting a cozy apartment; working as clerks or personal assistants to much richer men; and possessing books, furniture, and small but growing bank accounts. And in the final lines Alger or his publisher promises a sequel. In the next novel in the series, the previous book’s protagonist is generally a minor character who aids a younger boy in his rise to respectability. Thus Ragged Dick was followed by Fame and Fortune, Mark the Match Boy, and three more in the Ragged Dick series, which in turn were followed by Luck and Pluck; or, John Oakley’s Inheritance, the first of the eight-book Luck and Pluck series. Alger used this formula faithfully for the rest of his life, deviating only when he occasionally tried to write adult fiction for women (as in The Disagreeable Woman and A Fancy of Hers) and when he once made his homeless “street Arab” a girl (in Tattered Tom, whose title character early on reveals her real name to be Jane). Still more rarely, Alger put his formula in the service of a social cause. Phil, the Fiddler; or, the Adventures of a Young Street Merchant depicted the depredations of Italian
immigrant padrones, who sent young, illiterate Italian boys into the street to play music and then absconded with the money they had earned. Alger claimed that his novel put an end to this practice by exposing it to public view. The curious thing about this claim is that if Alger’s novels were really addressed to young boys, their readership was unlikely to have the power to effect such a change. However Alger imagined his audience, it expanded exponentially after his death. At the end of his life he estimated his total lifetime sales at 800,000 volumes (Scharnhorst and Bales 1985, 149). Then, starting a year after Alger’s death, Edward Stratemeyer released eleven novels of his own under Alger’s name. A decade later, many of Alger’s own books were back in print and reaching annual sales of 1 million or more. His work declined into obscurity again in the 1920s, but at the same time there emerged a mythical version of “the Horatio Alger story” as popular magazines and newspapers constructed a completely fictitious version of his fictions. A 1943 Atlantic Monthly article, for instance, inaccurately described them as evincing a “faith in laissez-faire, in the best of all possible worlds, in the inevitability of rags to riches” (quoted in Scharnhorst and Bales, 154). In times of economic crisis from the Depression to the recession of the 1980s, the Horatio Alger myth became a bulwark against criticism of American capitalism to be invoked by popular historians or by Ronald Reagan, himself a winner of the Horatio Alger Award along with Dwight Eisenhower, Conrad Hilton, Norman Vincent Peale, and Alfred Fuller, founder of the Fuller Brush Company. As if in oblique commentary on this mythologization, in 1928 the journalist
Herbert R. Mayes wrote the first biography of Horatio Alger: Alger: A Biography without a Hero. Undaunted by having access to no actual facts on Alger’s life whatsoever, Mayes made up some juicy ones. He invented a diary Alger supposedly kept, quoted from nonexistent letters, and attributed to Alger a set of affairs with women in Paris. Fifty years later, Mayes publicly confessed that what he had concocted was nothing but “a fairy tale,” but like the misrepresentation of Alger’s novels as “rags-to-riches” tales, this fiction had more power than the facts. In the ensuing decades, Mayes’s “research” became the basis for encyclopedia entries and more full-length biographies, several of which not only reproduced Mayes’s deliberate falsehoods but added new ones of their own. A 1974 biography by Edwin P. Hoyt added, with sensational flair, the facts about Alger’s dismissal from the ministry but otherwise recapitulated the problems of previous books. Only with Gary Scharnhorst and Jack Bales’s 1985 The Lost Life of Horatio Alger, Jr. did we get a reliable account of Alger’s life and works.

Glenn Hendler

See also Newsboys

References and further reading
Alger, Horatio, Jr. 1973. Silas Snobden’s Office Boy. 1889–1890. Reprint, Garden City, NY: Doubleday.
———. 1985. Ragged Dick; or, Street Life in New York. 1868. Reprinted in Ragged Dick and Struggling Upward. New York: Penguin.
Cawelti, John. 1965. Apostles of the Self-Made Man: Changing Concepts of Success in America. Chicago: University of Chicago Press.
Hendler, Glenn. 1996. “Pandering in the Public Sphere: Masculinity and the Market in Horatio Alger.” American Quarterly 48, no. 3 (September): 414–438.
Horatio Alger Association of Distinguished Americans, http://www.horatioalger.com.
Mayes, Herbert R. 1928. Alger: A Biography without a Hero. New York: Macy-Masius.
Moon, Michael. 1987. “‘The Gentle Boy from the Dangerous Classes’: Pederasty, Domesticity, and Capitalism in Horatio Alger.” Representations 19 (Summer): 95–97.
Nackenoff, Carol. 1994. The Fictional Republic: Horatio Alger and American Political Discourse. New York: Oxford University Press.
Scharnhorst, Gary, and Jack Bales. 1981. Horatio Alger, Jr.: An Annotated Bibliography of Comment and Criticism. Metuchen, NJ: Scarecrow Press.
———. 1985. The Lost Life of Horatio Alger, Jr. Bloomington: Indiana University Press.
Allowances

Allowances are sums of money regularly transferred from parents to boys, primarily for teaching them to manage money wisely, and are commonly viewed as discretionary income. However, some parents may expect allowances to cover more than discretionary spending. For example, one parent may expect a boy to pay for bus fare, school lunches, and extra clothing with an allowance, whereas another parent will expect the allowance to cover birthday and Christmas gifts. Many parents expect their sons to save a small portion for future purchases or charitable contributions. Parents often link allowances to household chores, but this practice is hotly debated by parenting experts. Some parents feel that children, as family members, are entitled to receive allowances. Others believe that unearned allowances used for discretionary purposes do not prepare children for adult responsibilities such as rent, car payments, or school loans. They
recommend that boys earn their allowances by performing household chores to prepare them for the adult world, where one works in exchange for wages. However, the latter arrangement may create tension within the family if the boy decides to decline the money and, therefore, fails to do chores. As a compromise, some parents give small allowances with no strings attached and then allow their sons to earn extra money by doing chores parents might otherwise pay someone outside the household to perform, such as babysitting or mowing the lawn. Some parents have reported giving allowances to their three-year-olds, but most start at age five or six. Often, boys will receive allowances until they graduate from high school or even while attending college. Younger siblings often receive allowances at an earlier age, since they are aware of the allowance arrangements of their older siblings and desire equitable arrangements. Some parents, particularly those who subscribe to the entitlement philosophy of allowances, may stop giving allowances when their sons start working at part-time jobs. It is recommended that the allowance be large enough to purchase commodities suited to the child’s age. Current recommendations for an appropriate allowance fall between 50 cents and $1 a week for each year of life (by this rule of thumb, a ten-year-old would receive $5 to $10 a week), but within the family budget and comparable to what the child’s peers receive. This allowance can then be increased as necessary or on an annual basis, such as on the boy’s birthday. Some boys take on more responsibilities as their allowance increases. For example, a teenager may be expected to begin purchasing some clothing. It is recommended that younger children receive their allowances frequently—once or twice
A young boy earns his allowance by doing the dishes, 1937. (Bettmann/Corbis)
weekly—always on a regularly scheduled day. Older, more mature children may be able to budget their allowances on a biweekly or monthly schedule. Giving an allowance is usually a family custom, with parents giving allowances if they received allowances as children. In the late twentieth century, roughly half of all children received allowances (Pabilonia 2000). Parents who do not give their children allowances give money to their children spontaneously when the need arises. Allowances were not always as prevalent as they are today. In the 1890s, only 28 percent of 4,000 California children
surveyed received allowances (Kohler 1897). Between the 1870s and 1930s, the value of children changed from being economically useful to economically useless as a result of a strong anti–child labor movement (Zelizer 1985). As children’s earnings from the marketplace declined, allowances from their parents rose as their principal source of money. In Middletown (1929), Robert S. Lynd and Helen M. Lynd provide an example of this change by discussing the spending behavior of high school students living in a small midwestern town in 1924. In contrast to their parents’ generation, these high school students carried more spend-
ing money and were required to make more frequent and larger monetary transactions on their own, in part to pay for their more elaborate social life. Parents in Middletown appeared to be socializing their sons to be self-sufficient; parents were less likely to give sons spending money and encouraged them to earn their own spending money. In contrast, their daughters were socialized to be monetarily dependent as their mothers were; they were more likely to receive allowances or cash as the need arose. Those receiving all their spending money from their parents included 53 percent of the girls but only 15 percent of the boys; 21 percent of girls and 17 percent of boys received some of their spending money in the form of an allowance. Even during the Depression, some parents found money for allowances. In 1933, among 110 families living in twelve midwestern and central states, half the parents gave allowances, usually starting when their children were between the ages of four and eight. Many children (36 percent) received extra money in addition to regular allowances. Children up to age eight received about 10 cents per week, and twelve-year-olds received up to 25 cents per week (Morgan 1939). During this period, parents gave allowances to teach their children to save. Allowances were relatively small, and penny candy was a favorite purchase. Following World War II, however, allowances were given with the emphasis on teaching children to be consumers. All or most of an allowance would be spent at the child’s discretion, to learn how to budget money and to make wise purchases. Although candy was still an important purchase category, in the latter part of the twentieth century, boys spent their allowances on more expen-
sive items such as entertainment, toys, and clothing (McNeal 1987). By 1981, when double-digit inflation had taken its toll, Money magazine reported that children aged five to twenty-one received allowances double those received by children in 1972; in the same period, adults’ personal income had risen 70 percent. Weekly allowances in 1981 ranged from 25 cents for children under age seven to $5 for children age thirteen. Youths aged thirteen to fifteen received total cash transfers from their parents averaging $10 per week, whereas those aged sixteen to nineteen received more than $20 per week. Even in the late 1980s, parents gave allowances so their children could learn to responsibly handle money and purchase items for themselves (McNeal 1992). About 55 percent gave allowances, usually weekly. The average weekly allowance for children aged four to twelve was $2.34. Unlike the parents of two decades earlier, 60 percent of parents expected children to work for their allowances. About three-fourths of children saved some of their allowances (Sports Illustrated for Kids 1989). Children’s savings rate was about 30 percent, with savings lowest for eight- and nine-year-olds. Young boys saved less than young girls, possibly because boys are traditionally socialized to become independent spenders at an early age (McNeal 1992). Children who were allowed greater discretion in the use of their allowances tended to save less. All children saved at home, and many of the older children had commercial savings accounts. The majority of children (58 percent) reported buying candy with their money. Other popular items were toys, soft drinks, Christmas presents, books, and magazines.
In 1987, the Gallup Youth Poll found that the probability of receiving an allowance decreased sharply as a boy’s age increased. Fifty-five percent of youths aged thirteen to fifteen received allowances, but only 33 percent of youths aged sixteen to seventeen received allowances. Among those receiving allowances, the majority (88 percent) performed chores in exchange for their allowances. Teenage boys received, on average, smaller allowances (total cash transfers) than did girls, $8.39 and $11.71 per week, respectively, probably reflecting differences in time spent performing household chores. Boys were more likely than girls to do chores involving outdoor duties, such as mowing the lawn, and less likely to do indoor tasks, such as cooking, laundry, and babysitting. These indoor tasks tended to be greater in number and collectively required greater time input than outdoor tasks (Gagner, Cooney, and Call 1998). Teens from the South were the most likely to receive allowances, but teens in the Midwest received the largest allowances. Teens living in metropolitan areas received higher allowances than those living in smaller cities (Bezilla 1988), which may reflect differences in teens’ spending opportunities and the cost of living. Among teens at a selective New York City high school, boys who had never held jobs were more likely to receive allowances (Miller and Yung 1990). Many boys reported that their allowances stopped when they started working. Contrary to their parents’ attitudes, teens perceived allowances as either family entitlements or housework payments rather than as devices to learn to manage money. In the 1990s, receipt of an allowance depended upon age and race: younger children and African American children were
the most likely allowance recipients. In 1992, boys aged twelve to eighteen received average monthly allowances of $39.53, 13 percent more than girls received; but boys were paid 19 percent less than girls for extra chores performed around the house (chores not required for receipt of an allowance). Caucasian children received the highest allowances, with African American children receiving more than other minority children (Meeks 1998). As in the 1980s, older boys were less likely to receive allowances as their probability of being employed increased: 59 percent of twelve- and thirteen-year-old boys received allowances in 1996, versus only 37 percent of sixteen-year-old boys. Allowance amounts increased with parents’ income and decreased as the number of siblings within the household increased (Pabilonia 1999). The most common frequency for dispersal of allowances was weekly, although a sizable percentage of youths received allowances monthly (64 and 15 percent, respectively). Median allowance amounts for boys receiving allowances increased with age as their parents allowed them more independence in their expenditures. Different researchers report different amounts for allowances at the end of the twentieth century. According to Sabrina Pabilonia (1999), the median weekly allowance for twelve- and thirteen-year-old boys was $4.80, and the median weekly allowance for sixteen-year-old boys was $7.70. The only gender difference in the median allowance received was for sixteen-year-olds, with boys receiving approximately $1 less than girls. Zillions magazine (“National Allowance Survey” 1999) reported that in 1998 half of all children aged eight to fourteen received allowances. On average, eight- and nine-year-olds received $3.74 per week, ten-
and eleven-year-olds received $5.19 per week, children aged twelve and thirteen received $6.66 per week, and fourteen-year-olds received $9.45 per week. Throughout the 1990s, Zillions (“National Chores Survey” 1999) documented that children received extra money for doing special chores. Teenage Research Unlimited (2000) found that in 1999, 26 percent of twelve- to nineteen-year-olds received allowances. With the unemployment rate low in the late 1990s, more teens worked and fewer received allowances than in the past, although the lower percentage than reported in previous studies is probably partly attributable to the inclusion of eighteen- and nineteen-year-olds, who would be the most likely to be working. According to the Rand Youth Poll (2000), which included money received from any source besides market earnings, boys aged ten to twelve received, on average, $21.90 per week from their families. They spent their money on entertainment, food, hobbies and toys, and clothing. Thirteen- to fifteen-year-old boys received $30.50 per week on average and spent most of it on food, whereas girls of the same age spent most of their money on clothing. Sixteen- to nineteen-year-old boys received, on average, $41.15 per week and spent most of it on dating and entertainment. As the boys aged, their savings rate decreased from 31.3 percent for ten- to twelve-year-old boys to 14.9 percent for sixteen- to nineteen-year-old boys. Most of the boys’ savings were short-term savings for the purpose of purchasing large-ticket items. Only 20 percent claimed to be saving for long-term goals, such as their college education.

Sabrina Wulff Pabilonia
References and further reading
Bezilla, Robert, ed. 1988. America’s Youth: 1977–1988. Princeton, NJ: Gallup.
Gagner, Constance T., Teresa M. Cooney, and Kathleen Thiede Call. 1998. “The Effects of Family Characteristics and Time Use on Teenage Girls’ and Boys’ Household Labor.” Princeton University Center for Research on Child Wellbeing. Working Paper Series no. 98-1.
Grace, Catherine O’Neil. 1998. “Kids and Money: Valuable Lessons.” The Washington Post, June 23, Z22.
Kohler, Anna. 1897. “Children’s Sense of Money.” Studies in Education 1, no. 9: 323–331.
Lynd, Robert S., and Helen Merrell Lynd. 1929. Middletown: A Study in American Culture. New York: Harcourt, Brace.
McNeal, James U. 1987. Children as Consumers: Insights and Implications. Lexington, MA: D.C. Heath.
———. 1992. Kids as Customers: A Handbook of Marketing to Children. New York: Lexington Books.
———. 1999. The Kids Market: Myths and Realities. Ithaca, NY: Paramount Market Publishing.
Meeks, Carol. 1998. “Factors Influencing Adolescents’ Income and Expenditures.” Journal of Family and Economic Issues 19, no. 2: 131–150.
Miller, Joanne, and Susan Yung. 1990. “The Role of Allowances in Adolescent Socialization.” Youth and Society 22, no. 2: 137–159.
Morgan, Winona L. 1939. The Family Meets the Depression. Minneapolis: University of Minnesota Press.
“National Allowance Survey: How Do You Compare?” 1999. Zillions (January–February): 8–11.
“National Chores Survey.” 1999. Zillions (March–April): 20–23.
Pabilonia, Sabrina Wulff. 1999. “Evidence on Youth Employment, Earnings, and Parental Transfers in the National Longitudinal Survey of Youths 1997.” Presented at the NLSY97 Early Results Conference at the Bureau of Labor Statistics, Washington, DC, November 18–19.
———. 2000. “Youth Earnings and Parental Allowances.” University of Washington working paper.
Rand Youth Poll. 2000. Teen-age Personal Spending Continues to Climb While Youths’ Overall Impact on Economy Intensifies. New York: Rand Youth Poll.
Sherman, Miriam. 1986. “Children’s Allowances.” Medical Aspects of Human Sexuality 20, no. 4: 121–128.
Sports Illustrated for Kids Omnibus Study. 1989. Cited on p. 29 in Kids as Customers: A Handbook of Marketing to Children. Edited by James U. McNeal. New York: Lexington Books, 1992.
Teenage Research Unlimited. 2000. Teens Spend $153 Billion in 1999. Northbrook, IL: Teenage Research Unlimited.
Tuhy, Carrie. 1981. “The Star Wars Generation Takes on Inflation.” Money 11 (July): 88–96.
Zelizer, Viviana A. 1985. Pricing the Priceless Child: The Changing Social Value of Children. New York: Basic Books.
Amusement and Theme Parks

Today the terms amusement park and theme park are used interchangeably, but in origins, design, intent, and effect they serve markedly different boyhood audiences. Although the American amusement park dates back to the end of the nineteenth century, the theme park first entered modern American consciousness with the televised opening of Disneyland in 1955. Amusement parks, themed or unthemed, offer limited but intense experiences. Their attraction lies in the immediate physical gratification of the thrill ride—the apparent defiance of physical laws of action and reaction, the exhilaration of speed, the push and pull of gravity, and the rush of adrenaline in response to the illusion of potential bodily harm. For the teenage male, challenging the limits and risk taking have high risk-for-gain value in building self-assurance and a sense of personal mastery necessary for self-actualization.
The term theme park came into public usage several years after Disneyland’s opening. It was coined by a journalist at the Los Angeles Times when it became obvious that Disney’s creation could not be faithfully described with the terminology of the traditional amusement park. The theme park is a virtual reality, a three-dimensional movie in which the audience moves around, interacts, and reads its own personal plots and subplots into the script. Theme parks are planned, built, integrated, and unveiled as a unified design in order to preserve their narrative integrity, with ride vehicles acting much as the lens of a movie camera, positioning the visitor’s viewpoint for the next scene in a scripted sequence of events. This detailed holistic evocation of a place in time is one of the major elements that distinguish the theme park from the amusement park. Most popular theme park attractions are narrative journeys that explore—in a child’s terms—big themes: exploration, discovery, risk, self-discovery, and good versus evil. Theme parks offer the opportunity for role reversal and father-son bonding. The Disney theme park audience slowly begins to erode around age fourteen, when boys begin to become cognizant of their impending adulthood, and they will not return on their own until they become fathers, renewing the cycle again with their own sons. During their own transition from teen to adult, they tend to shift their allegiance to more challenging venues, such as the high-tech and high-speed thrill rides of themed amusement parks like Six Flags Magic Mountain, Busch Gardens, and Universal Studio’s Islands of Adventure. New York’s Coney Island saw the development of the first true American amusement parks. George C. Tilyou in-
Coney Island, site of the first American amusement parks, 1910 (Library of Congress)
vented the first amusement park in 1897 by surrounding his 14-acre Steeplechase Park with a fence and charging admission in order to separate his clientele from the pickpockets, con men, and prostitutes who haunted Surf Avenue. From this island of safety, visitors were free to let down their guard and play. Many of the visitors were young working-class men and women who labored in city factories and shops and used some of their earnings for pleasure. Young men, who generally received higher pay than did women, often “treated” their dates to a trip to Coney Island and to admission to the rides there. Coney Island thus became one of the first destination resorts for young people, a place where they could escape the watchful eyes of their parents and other family members and court as they pleased. Frederic Thompson and Skip Dundy’s Luna Park (1903) and William H. Reynolds’s Dreamland (1904) were Coney Island’s contribution to the theme park genre. Unlike the better-known Steeplechase Park, which featured thrill rides
like the Ferris wheel and roller coaster, Luna Park and Dreamland used the “experiential” powers of the theme park medium to re-create other times, places, and worlds. Along with the thrill rides, visitors could take “A Trip to the Moon,” emerging from their space capsules to stroll the lunar landscape and be accosted by “moon men” (giants and dwarfs) and “moon maidens” handing out samples of green cheese. They could voyage “20,000 Leagues beneath the Sea,” experience the eruption of Vesuvius and destruction of Pompeii, and view the creation and the end of the world, and even go to hell—complete with fluttering tissue-paper flames and menacing demons. These were the precursors, in both spirit and technology, of contemporary attractions such as Disneyland’s Matterhorn and Universal Studio’s Earthquake: The Big One. In other cities across the nation, transportation companies saw amusement parks as a means to increase ridership on weekends. “Trolley parks” such as Willow Grove Park outside Philadelphia and Cedar Point in Sandusky, Ohio, grew into
popular attractions but were still primarily aimed at the adult audience, featuring entertainment such as band concerts directed by John Philip Sousa. By the end of World War II, amusement parks were a dying industry. Not only had the war deprived them of their primary audience of young males, but also returning veterans had little interest in the manufactured thrills offered by the timeworn venues. The revival of the park genre in the 1950s was made possible by a unique confluence of events. They included the GI Bill, which fueled expansion of the middle class by providing the working population with an education and job opportunities formerly reserved solely for the elite, enhanced geographic and class mobility, more leisure time, and greater discretionary income. Equally important were the new system of interstate highways, the population explosion of the baby boom, and the equalizing nature of television, which enabled a continentwide, ethnically diverse population to instantly share a common set of values, memories, and cultural benchmarks. The rise of the new, mobile, television-raised middle-class family with a shared cultural experience made a permanent pay-as-you-go theme park feasible and allowed Walt Disney to reinvent the genre as a child-focused family entertainment. The first successful non-Disney theme park, Six Flags over Texas, opened in 1961 in Arlington, between Dallas and Fort Worth. Since the principles of communication by theming can be applied to many subjects, from pure fantasy to hard science, contemporary theme parks come in a broad spectrum of styles and specialties. From Disney’s single 200-acre park in Anaheim, California, theme parks and the revitalized amusement parks have grown to an industry that
takes in more than $4 billion a year. At the same time, operating costs, including capital improvements, training, maintenance, enhancement, and marketing, are also high, driven in part by the trend toward constant innovation in attractions to encourage repeat visitation. After several corporate conglomerates acquired theme parks in the 1970s, these high capital reinvestment costs caused them to divest quickly. Today the most successful parks operate under a few corporate banners focused exclusively on entertainment, but their true value lies in the shared experiences and memories of boys’ first park adventures. The park experience is often considered part of a personal and family legacy, handed down through the generations. As safe public spaces, they offer both validating and challenging experiences that conform to established psychological development models of human needs. It is the way visitors use the parks, beyond their explicit design purposes or entertainment value, that makes them popular platforms for the acting-out of personal and shared identities integral to American boyhood. In the protected venues of the parks, boys can unleash their potential, trying out hero roles, mastering fear, and exploring new and fantastic worlds. The value of these experiences lies in their power to drive development and growth, to realize human potential. Park attractions are dramas that reward each age group according to its needs. Attractions are designed to provide experiences suitable for every age level. For small children, value may lie in the repetition, movement, color, and initiation into the three-dimensional arts of the carousel with its easily understood circle route, repeating calliope music, colorful horses, and the predictable but fantastic
Riding a carousel, 1951 (Archive Photos)
journey. For many small boys this is their first real journey away from their parents—an abbreviated gallop-and-return cycle, testing the outside world but well within the parents’ repeating and reassuring view. Other attractions act as transformers, putting small boys in familiar adult roles. Drive-yourself miniature automobiles are a park staple. They serve as a platform for role reversal, with the boy driving miniature gas-powered automobiles and the parent as the passenger, putting the boy in control and giving him an early introduction to the social experience of car culture. It is as much an experience of American values as of rules of the road. These and similar experiences assume importance both in confirming identity
and in forming bonds with others. Mastery of these attractions demarcates one life stage from another within childhood, working up to the momentous transition from childhood to adolescence. Children constantly push the envelope by measuring themselves against the “you must be this tall” signs that are the entry requirement to the more advanced attractions. So strongly are these boyhood experiences tied to the development of individual identity that they resonate well into adulthood. Inevitable changes to a key attraction can stir strong emotions, such as when adult males in their thirties and forties, proclaiming themselves “Toadies,” vehemently protested Disney’s closing of the irreverent and edgy “Mr. Toad’s Wild Ride” in favor of a new attraction
featuring the more amiable—and heavily marketed—Winnie the Pooh. Once boys reach their teen years, thrill ride–based amusement parks are ideal venues for them to test themselves as individuals, separate from their parents. At this stage, in the parks as in the home, peers are looked to as the primary legitimate validators of experience. Both theme and amusement parks are attempts to construct a total alternate universe, an enclosed environment physically separate and distinct from the everyday world. Today, the creation of an alternate world is available to the general public in the variety of themed environments, but the core attraction remains the same. Park environments are a reflection of the idealized identity of the visitor. Visitors gravitate to the parks that reflect and celebrate the virtues they most admire and cultivate in themselves.

Jamie O’Boyle

References and further reading
Adams, Judith. 1991. The American Amusement Park Industry: A History of Technology and Thrills. Boston: Twayne Publishers.
Blake, Peter. 1973. “The Lessons of the Parks.” Architectural Forum (June): 28ff.
Findlay, John M. 1992. Magic Lands. Seattle: University of Washington Press.
King, Margaret J. 1981. “Disneyland and Walt Disney World: Traditional Values in Futuristic Form.” Journal of Popular Culture (Summer): 114–140.
Marling, Karal Ann, ed. 1997. Designing Disney’s Theme Parks: The Architecture of Reassurance. New York: Flammarion.
Nasaw, David. 1985. Children of the City at Work and at Play. Garden City, NY: Anchor Press/Doubleday.
Nusbaum, Paul. 1994. “Crowded House: Fun and Gaming.” Philadelphia Inquirer, May 29, 11ff.
Peiss, Kathy. 1986. Cheap Amusements: Working Women and Leisure in Turn-
of-the-Century New York. Philadelphia: Temple University Press.
Snow, Richard. 1989. Coney Island: A Postcard Journey to the City of Fire. New York: Brightwater Press.
Apprenticeship

Apprenticeship originated in the ancient Egyptian, Greek, and Roman worlds as fathers bound their sons to artisans in order to give them an education in the skilled trades. By the thirteenth century the guild system in western Europe institutionalized the practice to limit entrance to skilled crafts through formal training programs. Although guilds deteriorated with increased commercialization by the sixteenth century, state regulations bolstered the practice. The guild system did not take root in the American colonies, but customs and laws regulating apprenticeship were brought by European settlers to the New World. In colonial America, apprenticeship was the time-honored method of learning a craft. By the late eighteenth century, however, the ideology of the American Revolution and an increasingly commercialized economy caused the system to break down. As the nation industrialized in the nineteenth century, boys or their parents were paid wages, and formal apprenticeship was replaced by cheap juvenile labor. Nevertheless, some forms of apprenticeship continued into the twentieth century, sponsored by business and labor unions. As public schooling became widespread and compulsory education laws were passed, many of these programs sought to combine work with school. In the colonial period, sometimes as young as age seven or ten but usually about age fourteen, a boy would be placed by his parents or guardian under indenture. This contract was written in
Benjamin Franklin in his printing shop with apprentices, from a painting by Charles Mills (Library of Congress)
duplicate on a single sheet; after each part was witnessed and signed, the sheet would be torn and each party given a copy. The boy would serve until his early twenties, learning his craft by working, and providing his master with labor in return for instruction in skills. A master was obligated to supply room and board mixed with parental guidance, some education in reading and writing, and a youth’s freedom dues and perhaps a suit of clothes when he progressed to journeyman and worked for wages. The apprentice promised to obey his master, and not to marry, give away his master’s secrets, or buy or sell on his own account. Many colonial boys were exploited by their masters, who beat them or used their labor for other tasks. Yet fathers also sought to protect their teenage sons by binding them out to trusted relatives or breaking an indenture if the boy was unhappy or ill treated. And adherence to the practice was valued by arti-
sans in order to maintain quality of workmanship and to monitor entry into the crafts. The most famous colonial apprentice was Benjamin Franklin (1706–1790). Because he was fond of reading as a child, his father decided to apprentice him at the age of twelve to his brother James so that Benjamin could become a printer. The indenture provided that he serve until the age of twenty-one and be allowed the wages of a journeyman during his last year. Benjamin quickly learned the trade and avidly read books imported from London in his brother’s shop. He composed and printed vivid ballads, which sold briskly on the streets of Boston, and he taught himself to write by reading English periodicals and rewriting the material in his own words. At the age of sixteen his letters, supposedly written by a young widow, Silence Dogood, were anonymously slipped under the door of The New England Courant, his brother’s
newspaper. Benjamin delighted in the favorable reception of his letters but chafed under harsh treatment by his brother, who beat the boy to enforce his authority. At the age of seventeen, Franklin fled his apprenticeship before his indenture was fulfilled and ran away to Philadelphia, where he succeeded as a printer because of hard work and the skills he had acquired. During the American Revolution the contractual arrangements of apprenticeship eroded, and the authority of masters was undermined. In 1779, in British-occupied New York, twelve-year-old Stephen Allen and his older brother William were apprenticed to a sail maker without a formal indenture. Their widowed mother simply agreed that each boy would stay until he reached the age of twenty-one. Stephen felt that he and the other apprentices were not “well clothed, well fed, nor well-lodged,” for they ate in the cellar and slept in the garret of their master’s house (Thomas 1825). As trade thrived in the occupied city, the master took on two more apprentices and moved them all into the sail loft, where they ate breakfasts of cocoa and bread and dinners of “bitter” and “burnt” warmed-up stew in the yard or on the street. Their master furnished them with the ready-made clothing that sailors wore—roundabout jackets and canvas trousers. Stephen read his Bible and spelling book, but his loose apprenticeship did not include education, and his formal schooling ended when he was twelve. When the British left New York and his master fled with them, Stephen found himself a free laborer in hard times at age fifteen. With his skills he was able to find work at another sail loft, even though journeymen there objected to the hiring of a boy who had not served a full apprenticeship.
After the American Revolution, apprentices continued to resent authority and assert their rights. When apprentices in Thomas Adams’s Boston printing house were served moldy bread, they complained about it to the mistress of the house and later threw an offending loaf into a neighbor’s yard. When fourteen-year-old Millard Fillmore objected to chopping wood and his master chastised him, the future president allegedly raised his axe and said, “If you approach me, I will split you down” (Rorabaugh 1986). Some boys who were empowered by their religious conversions refused to submit to ungodly masters. Others sought liberty from drudgery or harsh masters as Franklin had, by running away. Yet, the ideology of the American Revolution included apprentices in the community of craftsmen, and they joined masters and journeymen in rituals that celebrated solidarity in the trades. Workingmen and boys in leather aprons paraded in northeastern cities, demonstrating their support for the new federal Constitution or celebrating the Fourth of July. Speeches affirmed the right of workers to personal independence, political equality, and just pay. For, ideally, apprenticeship was only temporary dependence on the path to becoming an independent master through instruction in skills and values in the small shop. Nevertheless, the new relationships of an increasingly capitalist economy eroded these ideals. As larger units of production emerged and workers were paid wages, apprenticeship deteriorated. Tailors, shoemakers, and printers, caught in the fluctuating economic conditions and seeking to benefit from cheap or unpaid labor, substituted boy “helpers” for trained journeymen. As skilled workers drew together in the early labor movement, they objected
to the declining standards of apprenticeship and the hiring of juveniles. In 1808, the New York cordwainers’ (shoemakers) general strike focused on the hiring of boys. In 1811, the city’s journeymen printers circulated an appeal to masters, complaining that employment of “young half-ways” led to declining quality and an “unnecessary multiplication” of workers in the trade (Wilentz 1984). As free labor became more prevalent and apprenticeship declined, bound labor was degraded, reserved more exclusively for blacks emerging from slavery and orphaned or destitute children. In 1780 the Pennsylvania legislature was the first to emancipate slaves, yet it protected an owner’s investment by allowing him or her to retain slaves born before the law went into effect and to keep in bondage until age twenty-eight all children born to slave mothers thereafter. The result was a blurring of slave and indentured labor and a brisk market in black boys not yet free who would serve for a large portion of their productive lives. The practice influenced indenture of free blacks, who also served until they reached the age of twenty-eight rather than the usual term of age twenty-one. Although such indentures provided a source of cheap labor for white masters, they were also used by black parents emerging from slavery as they struggled to gain an economic foothold in Philadelphia or the surrounding countryside. Indenture also continued to be used, as it had been throughout the colonial period, to place orphaned or destitute boys. Through the vendue system, as late as the 1820s rural towns in New England and New York auctioned off dependent children, often separating them from their families, to serve the household that placed the highest bid. The orphan
asylums founded after 1815 kept younger boys in the institution, but when they reached the age of ten or twelve, they were placed under indenture. Of the fifty-seven boys “bound out” by the Female Orphan Society in Philadelphia from 1822 to 1832, thirty-one were sent to farmers to become agricultural laborers. The rest were apprenticed to trades, ranging from cabinetmaker, shoemaker, or tailor to specialized crafts such as Windsor chair maker, silk dyer, coppersmith, jeweler, or ornamental gilder. Early reform schools, such as the houses of refuge founded by municipal governments in the 1820s, also used indenture to place older, unruly boys. By the 1820s apprentices were paid in cash and could find themselves doing the work of journeymen. In 1822, sixteen-year-old David Clapp, apprenticed to a Boston printer, received funds to cover his board in another house and was paid additional wages. When his master fell into debt, David was left to print jobs alone without supervision or additional training. The boy confided to his journal, “I still continue to work alone, with nobody but the mice, who scamper around the silent office as if they thought it had been deserted on purpose to oblige them” (Clapp 1822–1823). Six months later his master was in jail, and David, busily printing a hymnbook and a medical journal, had taught himself the trade. In 1826, fifteen-year-old Horace Greeley, who like many boys of his generation wanted to escape hard labor on the family farm, arranged his own apprenticeship with printers of a newspaper in a Vermont country town. According to a verbal agreement, Horace would earn only his board for the first six months but then be paid $40 a year plus clothing until he reached the age of twenty. When the edi-
tor left and a mercantile firm took over the paper, the office was “laxly ruled” and “as to instruction, every one had perfect liberty to learn whatever he could” (Greeley 1868). Greeley became one of the principal workers in the shop. With blistered fingers and sore back, hurried and overwhelmed by the work, he continued to publish the paper on an old-fashioned wooden press. Later in his life as editor of the New York Tribune, Greeley would support the ten-hour day for apprentices and wage-earning boys. Although the practice of apprenticeship deteriorated with the emergence of a commercial and industrial economy, it did persist in certain contexts. Until the late nineteenth century, lawyers and country doctors learned by reading and practicing with a master in their profession, although without an indenture. Municipal court authorities continued to bind out destitute or delinquent children until the early twentieth century, when Progressive reformers attacked child labor and enacted compulsory schooling laws. Traditional craft apprenticeship persisted in the practices of Germans and other immigrants, and innovative plans were inaugurated by businesses, which sought to develop their own skilled labor, and by craft and industrial unions, which hoped to limit entrance to work and to create strong union men. In the 1850s, the Baldwin Locomotive Works in Philadelphia indentured more than 300 sixteen-year-old boys for five-year terms. Eighty percent of these boys, who earned wages and lived at home or in boardinghouses, completed the program and were hired by the company as machinists, molders, boilermakers, blacksmiths, and carpenters. A new Baldwin plan in the early twentieth century recognized the emergence of the public school system by combining work
experience with night school and college classes. Labor unions also managed apprenticeship programs that combined work with schooling. As was the case in colonial America, apprenticeship was still linked with occupational mobility. Sons of union members received preference in enrollment in such programs, and as late as the 1930s serving an apprenticeship was still a crucial factor in promotion in craft and industrial work.

Jacqueline S. Reinier

See also African American Boys; Franklin, Benjamin

References and further reading
Clapp, David. 1822–1823. “Diary.” Worcester, MA: American Antiquarian Society.
Franklin, Benjamin. 1959. Autobiography. New York: Holt, Rinehart, and Winston.
Greeley, Horace. 1868. Recollections of a Busy Life. New York and Boston: H. A. Brown and J. B. Ford.
Licht, Walter. 1992. Getting Work: Philadelphia, 1840–1950. Cambridge, MA: Harvard University Press.
Reinier, Jacqueline. 1996. From Virtue to Character: American Childhood, 1775–1850. New York: Twayne Publishers.
Rorabaugh, William J. 1986. The Craft Apprentice: From Franklin to the Machine Age in America. New York: Oxford University Press.
Thomas, John C., ed. 1825. “Memoirs of Stephen Allen.” New York: New York Historical Society.
Wilentz, Sean. 1984. Chants Democratic: New York City and the Rise of the American Working Class, 1788–1850. New York: Oxford University Press.
Artists

During the Italian Renaissance a young boy who could draw realistic pictures could be placed in apprenticeship to a great master to learn the art of painting or sculpture. Pursuit of the artist’s profession
was justified by strong support from the patronage of the church and other wealthy citizens, who offered a steady supply of lucrative commissions for portraits, historical or religious paintings, and frescoes and murals for walls or ceilings. In the American colonial period, a boy demonstrating artistic talent would also have been guided into an apprenticeship, perhaps as a sign painter, ornamental gilder, or silversmith or goldsmith. Some youths traveled the country as itinerant artists, whereas others whose talent was recognized by a wealthy benefactor studied portrait and history painting in Philadelphia, New York, Boston, or even London. By the beginning of the twenty-first century, however, children’s drawings were consigned to the world of child’s play, engaged in spontaneously by young children but eschewed by youth in favor of sports, video games, and computers. Although still the benchmark by which earlier societies are judged, the visual arts hold an uneasy place in the life of modern society. Parents who delight in their young boys’ active and imaginative drawings guide these same youngsters toward a steadier and more accepted career path as they get older. Art programs are deemed a luxury within U.S. public schools and are usually the first to be eliminated when budgets get tight. One consequence is that children growing up today learn to devalue their talents and often fail to develop their artistic potential. Despite the current lack of societal support for the development of young artists, almost all young boys and girls begin to draw spontaneously somewhere around the age of three or four. Those with exceptional talent may begin even sooner, sometimes before the age of two. But whether a boy begins sooner or later, the early developmental sequence is much
the same. The first marks, often circular sweeping lines, follow the natural motor movements of his arm. These early, uncontrolled scribbles eventually give way to single lines and true closed circles as he gains enough control to interrupt circular, push-pull, and sweep movements of the arm. By the age of four or five, boys as well as girls aspire to create forms that represent real objects, persons, or animals. Early attempts may be simple action representations in which circles, dots, and lines are accompanied by rich and detailed narratives. Once boys assimilate the shapes and lines they are able to draw to the content of their narrative, the drawing becomes a truly intentional representation with anticipated results. Just as often, drawing topics are suggested to a boy by the scribble itself. A circle with an angle is seen as a bird because the angle resembles the sharp angle of a bird’s beak. By adding details such as “legs” to the underside of the circle, the resemblance is accentuated and the drawing becomes more birdlike. Such fortuitously derived drawings may be repeated many times before a “new” drawing spontaneously appears. This new drawing looks very much like the old, but several essential features are modified, transforming the “bird” into a “doggy.” For the young boy, the same general structure serves multiple purposes. The most consistently recognizable early drawing that boys and girls produce is a circle with one or more dangling lines, called “tadpole man” for its resemblance to a tadpole. A lively debate surrounds the interpretation of this enigmatic figure. It is usually assumed that the toddler is attempting to draw a person, but understanding how much or what parts of the person are depicted has engendered serious research. Does the
circle stand for the whole as an undifferentiated face and trunk, for the head alone, or is it simply the boy’s best solution to the problem of translating three dimensions into two while still maintaining a structural correspondence to the human figure? When young children are asked to place arms on a tadpole or related figures, the results suggest that the large tadpole circle is meant to represent the head alone and could result from a serial ordering of figure parts. According to this interpretation, the head and legs are end-anchored because boys and girls typically start a drawing with the head and end with the legs. The trunk and arms are omitted because they are intermediate in the series. When young boys and girls are asked to draw specific parts of a figure or a figure doing a specific action (e.g., picking a flower), however, they can be induced to include figure parts they normally omit. They are capable of more than they spontaneously draw on their own. By the age of five most boys are beginning to draw conventional figures that reliably include heads, trunks, arms, and legs. Drawings of animals, once simple modifications of generic forms, become increasingly recognizable as dinosaurs, horses, and dogs. When drawing human figures, boys often draw the head proportionally too large, not because they are insensitive to figure proportions, but because they have difficulty planning the amount of space needed for the whole figure. Typically, they start with the head and overestimate its size relative to the remaining space for trunk and legs. They also wish to make the head big enough to include eyes, nose, and mouth. When boys are asked to start with other body parts such as the trunk, they do a drawing that better approximates head-to-
Tadpole Figure by Boy age 3. Reprinted from: Constance Milbrath, Patterns of Artistic Development in Children: Comparative Studies of Talent, Cambridge University Press, New York, NY 1998.
trunk ratios of the visual standard for the human figure. At the same time that boys and girls are evolving more differentiated drawing schemas, they are also developing drawing formulas for the subjects they love to draw. For example, a boy will typically draw the same dinosaur (usually Tyrannosaurus rex or a brontosaurus) across a variety of contexts to stand for all types of dinosaurs. This extends to the orientation of a figure as well. People are invariably drawn in front view, but animals are drawn in side view. This tendency may result from a desire to use the view that best displays the subject’s characteristics. People are easiest to recognize in front view, but animals are more clearly differentiated
Dinosaur Chasing Cartoons by Talented Boy age 9 (Reprinted with permission of C. Milbrath)
in side view. However, if asked to draw a person running, a boy switches to a side view because this view best displays the action of running. The emergence of drawing formulas emphasizes the fact that young boys as well as girls rarely look at external models when they draw but rather consistently choose to draw from a “picture” in their mind. When asked to draw a specific object, boys will barely glance at it before responding by reproducing their tried-and-true formula for the object. When they do take pains to copy a model, they are limited by their level of cognitive maturity. These limitations are most evident in young boys’ depictions of spatial relationships between parts of an object or scene. When boys below the age of five attempt to copy a model of a square, cross, or cube, they are able only to copy the simplest spatial relationships internal to the figures. Boys’ ability to map three-dimensional spatial relationships onto a two-dimensional drawing surface develops slowly
and usually lags behind the normative development of their intellectual understanding of these same spatial relationships. The construction of a two-dimensional picture with the establishment of a true horizontal and vertical axis begins around the fifth or sixth year in most children. By age seven or eight, many boys add the depth plane by using simple positioning devices such as placing objects meant to be in the foreground below those meant to be in the background or overlapping objects so that those in the foreground partially occlude those in the background. However, for the most part, boys at this age ignore depiction of specific views in favor of clearly portraying the subject. Such drawings are called “object-centered” to distinguish them from drawings that are “view-specific” and take a specific view on an object or scene. As young boys enter preadolescence, they become more interested in portraying realism. Figure orientations become more varied, with youngsters attempting profile and even three-quarter views. At
this point, some preadolescents begin experimenting with different specific viewpoints through the use of foreshortening and projected views of geometrical objects. A very few will draw in true perspective as they enter adolescence. Children talented in drawing reach these same pictorial milestones approximately two years before their less talented peers, around the same time as these spatial relationships are understood intellectually by all children. More significantly, all or almost all talented children draw three-quarter views and foreshorten figures by age seven or eight and draw true perspective representations by early adolescence. Very few children who are not noticed for their talent in drawing achieve these same milestones. Preadolescent and adolescent boys appear to be much more interested in depicting spatial relationships in a drawing than girls of the same age. Whereas girls often draw figures fashionably dressed in portrait style, boys choose more active topics, like gun battles; superheroes in action; and planes, ships, trucks, or cars in active motion scenes. These active scenes allow greater exploration of the rules for depicting three-dimensional spatial relationships in the two-dimensional plane. Preadolescent and adolescent boys begin to outstrip their female peers in the depiction of projected and perspective views and in the use of other perspective indicators such as foreshortening, modeling of figures, shading, and shadow. This is particularly true when the drawings of talented youth are studied. Talented adolescent boys are excellent geometers, showing both the aptitude for and interest in solving difficult perspective problems. Talented girls are also capable of perspective drawings, but the drawings they prefer to execute depict
simpler spatial relationships. Whether the gender differences that appear in preadolescence and adolescence are simply a matter of preference for different drawing topics or are based on fundamental differences in spatial skills is a matter of conjecture. Studies that have looked at gender differences on spatial tasks have found that starting in adolescence and increasingly so in adulthood, males perform better on tests that require mental rotation of objects and transformation of two-dimensional spatial configurations into three-dimensional solutions. These skills would be particularly relevant when attempting complex figure orientations and perspective views. Both biological explanations that point to testosterone in males and experiential differences that point to educational biases have been used to account for these differences.

Constance Milbrath
References and further reading
Csikszentmihalyi, Mihaly, Kevin Rathunde, and Samuel Whalen. 1993. Talented Teenagers: The Roots of Success and Failure. New York: Cambridge University Press.
Freeman, Norman. 1980. Strategies of Representation in Young Children. London: Academic Press.
Freeman, Norman, and Maureen V. Cox. 1985. Visual Order. Cambridge: Cambridge University Press.
Gardner, Howard. 1980. Artful Scribbles: The Significance of Children's Drawings. New York: Basic Books.
Golomb, Claire. 1992. The Creation of a Pictorial World. Berkeley: University of California Press.
Milbrath, Constance. 1995. "Germinal Motifs in the Work of a Gifted Child Artist." Pp. 101–134 in The Development of Artistically Gifted Children: Selected Case Studies. Edited by Claire Golomb. Hillsdale, NJ: Erlbaum.
———. 1998. Patterns of Artistic Development in Children: Comparative Studies of Talent. New York: Cambridge University Press.
Parsons, Michael J. 1987. How We Understand Art: A Cognitive Developmental Account of Aesthetic Experience. New York: Cambridge University Press.
Thomas, Glyn V., and A. M. Silk. 1990. An Introduction to the Psychology of Children's Drawings. New York: New York University Press.
Voyer, Daniel, Susan Voyer, and M. P. Bryden. 1995. "Magnitude of Sex Differences in Spatial Abilities: A Meta-Analysis and Consideration of Critical Variables." Psychological Bulletin 117: 250–270.
Asian American Boys

According to U.S. Census Bureau estimates, 31.6 percent of the 11 million Asian Pacific Americans in the United States are nineteen years of age or younger. This figure for youth is higher than the national average of 28.7 percent (U.S. Census Bureau 2000). The most prominent Asian American ethnic groups in the United States, who represent the vast majority of the U.S. Census category "Asian Pacific Islander," are Chinese American, Japanese American, Filipino American, Korean American, Asian Indian American, Vietnamese American, Cambodian American, and Laotian American. For boys in these groups, issues such as identity and self-esteem; inter- and intraethnic relations; and emerging new multiracial, multiethnic, and sexual identities are vitally important. For Asian American boys, having a positive self-concept and high self-esteem is related to a multitude of factors. One major factor involves family and individual background, including the parents' value system, their economic background, and the socioeconomic status of
the family. Any generalized description of Asian American boyhood and youth is difficult because of the heterogeneity of this panethnic minority group. Asian Americans represent people whose ancestry originates from many countries. They comprise not only those who have been in the United States for generations but also those who are recent immigrants and refugees. Six out of ten Asian Americans were born outside the United States, and this proportion includes more than 1 million refugees and immigrants from Vietnam, Cambodia, and Laos who arrived in the United States following the end of the Vietnam War in 1975. The socialization and adaptation practices of the family will depend on whether the boy and his family are of native-born, immigrant, or refugee status. Recent literature on children of contemporary immigrants has covered not only U.S.-born children of immigrant parents but also foreign-born children who arrive in the United States before they reach adulthood. Min Zhou (1999) has identified three immigrant generations that represent 90 percent of Asian American children. The first generation includes the foreign-born who arrived in the United States when they were thirteen years old or older. The "1.5" generation includes the foreign-born who arrived in the United States between the ages of five and twelve. The second generation includes U.S.-born Asian Americans and the foreign-born who arrived in the United States at preschool age (from birth to age four). The importance of nativity and immigration status is clearly highlighted in early research that found that third- and fourth-generation Japanese Americans reported self-concept scores similar to those of white Americans.
A bewildered Nisei toddler stands at the feet of a G.I. guard while waiting to be taken to a relocation center, California, 1940s. (Library of Congress)
A later study of Chinese American children and youth in New York found that those who were born in the United States, as well as those who migrated early and had been in the United States for more than three years, had more positive self-concepts than did recent immigrants (Lee and Zhan 1998). Conflicts often arise in Asian American families in which immigrant parents hold the traditional values of their homeland. The 1.5- and second-generation children of immigrants are strongly influenced by their new environment and are eager to become Americanized. This desire is not uncommon in immigrant families, but several studies have shown that Asian American parents tend to hold on to attitudes and childrearing practices unique to their Asian cultures.
Immigrant Asian parents tend to be very protective, stress family cohesion over individualism, and exercise more control over their children's lives than non-Asian parents. This type of authoritarian parenting style can create low self-esteem and denial of one's heritage during childhood and adolescence, when young people wish to be as "American" as possible. Loi, a young Vietnamese American, describes such conflict in his own boyhood: "I really disliked myself, because I disliked my culture. I disliked my parents, I disliked all the things that happened to me, all the Vietnamese things. Now I see it in terms of me basically comparing myself, the distorted self image I got because I compared myself to a white person, a white standard" (Thai 1999, 65).
Some Asian American boys, however, choose to reject both their parents' culture and the dominant mainstream culture in favor of an alternative form of identity. David Moon, a 1.5-generation Korean American, describes his attraction to hip-hop culture: "Some of the rap artists I enjoyed listening to on the radio were LL Cool J, Kool Mo Dee, N. W. A., Ice-T, Public Enemy, Too Short, KRS-One, and Big Daddy Kane. I would go to my friend's house after school everyday, and we'd try to copy the latest dance moves by Bobby Brown and M. C. Hammer shown in their music videos" (Park 1999, 146). Along with nativity and immigrant status, another important factor for Asian American boys is their relationship with the social environment and society at large. An interesting example comes from a study by Walter G. Stephan and Cookie W. Stephan (1989), who surveyed Asian American college students in Hawaii. The researchers reported that in Hawaii, where Asian Americans comprise the majority of the population, the respondents had more negative attitudes toward whites than did Hispanic Americans on the mainland. On the mainland, Jean S. Phinney (1989) found that more Asian Americans preferred to be seen as Euro-American rather than Asian American. This insecure sense of ethnic self-consciousness was higher than that of either African Americans or Hispanic Americans. Among the most significant social concerns that affect inter- and intraethnic relations for Asian American boys and youth is the perpetuation of the model minority stereotype. This stereotype suggests that all Asian Americans are the same, and all experience social, economic, and educational success. As complimentary as it might sound, the stereotype
diverts attention from continuing discrimination and fuels competition between groups. George Louie, a Chinese American student in San Francisco, recognizes that the stereotype affects other people's perceptions: "You get called geeks and dorks and brainiacs. People make fun of Asians and say all we do is study. But, we really do have a life outside of school" (Guthrie 2000, C1). When nineteen Asian American teenagers were interviewed about their school experiences, they mentioned incidents of discrimination committed by high school teachers and students from other minority groups. They reported that teachers stereotyped Asian American students in their performance in art, math, and science and that very little of the refugee or immigrant experience was taught in school (Cowart, Wilhelm, and Cowart 1998). Stacy Lee (1996) provides a highly nuanced look at inter- and intraethnic relations relative to the model minority stereotype at a racially mixed Philadelphia high school. She found that there was not just one unified Asian American group but four self-defined Asian American groups. Each had its own distinct worldview, attitudes toward education, and relationships with others. A significant indicator of the diversity among Asian Americans is the gap between high household incomes and high poverty rates. In 1998 the median household income for Asian Americans was $46,637, higher than that of non-Hispanic whites ($42,429). Yet at the same time, 12.5 percent of Asian Americans lived in poverty, compared to just 8.2 percent of non-Hispanic whites. Two of the groups Lee classified, "Korean-identified" and "Asian-identified"
students, typically lived in more affluent neighborhoods and uncritically accepted the model minority stereotype. Most were first- and 1.5-generation immigrants who distanced themselves from U.S.-born Asian Americans, working-class immigrant students, and other racial minorities. Korean-identified and Asian-identified students aspired to be "more like whites," but non-Asian students and teachers often lumped them together with all other Asians. The other two groups, "new wave–identified" and "Asian American–identified" students, both rebelled against the model minority stereotype and sought their own ways to create alternate identities. New wavers were more typically recently arrived working-class immigrant students who were not academic achievers. They were often confrontational with other students, particularly with their African American peers, but they also incorporated many aspects of urban black culture and slang in order to separate themselves from whites and other Asian Americans. Asian American–identified students, however, represented the ethnic and socioeconomic diversity of Asian Americans in the high school. Most were either born in the United States or had been in the United States since they were young children. What set them apart was their orientation to social and political justice issues, criticisms of white racism and privilege, and desire to build bridges with other students of color. Issues of identity and self-esteem, along with inter- and intraethnic relations, change constantly for Asian American boys and youth. Recent changes are due in part to the increased numbers of multiracial Asian Americans, a result of fifty years of U.S. military presence in Asia and
An Asian American teenager (Skjold Photographs)
a recent increase in Asian American intermarriage in the United States. During the U.S. occupation of Japan following World War II and especially during the Korean War, American soldiers fathered children with Asian women. During the Vietnam War, as many as 50,000 Vietnamese Amerasians were born in Vietnam. Many of these multiracial Asian Americans eventually came to the United States, some with their fathers and others without them. The U.S. Supreme Court's invalidation of the last antimiscegenation laws in 1967 brought increased acceptance of interracial marriage. The 1980 census found that the intermarriage rate (both interracial and interethnic marriages) for Asian Americans was 25 percent. The vast majority of
intermarriages for Asian Americans were in fact interracial. This intermarriage figure was significantly higher than the 1 percent for non-Hispanic whites, 2 percent for African Americans, and 13 percent for Hispanic Americans (Lee and Fernandez 1998, Table 1). A distinct part of the diverse contemporary Asian American experience is that of the offspring of Asian and non-Asian interracial unions. Golf superstar Tiger Woods (one-eighth Native American, one-eighth African American, one-quarter white, one-quarter Thai, and one-quarter Chinese) is an example of a new generation of multiracial Asian Americans. In a study of Marshall High School in Los Angeles, Jeff Yoshimi found that most multiracial Asian Americans felt a great deal of fluidity in defining their identity and developing interpersonal relationships. Marshall High student Roy Cui, who is Filipino and white, said: "If anything I feel pretty much unidentified, I feel kind of free-floating . . . a lot of my friends are different ethnicities, so I've dabbled with a bunch of different cultures, so I don't feel too tied to any one" (1997, 136). Analysis of the 1990 census by Sharon Lee and Marilyn Fernandez (1998) found a dramatic decrease in Asian American intermarriage from 25 percent in 1980 to 15 percent in 1990. At the same time, the intermarriage rate for non-Hispanic whites rose to 3 percent, for African Americans, to 6 percent, and for Hispanic Americans, to 19 percent. Investigating these figures in depth, the researchers found that although interracial marriage rates for Asian Americans decreased significantly between 1980 and 1990, interethnic marriage rates nearly doubled from 11 percent to 21 percent during this same time period. These findings show a new trend of Asian American interethnic
marriage that may soon become more common than interracial marriage. If true, this trend will serve to expand a more generalized Asian American identity, further breaking down the maintenance of distinct Asian ethnic identities. Another area of change for Asian American boys and youth is increased attention to homosexuality. Asian American youth are confronting their sexual orientation at a younger age, but it is still very difficult to be open with family and friends. One young Vietnamese male, who immigrated to the United States when he was six, discusses his difficulty in explaining his sexuality:

It's funny, but most Asian people I know consider that being gay is an American thing, a white thing. Asian people, especially Vietnamese, don't talk about gay issues because for some reason, they believe only American people can be gay. When I was growing up and when I realized I was gay, I would never dare tell anyone in my family about my feelings and my frustrations. (Thai 1999, 63)

Despite these fears, the environment for Asian American gays and lesbians is less isolating today than in the recent past because more Asian Americans are openly accepting their sexual orientation and organizations have been established to support them. Lee (1996) describes high school student Stephen Chau, who identified as both Asian American and gay. He belonged to an organization outside school for gay Asians and participated in gay community activities in the city. He kept his sexual orientation from his classmates for fear of rejection, but he eventually "came out" at the end of the school term. To his pleasant surprise, he
did not face the ridicule or rejection that he had expected. The contemporary issues raised in this overview reflect new directions in research on Asian American boyhood and youth in the United States. The first new direction is the emphasis on the diversity of experiences and identity construction, which can be seen in the detailed attention paid to the variety of Asian American ethnic groups, nativity, immigrant and generation status, class background, and sexual orientation. Previous research focused more narrowly on Chinese and Japanese Americans and from them generalized an overall Asian American experience. It also either praised "positive" cultural aspects of filial piety and the amazing success of Asian American youth in education or narrowly focused on social problems of drug addiction and criminal activity. Another new direction in contemporary research is the increasing use of qualitative research methods. It is important to understand Asian American boys and youth through their own words and perceptions. The ever-changing dynamics of the Asian American experience leave a great deal of room for new research as well as updates of previous work. For example, how the media affect Asian American boys and male adolescents is poorly understood today. One widely reported area of low self-esteem for Asian American children and youth is physical self-concept. Studies of Korean Americans, Japanese Americans, and Chinese Americans have found that Asian American young people are more likely to have lower scores on physical self-esteem measures (Lee and Zhan 1998). This result is not surprising, given the Eurocentric view in the United States of physical attractiveness. Yet the question remains,
are these findings true today given the (still limited) increase in images of Asians and Asian Americans on television, in movies, in advertisements, and in professional sports? In sum, there is a need to expand the understanding of the experiences of Asian American boys and youth in order to move beyond stereotypical images. New directions in contemporary scholarly research in this area will help to bring greater descriptive and analytical insights to this constantly changing landscape.

Timothy P. Fong

See also Chinese American Boys; Same-Sex Relationships; World War II

References and further reading
Cowart, M. F., R. W. Wilhelm, and R. E. Cowart. 1998. "Voices from Little Asia: 'Blue Dragon' Teens Reflect on Their Experience as Asian Americans." Social Education 62, no. 7: 401–404.
Guthrie, J. 2000. "Not Geeks, Gangsters at Schools." San Francisco Examiner, May 14, C1, C5.
Lee, L., and G. Zhan. 1998. "Psychosocial Status of Children and Youth." Pp. 211–233 in Handbook of Asian American Psychology. Edited by L. Lee and N. Zane. Thousand Oaks, CA: Sage Publications.
Lee, Sharon, and Marilyn Fernandez. 1998. "Trends in Asian American Racial/Ethnic Intermarriage: A Comparison of 1980 and 1990 Census Data." Sociological Perspectives 41, no. 2: 323–343.
Lee, Stacy. 1996. Unraveling the "Model Minority" Stereotype: Listening to Asian American Youth. New York: Columbia University Teachers College Press.
Park, K. 1999. "'I Really Do Feel I'm 1.5!': The Construction of Self and Community by Young Korean Americans." Amerasia Journal 25, no. 1: 139–164.
Phinney, Jean S. 1989. "Stages of Ethnic Identity Development in Minority Group Adolescents." Journal of Early Adolescence 9, no. 1–2: 34–49.
Stephan, Walter G., and Cookie W. Stephan. 1989. "Antecedents of Intergroup Anxiety in Asian-Americans and Hispanic-Americans." International Journal of Intercultural Relations 13: 203–219.
Thai, H. C. 1999. "'Splitting Things in Half Is So White!': Conceptions of Family Life and Friendship and the Formation of Ethnic Identity among Second Generation Vietnamese Americans." Amerasia Journal 25, no. 1: 53–88.
U.S. Census Bureau. 2000. Statistical Abstract of the United States: 1999. Washington, DC: Government Printing Office.
Yoshimi, Jeff. 1997. "Hapas at a Los Angeles High School: Context and Phenomenology." Amerasia Journal 23, no. 1: 130–148.
Zhou, M. 1999. "Coming of Age: The Current Situation of Asian American Children." Amerasia Journal 25, no. 1: 1–27.
B

Bar Mitzvah
When a Jewish boy reaches the age of thirteen, he is considered to be at the age of religious and legal maturity, able to take on the obligations and privileges of adulthood by becoming a bar mitzvah. The Hebrew term bar mitzvah translates to "man of duty" or "son of the commandment" and refers both to the boy and to the religious ceremony. Therefore, a bar mitzvah boy celebrates his bar mitzvah. By taking part in a bar mitzvah ceremony, a boy demonstrates that he is ready to take responsibility for his own actions. He is joining the adult community and may now participate as an equal in religious services: he may read from the Torah scroll, use Tefillin (phylacteries) in prayer, be counted as part of a minyan (the ten needed to hold congregational prayer services), and, in some traditions, wear a tallit (prayer shawl) for the first time (in some traditions a boy may wear a tallit before bar mitzvah). The bar mitzvah is a celebration of both separation and continuity. Although the boy is leaving his parents' authority and his childhood, at the same time, he is taking part in the ceremony that demonstrates his adherence to the continuity of Jewish tradition. There are four parts to a bar mitzvah. During the religious service, the boy recites blessings over the Torah (the first five books of the Hebrew Bible) and in many congregations reads from the weekly Torah portion or Haftarah (the Prophets) portion or both. The boy's parents (in some congregations, only the father) also recite blessings over the Torah and then recite a special blessing: "Blessed is He [some congregations change the gender or make it gender neutral by substituting She, Lord, or God for He] who has freed me from responsibility for this child's conduct." This prayer, found in a midrash on Genesis (Genesis Rabah 63:10), demonstrates that the parents are no longer responsible for the child's actions. After the Torah reading and blessings have been concluded, most boys make a speech, usually commenting on the Torah portion they just read or a point of Jewish law and acknowledging the help of their family and teachers. Finally, the religious service is usually followed by a festive meal. Although historically the boy's family held only a small celebration after the bar mitzvah, today in the United States family and friends gather for a meal and party to honor the boy's achievement and the family's pride in having a son reach adulthood. Thus, a bar mitzvah is a signpost on the road to Jewish maturity, and the celebration is a happy acceptance of the boy's newfound responsibilities.
Jewish boys preparing for bar mitzvah. (Shirley Zeiberg)
A bar mitzvah can take place only on a day that the Torah is read in a public worship service. These days include Monday, Thursday, Saturday, holidays, and the first day of the new Hebrew (lunar) month during the morning service, as well as Shabbat (Saturday) or holidays during the afternoon service. The Torah is also read on fast days; however, because these are solemn days, it is not appropriate for a bar mitzvah to take place on these days. Today, the majority of bar mitzvah ceremonies are held on Saturday, the Sabbath. A boy’s bar mitzvah
date is selected based on his birthday. Traditionally, the date is the first Torah service after the boy turns thirteen years and one day. The origins of the bar mitzvah ceremony are uncertain, for unlike other ceremonies and holidays, it is not mentioned in the Hebrew Bible. It is thought that the bar mitzvah ceremony was introduced six centuries ago; however, since the early centuries of the common era a boy of thirteen has assumed adult obligations. The age is significant because, according to the Talmud (Avot 5:21), "at age thirteen one becomes subject to the commandments." Talmudic sources and their commentaries state that thirteen is the age of physical maturity for boys, the age at which they are able to control their own lives. When a boy becomes thirteen and a day, his vows are considered valid (Mishnah Niddah 5:6), and he can legally buy and sell property. A boy of thirteen is required to fast on Yom Kippur (Ketubbot 50a), something not required of children. According to the Midrash, in the biblical period, the age of thirteen was the turning point in the lives of Abraham, Jacob, and Esau. Abraham is thought to have abandoned idol worship at thirteen, and Jacob and Esau left their father and followed different paths at that age. Talmudic literature suggests that during the Second Temple period (538 B.C.E.–70 C.E.) a child of twelve or thirteen was blessed by the sages after the fulfillment of his first fast day. Although bar mitzvah ceremonies are similar in Jewish communities all over the world, there is some local variety. In some parts of Europe starting in the eighteenth century, a poem was published in honor of the bar mitzvah boy. In certain Sephardic congregations, the bar mitzvah was held during the afternoon service on
the Sabbath, and the boy was led to the reading desk by sponsors. In other parts of the Jewish world, the bar mitzvah was a multiday ceremony, with services being held at the boy's home and at the synagogue and separate Tefillin ceremonies. As early as the seventeenth century, boys celebrating their bar mitzvah in Germany were often required to conduct a major part of the Shabbat service. Therefore, the ceremony took on a greater importance, and boys were trained for their performance. These more extensive celebrations usually took place on the first Saturday following the boy's thirteenth birthday. With the birth of the Reform Judaism movement in nineteenth-century Germany, a confirmation ceremony was introduced, which often replaced the bar mitzvah. However, in eastern Europe, bar mitzvah ceremonies were less important: the boy was simply called to read from the Torah for his bar mitzvah on the Monday or Thursday following his thirteenth birthday. In many small towns, a boy became a bar mitzvah in a community congregation to which all Jews in the region belonged. It was a public ritual as the boy moved from childhood to join the wider adult community. In the United States, unlike in most eastern European communities, religious affiliation was voluntary and religious practice more varied. In cities where there were few Jews, bar mitzvah ceremonies would often be held on Jewish holidays when it was more likely that more people would be present. During the California Gold Rush, one such bar mitzvah took place on the Jewish New Year, with two local boys taking on their religious obligations on the same day. Since the early part of the twentieth century, the bar mitzvah ceremony and
celebratory party have grown in importance in North America. The ceremony and accompanying festivities have changed, allowing for innovation and modification. From simple synagogue ceremonies followed by a family meal, the bar mitzvah grew to a well-orchestrated religious ceremony followed by a large and, at times, ostentatious party that often rivaled a wedding. At times, the bar mitzvah party became more important than the synagogue ritual. These parties developed their own rituals with candle lighting, large fancy dinners, and a bar mitzvah cake. In the 1920s, these new traditions sought to balance a family's Jewish and American identity through the religious ceremony and an American-style party. Although the ceremony itself has maintained the same basic structure, with boys demonstrating their Jewish learning by reading the Torah and presenting commentary, over the years the content of boys' addresses and their levels of preparation have often changed. Around 1900 a number of books of bar mitzvah speeches were published, the first of which sold 10,000 copies. At times, boys recited these speeches word for word, memorizing both language and content. The last of this type of book was published in 1954, as education became a more important part of bar mitzvah preparation than memorization. Synagogues began to require membership and a number of years of schooling before a boy's bar mitzvah. After World War II, influenced by the baby boom and the movement to suburbia, children attended classes in newly built classrooms after their secular school day and on Sundays. This new requirement not only educated the boy but forced the family to stay affiliated with a synagogue until their children reached thirteen.
A bar mitzvah ceremony photographed by Jack Delano (Library of Congress)
With these changes, the bar mitzvah became more than a rite of passage: it became the culmination of several years of schooling and often the mastery of a standard curriculum. In 1946, Conservative Judaism first set national standards for bar mitzvahs. Three years of study at a congregational school were required, which included attending classes three times a week for a total of six hours. Although Reform Judaism also set educational requirements, it sought to de-emphasize bar mitzvah, instead adopting confirmation ceremonies that kept students in religious education into their high school years. During the 1950s and 1960s, equality for women became an issue, and consequently, more congregations and families chose to have bat mitzvah ceremonies
for girls. In 1922, Judith Kaplan, the daughter of Rabbi Mordecai Kaplan, became the first girl in the United States to have a bat mitzvah ceremony. The bat mitzvah provided an avenue for girls' education and demonstrated that women could fulfill the same religious obligations as men. Since the latter part of the nineteenth century, Reform Judaism had sanctioned girls' and boys' participation in confirmation ceremonies. Orthodox Judaism, however, with its separation of the genders, prescribed a significantly different coming-of-age ceremony for girls than the traditional boy's bar mitzvah. The Conservative movement that gained strength during the interwar years became the birthplace of the American bat mitzvah, and by 1948 approximately one-third
of Conservative congregations held some form of bat mitzvah ceremonies. By the 1960s, the bat mitzvah was widely performed in most American Conservative congregations and by the end of the twentieth century in many modern Orthodox and most Reform congregations, where both the bar and bat mitzvah had become integral parts of family life. As the twenty-first century began, the bar mitzvah ceremony in the United States was more celebrated than ever and was taking on new forms of expression. With the popularization of air travel and the 1948 birth of the state of Israel, it has become popular for children to have their bar mitzvah ceremonies in Israel, either at the Western Wall in Jerusalem or at the top of Masada near the Dead Sea. For these occasions, whole families and, at times, friends and relatives travel to Israel to give special meaning to this milestone in a child's life. Other families choose to hold their bar mitzvah ceremonies in places of religious significance, including sites of historical synagogues in Germany or the Caribbean, or in resorts or camps to guarantee the privacy of the ceremonies. Also, adults who did not have a bar mitzvah at age thirteen are now choosing to have adult bar mitzvah ceremonies. These ceremonies, often conducted in groups or classes, symbolize the importance of the bar mitzvah and bat mitzvah as part of the continuity of Jewish life.

Ava F. Kahn

References and further reading
Greenberg, Blu. 1985. How to Run a Traditional Jewish Household. New York: Simon and Schuster.
Hyman, Paula. 1990. "The Introduction of Bat Mitzvah in Conservative Judaism in Postwar America." YIVO Annual 19: 133–146.
Joselit, Jenna Weissman. 1994. The Wonders of America: Reinventing Jewish Culture, 1880–1950. New York: Hill and Wang.
Schoenfeld, Stuart. 1988. "Folk Judaism, Elite Judaism and the Role of the Bar Mitzvah in the Development of the Synagogue and Jewish School in America." Contemporary Jewry 9, no. 1: 85.
Baseball

In fact as well as legend, boys have enjoyed a special relationship with baseball since before the rules of the sport were first codified in 1845. Boys have played baseball, its predecessors, and its many variations with youthful enthusiasm; they have cheered and imitated their favorite players; they have collected and treasured artifacts and souvenirs; and they have dreamed of playing the game professionally themselves. Moreover, boys have carried these habits into adulthood and transmitted their passion to the next generation. In 1905, as the business of professional baseball was enjoying a period of prosperity, sporting goods manufacturer and former major league pitcher Albert G. Spalding and sportswriter Henry Chadwick agreed to settle their dispute over the sport's national origins by turning the question over to a blue-ribbon commission chaired by Abraham G. Mills, former president of the National League. The commission's report (1907), written without much regard for historical evidence, placed boys at the center of what might be described as baseball's creation story. It concluded first that baseball was purely American by birth, as Spalding had contended, and owed nothing to the English game of rounders, contrary to Chadwick's claim.
Boy up at bat (Skjold Photographs)
Second, the commission concluded that the game had been invented during the summer of 1839 in Cooperstown, New York, by Abner Doubleday, who had outlined its basic particulars to a group of boys using a stick in the dirt. The Doubleday myth has since been thoroughly discredited, but the intimate liaison between boys and baseball endures. Primitive or folkloric bat-and-ball games, lacking written rules or formal organization, existed for centuries and were played by both adults and children. Indeed, throwing a round object and attempting to hit it with a straight or crooked limb seems to be a natural leisure activity ancestral to such diverse games as tennis, golf, and baseball. In fourteenth-century England, young men
and women played stoolball, a game involving a pitcher who threw a ball toward an upended, three-legged stool and a batter who used his or her bare hand or a stick to hit the ball away. Eventually, stoolball evolved into a two-stool game, a forerunner of cricket, and then into games played with three or more stools, which became known as bases. English colonists brought folk games to the New World, and they were played by people of all ages in several varieties, including "barn ball," "old cat" or "one old cat," and "town ball." In fact, a number of children's instructional books published on both sides of the Atlantic in the eighteenth and nineteenth centuries described such games and often added a moral lesson as an incentive for playing them. Although the name "base-ball" was occasionally applied to some of these games, by 1830 "rounders" was probably the most popular variety in both England and North America. Rounders was played by two teams on a field with four stones or posts arranged in a diamond-shaped pattern. The "out" team defended the field, and the "pecker" or "feeder" gently threw the ball toward the "striker," a member of the "in" team. If the striker hit the ball, he ran clockwise around the posts as far as he could. A striker could be retired four ways: by swinging and missing three times, by hitting a ball behind his position, by hitting a ball caught by a defender, or by being hit by a thrown ball while he was rounding the bases. The "ins" remained at bat until each of them was put "out." In a 1947 book, librarian Robert W. Henderson proved the connection between rounders and baseball. He compared the nearly identical texts of two boys' books, one published in London in 1829 describing rounders and another in
Boston in 1834 on an early version of baseball. Henderson thus debunked the Doubleday myth but reasserted the key role boys played in propagating both games. Soon after the New York Knickerbockers club first wrote out the rules for their version of baseball, boys adopted the new standards. Baseball historian Harold Seymour wrote that as the number of adult clubs proliferated in the 1850s, most of them fielded junior "nines" as well, and that in 1860, junior teams formed a national association. Within a decade, though, baseball played by adults at the highest level of proficiency became a professional game, and boys' relationship to it grew more serious. On the one hand, the baseball business sought to cultivate boys' interest in the game to ensure that they would one day become paying customers. On the other, boys realized that playing baseball could, for a select few, become an occupation that paid rather well. Moreover, as the professional game made subtle adaptations to appeal to the widest possible audience, baseball's proponents defined it as a symbol of what it meant to grow up American and extolled its virtues as an activity that promoted physical fitness and moral rectitude. Without disregarding the plight of millions of children, both urban and rural, who were forced to work long hours and had little time for recreation, it is fair to say that baseball had become a primary form of recreation for American boys by the start of the twentieth century. Not every boy played it, certainly, but nearly all were familiar with its rudiments. Sandlot baseball, that is, games with flexible rules played by boys without adult supervision, flourished throughout the country. Boys could transform almost any open space—a pasture, park, playground,
schoolyard, or even city street—into a ball field with a little imagination. They needed only a minimal amount of equipment to play. Those boys without genuine bats, balls, and gloves improvised, using almost any substantial piece of lumber for a bat; fashioning balls from combinations of string, socks, tape, and other materials; and catching with their bare hands when no gloves or their surrogates were present. In addition, boys adjusted the rules of the game to fit the space available and the number of players at hand. Sometimes they engaged in derivative games like stickball or punch ball or kickball. Many American men, recalling their youth, insisted that they played this way nearly every summer day. Simultaneously, boys became great fans of professional baseball and its surrounding culture, even if they lived far away from a major league ballpark. Baseball cards first appeared in the 1880s, and even though they were distributed in packages of cigarettes, it was boys who prized them. Only a small percentage of boys ever got to attend a professional game, of course, but those that did generally remembered the experience for a long time and often retained their scorecard to reinforce the memory. On every game day, boys congregated outside ballparks, straining for a glimpse of their favorite players and hoping to cadge a spare ticket or secure an autograph. At home, boys could read about their heroes in the sports pages of the local newspaper or in The Sporting News, a weekly first published in 1886 and once known as the "Bible of baseball." They could pore over statistical guides with the greatest concentration and immerse themselves in baseball fiction aimed at youth, a genre that dates from William Everett's 1868
novel, Changing Base. They could fall asleep dreaming of the major league career that might one day be theirs. Parents sometimes objected to their sons’ enthusiasm for baseball, but gradually adults came to recognize the sport’s wholesome aspects. Certainly, fandom became easier for boys when players considered more respectable by adults, such as the well-groomed collegian Christy Mathewson, joined the professional ranks. Additionally, new immigrants to the United States understood that playing baseball could be an important part of their children’s acculturation process. Public and private schools and military academies allowed students to form baseball teams well before 1900, and advocates of reserving land for city parks eventually incorporated ball fields into their plans. The Young Men’s Christian Association (YMCA) espoused a “muscular Christianity” that embraced baseball, and the settlement house movement also used baseball as a socializing activity. During the Progressive era in the 1910s, adults began to supervise boys’ baseball more closely amid concerns that sandlot and scholastic ball were scenes of petty violence and disorganization. In New York City, for example, physical educators established the Public Schools Athletic League in 1903 for both boys and girls, and other cities did likewise. Urban Catholics founded Catholic Youth Organizations that encouraged boys to play sports, including baseball, under adult supervision. In 1925, the American Legion established a national youth baseball program that, by 1999, included more than 5,000 teams. This trend, furthered by governments and various private organizations throughout the Depression and New Deal eras, reached its apotheosis in 1939 with the establishment of Little League Baseball.
This program was founded in Williamsport, Pennsylvania, by Carl Stotz, a man who observed his nephews being pushed off a baseball field by older boys and set out to right this wrong. Little League, designed for boys aged twelve and under, grew remarkably in the years after World War II. The Little League World Series was first televised in 1953 and first won by a non-U.S. team in 1958. Other programs for other age groups, such as Pony Baseball, followed, and by 1999 Little League itself had expanded to ten divisions involving nearly 3 million youngsters. Professional baseball reveled in the affection boys held for it and cultivated their loyalty. Several clubs established "knothole gangs," membership in which included free admission to the ballpark. Later, clubs bestowed the same privilege on students earning good grades. Individual players often had fan clubs established in their names, with members receiving photographs and other inexpensive memorabilia. Baseball also promoted Babe Ruth as the quintessential "bad kid" who learned how to play baseball under Brother Matthias at St. Mary's Industrial School for Boys in Baltimore, Maryland. Ruth used his baseball-playing ability to right his life. As clubs began to broadcast their games, first on radio and later on television, boys formed an ardent audience for both media. In the 1950s, the baby boom generation embraced baseball with unusual vigor. The baseball card industry prospered; baseball fiction, including several series that stretched into many volumes, brought boys to books; and board games such as "APBA Baseball," "Ethan Allen's All-Star Baseball," and "Strat-O-Matic Baseball" enthralled boys for many hours. The lyricism suggested by these images and events persisted through the end of
the twentieth century. In 1985, for example, poet Donald Hall published a book of essays on sport evocatively titled Fathers Playing Catch with Sons, and in 1989, the motion picture Field of Dreams, based on a novel by W. P. Kinsella, gave grownup boys in its audience a chance to revisit their youth vicariously and repair, if necessary, relationships with their own fathers. Well before the end of the twentieth century, though, nearly all the spontaneity once associated with boys playing baseball—or any sport, for that matter—on their own had disappeared. Adults who remembered whiling away the summer days of their youth playing ball with their friends now saw children playing only organized ball in programs governed strictly by adults. Parents glimpsed the lure of college scholarships and the possibility, however remote, that their sons could one day earn millions by playing in the major leagues, and they decided that playing baseball had become too important an activity to let children do it themselves. When girls first petitioned organized programs for equal opportunities to play, it was usually adults who objected. Boys, for their part, forgot how to play on their own, relying instead on parents and coaches to transport them to practices and games. Coaches who stressed competition at too early an age or who managed their charges with too much intensity sucked a good deal of the joy out of the sport for many boys, who often gave it up prematurely. Adult supervision had an additional drawback, too. It tended to create an atmosphere so tense that law enforcement officers sometimes had to intercede in fracases involving coaches, umpires, and parents.

Steven P. Gietschier
See also Baseball Cards; Games; Muscular Christianity; Young Men's Christian Association

References and further reading
Henderson, Robert W. 1947. Ball, Bat and Bishop: The Origins of Ball Games. New York: Rockport Press.
Seymour, Harold. 1960. Baseball: The Early Years. New York: Oxford University Press.
———. 1990. Baseball: The People's Game. New York: Oxford University Press.
www.legion.org/baseball/history.htm
www.littleleague.org/history/index.htm
Baseball Cards

During the 1980s and early 1990s, an ever-increasing number of adult males in the United States began to gather in motel conference rooms, school gymnasiums, and shopping mall corridors each week to buy, sell, and trade baseball cards. As they did so, these men also circulated among themselves a highly sentimental nostalgia for boyhood, one that they associated with these little cardboard objects. Baseball cards are one of the common popular icons associated with boyhood in the United States. They recall images of tranquility and innocence associated with boys trading cards on school playgrounds in a 1950s-style middle-class suburbia. In fact, baseball cards evoke such powerful nostalgic emotions that it is almost easy to forget that they have not always been a significant part of boyhood in the United States. The first baseball cards to be sold commercially were actually marketed to adults, not children. In the early 1880s, tobacco companies, able to produce and market a high volume of cigarettes with new factory machinery, began using a variety of advertising gimmicks to expand
their markets. As one of their first schemes, companies inserted small picture cards, measuring about 1.75 inches by 2.75 inches, into packages of cigarettes. Such cards featured a wide variety of popular images, including vaudeville actors and actresses, war heroes, and athletes. Allen and Ginter Tobacco Company and the Goodwin Tobacco Company were among the first to market their products using cards featuring baseball players. The players on these cards were presented as manly and serious—they never smiled at the camera, and they wore perfectly pressed uniforms, sometimes with a necktie. Such images provide a vivid contrast to the more playful and boyish images that would be marketed directly to children by the middle of the twentieth century. Although adults were the ones who purchased these objects with their cigarettes, some who were children during that time period remember that youths tended to collect and play with them. However, not until after World War I would companies package baseball cards with products such as candy, gum, caramels, or cookies, thereby marketing products directly to children. Like other emerging forms of commercial culture of the period (popular music, movies, and pulp fiction), baseball cards became an increasingly important part of children's lives during the twentieth century, a commercial intervention into preadolescent play in an era when child labor laws, industrial mechanization, and mandatory schooling all extended childhood and made play increasingly central to children's lives. During the 1930s the Goudey Gum Company first used baseball cards to market bubblegum to youths. Its cards
presented baseball as a wholesome, patriotic activity that at the same time embodied the "innocent" fun of boyish play. Some contained biographies on the backs, signed as if star players such as Lou Gehrig or Chuck Klein had written them. Others were photos of players' heads superimposed atop skinny cartoon bodies. They were sold with coupons that children would send back, allowing them to join a fan club or obtain baseball equipment. By the end of the decade, gum companies increasingly associated baseball cards with patriotic symbols, selling their product wrapped in red, white, and blue paper and evoking images of baseball as the "national pastime." After World War II, companies regularly produced and sold yearly sets of baseball cards to children for the first time. Beginning in 1948, the Bowman Corporation began printing and selling annual sets of cards with bubblegum each summer. In 1952 the Topps Corporation of Brooklyn, New York, challenged Bowman with a now famous 407-card set. By this time baseball card collecting had already become one of the most popular hobbies among boys in the United States. Topps was able to maneuver Bowman out of the baseball card market by signing players to exclusive contracts, eventually bought out Bowman, and maintained a monopoly over baseball card production and sales until the early 1980s. Unlike the cards children collected in the earlier part of the twentieth century, those produced after World War II were not primarily an advertising mechanism used to sell another product like candy or gum. Instead, they were sold as products in and of themselves.
A young boy admires some of his favorite baseball heroes on baseball cards put out by Topps Corporation, 1965. (Bettmann/Corbis)
What had once been a promotional giveaway had now become an elaborately crafted form of media entertainment. In a space smaller than a postcard, the cards contained not only photographs but also a wide range of information: a player's throwing, batting, and fielding positions; team; hometown; and the color of his hair and eyes. The new cards to emerge after World War II were part of the sports culture being promoted through the new medium of television. Sports provided easily packaged programming for this visual medium. What is more, because the government and the broadcasting industry in the United States had managed to organize television around commercial broadcasting, sporting events provided an easy forum through which to reach a male market segment, one that had long been difficult to reach effectively.
By the mid-1950s, a majority of households in the United States owned television sets, allowing audiences in the far reaches of the nation to watch major league baseball for the first time. Baseball cards produced during the 1950s reflect the increasingly sophisticated knowledge of major league baseball that children were developing during this era. With the monopoly that Topps enjoyed over the baseball card market, collecting baseball cards assumed a kind of predictability and stability between the mid-1950s and the early 1970s. Each year, the same company would produce cards and market them the same way. Topps would strategically market cards in sets, releasing only a small number at a time throughout the spring, summer, and fall. Children who collected were then strung along throughout the season as they attempted to complete an entire set. In this respect, young audiences learned through their baseball cards not only about baseball but also about the rules of engagement in a capitalist market. Card collecting did this on two levels. First, it coaxed preadolescents into behaving as young consumers, teaching them how to spend money strategically on bubblegum packs and how to gain pleasure from the act of buying. Second, the very act of collecting implicitly meant that youths had become part of a "make-believe" capitalist market involving baseball cards, in which boys attempted to build up surpluses in order to best maneuver themselves among their friends with whom they traded cards. Topps failed to keep its monopoly over baseball card sales in the early 1980s when it lost an antitrust battle with the Fleer Corporation of Philadelphia in the summer of 1980. The ruling allowed
Fleer and a third company, Donruss of Memphis, Tennessee, to produce baseball cards for the 1981 season. This new competition actually helped to fuel a renewed interest in baseball card collecting during the decade and to increase Topps's profits. By the end of the 1980s, two more companies, a conglomerate corporation known as Score and a new baseball card manufacturer called Upper Deck, also had licensing agreements to produce full sets of baseball cards. Upper Deck created a new kind of glossy, expensive, high-quality card that was printed on high-bond cardboard. Each one featured sharp color photos on the front and on the back, along with a hologram to "prevent" counterfeiting. By the end of the 1990s, this design had become the standard that the other companies followed. In the early 1970s, baseball card collecting began to undergo an important change. Adult males began to create formal organizations and events surrounding the hobby of baseball card collecting. They organized baseball card conventions, published baseball card collecting newsletters, and created local baseball card collecting clubs. As the adult hobby grew in popularity, cards became collectors' items that were sold for money. Until the early 1980s, adult collecting was a relatively small hobby. With the growth in the baseball card market combined with the speculative atmosphere of the 1980s, it grew extremely rapidly, becoming one of the most popular adult hobbies in the United States by the early 1990s. Nostalgia lay at the center of baseball card collecting. On the first level, collectors were nostalgic for the game as it was played in the past. Adult collectors often complained that contemporary stars were spoiled millionaires who showed no
loyalty to any single team, whereas players in the past were hardworking heroes. They tended to see the game as it had been played as a symbol of stability. Likewise, these collectors also expressed a second level of nostalgia, one that combined their personal memories with historical time. Adults often remember cards as tied to memories of boyhood before dating. They remember warm friendships between boys who would meet to trade and play games with their cards. Men who collect tend either to have ended their hobby when they took an interest in sex or to have continued to collect secretly “in the closet.” These personal memories of collecting also evoke broader historical memories of the past as a “simpler” or more innocent time. The particular past remembered in this way is a suburban one that defined the cultural ideals of the 1950s and 1960s. In actuality, the 1950s witnessed enormous change and social disruption in communities. The realities of life during this era counter nostalgic images of the 1950s, both on and off the baseball field. During that decade baseball faced a series of major crises as television almost devastated the minor leagues and coerced a number of major league teams to move to new cities. Moreover, suburban growth destroyed ethnic neighborhoods and, through a variety of discriminatory public policies, created newly segregated urban social patterns. The personal nostalgia for boyhood that collectors expressed also contained tensions. Many adults who collected cards did so to rekindle the kind of genuine human relationships they remembered having with one another as children. By associating baseball cards with preadolescent boyhood, however, they tended to
understand such relationships as all-male ones. By excluding women, the collecting hobby sometimes had the effect of distancing men from their female partners, thereby undermining an important human relationship. Furthermore, the competitive nature of the contemporary hobby has also caused tensions among adult collectors, many of whom have grown distrustful of one another. Today, baseball card collecting is still a popular hobby among children, and as in the past, one that boys engage in more than do girls. Their collecting practices reflect the influence of the adult hobby. For example, children pay close attention to the condition of their cards by storing them in protective plastic binders—a direct outgrowth of a collectors’ market that places a premium on cards in “mint” condition. Ironically, many adult collectors express irritation with boys who are interested in the speculative market that surrounds baseball card collecting, even though such a market is the outgrowth of adult interest in the hobby. The collecting habits of contemporary boys have indeed changed. Yet the problem that collectors often have with this is not really a statement about the behavior of contemporary youth. Rather, it is a reflection of the unstable character of the nostalgia among collectors, a nostalgia for innocent boyhood relationships that never existed and are therefore impossible to replicate.
John Bloom
See also Baseball
References and further reading
Bloom, John. 1997. A House of Cards: Baseball Card Collecting and Popular Culture. Minneapolis: University of Minnesota Press.
Boyd, Brendan, and Frederick Harris. 1973. The Great American Baseball Card Flipping, Trading, and Bubble Gum Book. New York: Warner Paperbacks.
Lemke, Bob. 1997. Standard Catalog of Baseball Cards. Iola, WI: Krause Publications.
Nardinelli, Clark, and Curtis Simon. 1990. “Consumer Racial Discrimination in the Market for Memorabilia: The Case of Baseball.” Quarterly Journal of Economics (August): 575–596.
Basketball
Unlike many other sports, basketball has a clear history. In December 1891, James Naismith, a physical education instructor at the International Young Men’s Christian Association Training School (now Springfield College) in Springfield, Massachusetts, created the game known as basketball. Generally, it is played indoors on a 94-foot court, with two opposing teams of five players each attempting to propel (or shoot) a ball of approximately 30 inches in circumference through a goal that is 10 feet above the playing surface. Naismith responded to a problem that had arisen in the college regarding the requirement that boys have an hour each day of an organized activity. In the fall and spring, outdoor activities were possible, but the New England winters precluded year-round activity, and the boys often spent their winter activity times marching or doing calisthenics, which neither they nor their instructors particularly enjoyed. Luther Gulick, the head of physical education at the college, assigned Naismith, a thirty-year-old native of Almonte, Ontario, to find a solution to this problem. Naismith tried unsuccessfully to modify outdoor games and adapt them to indoor conditions. He finally set about to invent a new game, one that
would use a large enough ball not to require a bat or a stick to propel it. He envisioned the ball being thrown at elevated boxes to score a goal, but when he sought such boxes from the school janitor he was disappointed. Instead, the janitor offered Naismith two peach baskets and installed them 10 feet off the ground, attaching them to the balcony that encircled the gym. Naismith wrote out a set of thirteen rules, with the number of players on a side being variable. Because the recalcitrant gym class had eighteen members, the first game of basketball had nine on a side. The rules, written and published in the school paper in January 1892, included the following:
• The ball may be thrown or batted (but not with a fist) in any direction, but may not be carried while running.
• The ball must be held in or between the hands.
• If a player holds, pushes, trips, or strikes the person of an opponent, a foul will be called.
• Three consecutive fouls count as a goal for opponents.
• A goal shall be made when the ball is thrown or batted into the basket and stays there, provided the basket is not touched.
• When the ball goes out of bounds, a member of the team that did not touch it last shall throw the ball back in. The ball must be thrown in within three seconds, or the ball is awarded to opponents to throw in.
• The umpire shall judge fouls; the referee shall decide out-of-bounds disputes, whether goals count, and any other controversies on the court.
• Two fifteen-minute halves will be held with five minutes of rest in between.
• Victory goes to the team with the most goals, but in the event of a tie, captains may agree to play until a tie-breaking goal is made.
Naismith’s first team went on an exhibition tour in 1892 to Albany, Troy, and Schenectady, New York, as well as Providence and Newport, Rhode Island. In February 1892, two Young Men’s Christian Association (YMCA) branches in Springfield played to a 2–2 tie. The next month two girls’ teams competed, and later that year Amos Alonzo Stagg, a faculty member at Springfield College, took a position at the University of Chicago and introduced basketball to that institution. Basketball spread rapidly in three ways. First, boys going home for Christmas vacation from Naismith’s YMCA Training School in Springfield, Massachusetts, carried the news of the game and started playing basketball at their local YMCAs. Second, the school distributed its paper, The Triangle, which carried the rules of basketball, to YMCAs throughout the United States and thus helped disseminate the sport. Third, many city settlement houses, which often had close ties to their YMCAs, quickly adopted the game. In April 1892 basketball was introduced at the new athletic facilities of the YMCA in New York City. Young Men’s Hebrew Associations (YMHAs), which had begun in the 1850s, also began equipping their gymnasiums with basketball courts soon after 1900. In 1893 many YMCAs formed leagues and conducted local championships, and in April 1896 in Brooklyn, there was a “championship for America” YMCA tournament.
As basketball spread among YMCAs and settlement houses, it was played largely by lower-middle-class and lower-class young people, typically first- or second-generation immigrant boys. Basketball’s creation coincided with the beginning of the largest influx of new immigrants to the United States ever, and many of them saw the sport as an opportunity to demonstrate their assimilation to things American. In March 1893, Vanderbilt University became the first college to field a team, followed closely by Hamline College in St. Paul. As early as 1898, professional leagues were formed, the first ones being around the Philadelphia–Trenton, New Jersey, region. By 1906, professional leagues also existed in New England (both eastern and western Massachusetts had leagues) and western Pennsylvania. More colleges began to establish teams, mainly in the Northeast, although by 1908 the number of teams in the Midwest was growing. In the 1920s, clear differences in rules between the professional and college games emerged, as rules became more codified. Nevertheless, until the 1930s and even later in some areas, court size was variable, but it did not exceed 4,000 square feet. College courts were generally 50 feet by 70 feet, whereas the courts used by professionals were usually 35 feet by 65 feet. In the 1920s, professional teams played inside a metal screen cage at least 11 feet in height. The Wachter brothers of Troy, New York, first developed and used the cage in the 1910s in the New York State League. The cage sped up play because the ball was “live” off the cage and could almost never go out of bounds. Players could also use the cage to enhance their leaping, and the cage served to protect the players and referees from rowdy fans. The cage was
standard for all professional contests until 1925–1926, when the newly formed American Basketball League, with nine teams stretching from Boston to Chicago, abandoned the use of the cage. Despite that, the term “cagers” to describe basketball players stuck and is still commonly used today. In the 1920s, there were more than five leagues operating around New York City and Philadelphia, in Ohio, and in the Midwest. College and high school basketball steadily gained more interest and attendance during this decade. In New York, sportswriter and promoter Ned Irish began college basketball double- and tripleheaders, which proved to be exceedingly popular. The idea spread to Chicago, Philadelphia, and other cities and remained a common activity into the 1960s. In that forty-year period, players began modifying earlier beliefs about certain aspects of the game. One of those concerned shooting, which almost every player did with two hands and feet planted on the floor. In the 1930s, players began experimenting with one-handed shooting, using the off hand only to balance the ball. Other players found that leaving the floor on their shots allowed them an advantage over floor-bound defenders. Hank Luisetti, a player at Stanford University, was an early popularizer of this “jump shot,” and his fame grew when he played in some basketball doubleheaders held in Madison Square Garden. In one game he scored fifty points, a virtually unheard-of accomplishment. In the 1940s, “Jumping” Joe Fulks of Murray State University and Max Zaslofsky of St. John’s University were sensational jump shooters in college before playing professionally into the early 1950s. Since the 1960s basketball has been the most popular of organized sports in number of participants in high school and number of spectators at all levels.
Basketball is a popular urban sport. (Archive Photos)
The explosive growth in fan interest began in earnest in the 1980s, in large part because of increased television coverage and the excitement surrounding certain players and rivalries. Today, though girls have now swelled the ranks of players at various levels, basketball still ranks as the most played and watched boys’ sport in the nation. One clear reason is that even small high schools can field a team because only five players are needed at any one time. Many states have created various levels based on school populations to try to allow greater equality in competition. Although many high schools may
not be able to field a number of sports teams because of size, space, or financial constraints, there are virtually no public high schools in the United States without a boys’ basketball team. Among American Indian or Native Alaskan populations, high school basketball is often one of the great unifiers in rural, widely dispersed communities. Basketball is also extremely popular among African Americans, many of whom today live in U.S. inner cities and find it easy to play the game with a few friends in a very limited space. Actually, African Americans played the game from its earliest years. African American boys idolized the New York Renaissance
teams in the 1920s, 1930s, and 1940s and the Harlem Globetrotters from the 1930s to today. After World War II, there was a veritable rush of African Americans seeking to play at all levels of basketball. When Crispus Attucks High School of Indianapolis, led by Oscar Robertson, triumphed in the 1956 Indiana State High School tournament, it was a watershed that proved that an all-black squad could compete and triumph in a venue acknowledged as one of the leaders in basketball competition in the nation. Today a new model has emerged as some outstanding African American high school players skip college and move directly into the professional ranks, as did Minnesota Timberwolves player Kevin Garnett. Although this trend inspires many African American boys, it is also disturbing because it downplays the importance of a college education in deference to a possible high-paying professional career. Basketball is popular around the world and has been a sport in the Olympic Games since 1936. Still facing the game are a number of issues that involve rule interpretations, equipment, and larger “philosophical” concerns. In regard to rules, major changes in scoring have taken place since the 1960s. The first involved outlawing the dunk shot in the 1967–1968 season out of fear that it would change the game for the worse and discourage youngsters from developing a varied shooting repertoire. Many critics saw it as the Chamberlain or Alcindor rule, a response designed to curtail the potential dominance of Wilt Chamberlain or Lew Alcindor (later known as Kareem Abdul-Jabbar). Yet the dunk was allowed once again in the 1976–1977 season. The concern about its tendency to discourage boys from developing other shots still remains, but the dunk has added to the entertainment value of basketball and probably will not be banned again. Another change in rules came with the introduction of the three-point shot, a rule that the Olympics and American Basketball League had popularized before the National Basketball Association (NBA, established in 1949), colleges, and high schools adopted it. The three-point shot has opened up the game and made defenses “packed into” the area near the basket less common. It also allows more drives to the basket. The three-point distance varies, which is a continuing concern. Currently, high schools and colleges have a 19-foot, 9-inch distance. The Olympics allow 21-foot shots, and the NBA sets the distance at 23 feet, 9 inches. A proposed scoring change has been around almost as long as the game itself, that is, the height of the basket. Proposals to raise the basket to 11 or 12 feet continue to be offered as a way to offset the dunk and return pure shooting to the game, but no proposal seems to have widespread support. There is also concern over the growing roughness of the game, although this concern is ironic since professional basketball players in the 1920s were commonly knocked unconscious in contests. Nevertheless, bigger, faster players cause more damage, and the acceptance of “hand checking” (the defender being allowed to use a forearm or open hand to impede a player’s progress) abetted the roughness. Hand checking is no longer allowed, but referees, particularly in professional games, are often tolerant of it, and because of extensive television coverage, young players then emulate the professionals. As for equipment, the leather ball is being “challenged” by balls made of new materials that manufacturers claim to be
“better than leather.” Whether the challengers will ultimately influence sales and usage remains to be seen. In addition, for years players wore shorts that extended only 12–18 inches below the waist. Then players at the University of Michigan popularized baggy shorts extending to the knee, and that longer fashion has remained with variances on bagginess and length. The “short shorts” have vanished from the game. A larger philosophical concern in recent years is which teams can and should receive television coverage and how they should profit from it. In addition to professional games, college and even many high school games are now televised. Who should control revenue from these games and what, if any, amount should be returned to the players is a continuing source of concern. Professionals, of course, are paid, and their union is part of the decision-making process regarding revenue sharing. Most college decisions on revenues are governed by the National Collegiate Athletic Association, and individual state high school associations usually have the largest say in the distribution of profits generated from televised high school games. Despite all these issues, basketball remains a relatively simple game, which continues to be the source of its widespread popularity.
Murry Nelson
See also Young Men’s Christian Association
References and further reading
Gault, Frank, and Claire Gault. 1977. The Harlem Globetrotters. New York: Walker.
Hollander, Zander, ed. 1979. The Modern Encyclopedia of Basketball. New York: Doubleday.
Levine, Peter. 1992. Ellis Island to Ebbets Field: Sport and the American Jewish Experience. New York: Oxford University Press.
Neft, David, Richard Johnson, Richard Cohen, and Jordan Deutsch. 1976. The Sports Encyclopedia Basketball. New York: Grosset and Dunlap.
Nelson, Murry. 1999. The Originals: The New York Celtics Invent Modern Basketball. Bowling Green, OH: Bowling Green University Popular Press.
Peterson, Robert. 1990. Cages to Jump Shots: Pro Basketball’s Early Years. New York: Oxford University Press.
Roberts, Randy. 1999. But They Can’t Beat Us: Oscar Robertson’s Crispus Attucks Tigers. Indianapolis: Indiana Historical Society.
Telander, Rick. 1976. Heaven Is a Playground. New York: St. Martin’s Press.
Bicycles
In the twentieth century the bicycle came virtually to define boyhood. Learning to ride a bicycle served as a rite of passage from childhood to youth and physically liberated boys from their neighborhoods. Bicycle riding by boys began with the first models, growing slowly but steadily from the 1890s until the 1930s when, despite the Depression, every boy expected to own a bike. Bicycles taught boys basic mechanics and involved them in social reform, such as the good roads movement. Since the 1930s, boys’ bicycles have evolved from streamlined but heavy-framed, single-gear, elaborately painted machines to a lightweight, multigeared, almost undecorated apparatus and then to a combination of rugged construction and complex gearing with customized accessories. Despite the technological and stylistic changes, the functions of boys’ bicycles remain the same—to provide a sense of independence; to increase opportunities for competition and socialization; and to initiate the bike owner into the system of
A young boy proudly displays the bicycle that he bought with his own earnings, 1930s. (Library of Congress)
consumption, maintenance, and planned obsolescence of material possessions. The first “ordinary” bicycle, which had a front wheel as much as 5 feet in diameter, was too large and cumbersome for most young boys to ride. For them, manufacturers provided an assortment of velocipedes or tricycles with front wheel sizes ranging from 16 to 28 inches. By the
end of the 1880s, the ordinary bicycle had been all but replaced by “safety” bicycles with wheels of equal size, usually 30 inches. These bicycles employed a chain mechanism attached to the pedals to turn the rear wheel and hand brakes to stop the vehicle. In appearance, the bicycle of the 1890s was similar to today’s and could be equipped with the same variety
of accessories—bells, baskets, lights, speedometers, odometers, and luggage racks. Serious bicyclists also required special caps, uniforms, and badges to identify them as members of bicycle clubs. With the development of the automobile in the early twentieth century, the bicycle boom among adults ended. Most Americans preferred cars. Boys and girls inherited bicycles from their older siblings and parents, but social norms placed restrictions on girls’ freedom of movement. Boys, however, were encouraged to roam on their bikes in search of adventure. Books such as The Ready Rangers: A Story of Boys, Boats, and Bicycles, Fire-Buckets and Fun by Kirk Munroe depicted bikes as essential to a boy’s life. Parents who claimed that bicycles were merely expensive toys were ridiculed in the boys’ adventure novels, opening a crack that would later be described as the generation gap. But is a bicycle a kind of toy? For children, the answer is a qualified “yes.” When a survey in 1896 asked 701 boys aged seven to sixteen in Worcester, Massachusetts, to name their favorite toys, they placed bikes sixth, after tops, balls, marbles, express wagons, and footballs. After age twelve, bikes ranked higher than wagons. As an activity, boys ranked biking lower than ball games, marbles, tag, checkers, hide-and-seek, and cards, but such lists are imprecise because all children’s outdoor play is dependent on the season, peer groups, and economics. One twelve-year-old Worcester boy said he liked his bicycle because it made him strong and he could do tricks on it, which points to the close links between athleticism and cycling. From the 1890s to the present, bicycles must be seen in the context of other wheeled apparatus used by boys to move
about quickly and sometimes carelessly. Speed is intoxicating for boys, and the progression from wagons and tricycles to scooters and sidewalk bicycles to roller skates and soapbox racers and finally to full-size bikes is one of increasing velocity and vertigo. Risk and bravado are a boy cyclist’s constant companions. In the Stephen Crane short story “Showin’ Off,” a boy is teased into riding his velocipede down a steep embankment with painful results, but few boys can resist a dare. Boys often test their nerve with their bikes. For some, bicycle ownership was a mark of wealth. Boys who rode Hawthornes, an inexpensive brand sold at the Montgomery Ward stores, could only envy the boys on Schwinns. Poorer boys acquired their bikes by “borrowing” one for an unspecified time or by outright theft. Boys learned hard lessons about real life in the lively market in used bikes and parts. The popular writer William Saroyan, who was a boy in Fresno, California, just after World War I, recalled that his bikes were rebuilt secondhand machines that were always breaking down but that bike riding taught him about “style, speed, grace, purpose, value, form, integrity, health, humor, music, breathing, and finally and perhaps best of all the relationship between the beginning and the end” (12). Post–World War II bikes reflected the growing American obsession with the automobile. Schwinn’s 1949 model, the Black Phantom, had carlike chrome on fenders, handlebars, and accessories. A few years later the Huffy Company introduced a “Radiobike” that placed the battery-powered set in the frame of the bicycle. Colorful rubber grips with bright plastic streamers were added to handlebars. Luggage racks were located above
the rear fender, and large wire baskets attached above the front wheel. Boys could deliver newspapers and groceries from these sturdy but heavy bicycles. To increase the volume of sales, manufacturers introduced “sidewalk” bikes with smaller tires and frames for children younger than seven. In 1952 Huffy advertised a bike with training wheels for children as young as four years of age. In 1959 a survey of more than 1,000 boys in northwestern Ohio revealed that bicycle riding was almost as popular as football, the number one pastime. Even before the end of the 1950s, however, a revolution had begun taking place in bicycle design and technology. Lightweight bikes from England with three to eight gears, hand brakes, and thin, high-pressure tires became the choice of serious cyclists. Schwinn introduced the “Varsity” in 1960, and for a few years the “racing” bike dominated the market. The narrow seat was uncomfortable, however, the gears needed constant maintenance, and the narrow tire rims were easily bent by curbs and rough city streets. By the mid-1960s many of the old balloon-tire bikes were being customized. Boys in California began riding a new style of machine they called “Krates.” A typical Krate had 20-inch wheels, perhaps taken from a young child’s sidewalk bike, a large “polo” or “banana” seat, and “ape-hanger” handlebars that rose up to the eye level of the seated rider. In the 1970s another revolution in design and riding style occurred. Young men and boys customized their bicycles to resemble off-road motorcycles. Bicycle motocross (BMX) competitions among boys as young as twelve years old were organized. Manufacturers quickly responded to the demand with “mountain bikes,” some costing up to $5,000 in the late
1990s. These were not meant for boys, of course, but less expensive models were available, and toy stores sold “sidewalk” copies in 10-, 13-, and 16-inch wheel sizes, complete with the knobby tires of the racing models. In addition to off-road biking, an increasing number of boys have begun competing in trick-riding contests, doing complicated aerial maneuvers off ramps and over obstacle courses. Helmets and kneepads have become required accessories for biking as lights, bells, and fenders have disappeared. In whatever model, the bicycle remains an essential part of the lives of boys because it serves both utilitarian and psychological needs. Commuting to school, exploring a neighborhood, and going on excursions through parks are all necessary passages of a boy’s life enhanced by a bicycle, as are fantasies of athletic glory and dreams of the open road. Little wonder that the painter Andrew Wyeth titled his 1950 painting of a boy riding his bike past a featureless open field “Young America.” The painting symbolizes the freedom of the open road leading to future adventures beyond the horizon. Boyhood and bicycling are synonymous.
Bernard Mergen
See also Cars; Toys
References and further reading
Munroe, Kirk. 1897. The Ready Rangers: A Story of Boys, Boats, and Bicycles, Fire-Buckets and Fun. Boston: Lothrop Publishing.
Pridmore, Jay. 1999. Classic American Bicycles. Osceola, WI: Motorbooks International.
Saroyan, William. 1952. The Bicycle Rider in Beverly Hills. New York: Scribner’s.
Smith, Robert A. 1972. A Social History of the Bicycle: Its Early Life and Times in America. New York: American Heritage Press.
Big Brothers
A social service organization that matches adult men who serve as mentors, companions, or “big brothers” to boys without adult male role models in their lives. The organization was formed in several different cities in the early part of the twentieth century. By the end of the century, Big Brothers served thousands of boys in towns across the United States. Between 1903 and 1910, two men independently initiated the first big brothers programs in Cincinnati and in New York City. In 1903 twenty-three-year-old Cincinnati businessman Irvin F. Westheimer looked out the window of his office building and saw a boy searching through a garbage can for food. Westheimer spoke to the boy, found that he was one of five children from a fatherless home, and subsequently took the boy out for a meal and met him frequently to provide him with companionship and support. Westheimer urged some of his friends in business to help other young, needy boys, and by 1910 the group had established a Big Brothers organization in Cincinnati. Within a year, the organization had 400 volunteers, and by 1916 it was assisting 203 little brothers. Westheimer’s agency served only Jewish boys, but Catholic and Protestant men in Cincinnati formed other Big Brothers agencies to serve boys of their respective faiths. Meanwhile, in New York City, Ernest K. Coulter, a clerk of the Children’s Court that tried juvenile cases, became alarmed by the number of boys who came before the court for minor offenses. In 1904 he spoke to the Men’s Club of the Central Presbyterian Church of New York and challenged its members to become big brothers to needy delinquent boys who lacked appropriate male role
models. Immediately, thirty-nine men volunteered to help Coulter. By 1907 a Big Brothers agency was functioning in New York City, and by 1910, when it was incorporated by the state, the agency had aided 1,000 little brothers. As in Cincinnati, the male volunteers in New York were professional men, although they came from various religious persuasions. The boys came to the agency largely through recommendations from the judges in Children’s Court. Big Brothers met their little brothers on a one-to-one basis at least two times a month. The Big Brothers concept proved popular with men throughout the country, and by 1916 there were programs sponsored by Jewish, Protestant, and Catholic groups of men as well as by churches and service organizations like Kiwanis in ninety-six cities. Most agencies served only white boys, although by 1914 the New York Big Brothers agency matched black big and little brothers, and by the 1920s Urban Leagues in various cities provided African American male role models for black boys. In the same years that concerned men were forming Big Brothers programs, religiously motivated women were founding comparable organizations for girls. Catholic women in New York established a Big Sisters program in 1902, and Protestant women followed suit in 1908. These organizations followed the same pattern as did Big Brothers agencies: they sent representatives to the New York Children’s Court and determined, along with court officials, which girls would benefit most from care from a Big Sister. The first national conference of Big Brothers and Big Sisters agencies was held in 1917, and subsequently the Big Brothers/Big Sisters Federation was founded and local groups were invited to
join it. By 1922 there were 106 local groups, and 50 had joined the federation. Forty-four of these agencies assisted 16,000 youngsters in a year. Most boys and girls served were between the ages of ten and sixteen. Big Brothers saw their little brothers an average of 3.6 times a month. In the 1920s Big Brothers began to use the slogan “No Man Stands So Tall as When He Stoops to Help a Boy” (Beiswinger 1985, 82, 96). In 1923 Paramount Pictures made a movie about Big Brothers, featuring a small boy named Midge Murray whose older brother was killed in a gang fight. Before he died the older brother asked his friend, Jimmy Donovan, to be a big brother to Midge. Jimmy did not take the task too seriously until he attended a church meeting and heard about the Big Brothers organization. Then he worked to be a real big brother to Midge. During the Depression, local Big Brothers organizations had economic problems and failed to pay their dues to the national federation. As a consequence, the Big Brothers/Big Sisters Federation dissolved in 1937. Nonetheless, the work of local Big Brothers groups continued. In 1940 a national study of twenty-eight Big Brothers agencies in twenty-one cities and twelve states revealed that Big Brothers agencies provided boys with a variety of services, including optical care, farm placement, psychiatric care, scholarships, school clothing and supplies, allowances, loans, and general relief. Even so, personal interaction between adult male volunteers (twenty-one to fifty years of age) and boys remained the heart of Big Brothers work. There were 3,615 such volunteers in 1939. One-third of the Big Brothers agencies served boys of all racial backgrounds, but most of the rest served only whites (Beiswinger 1985, 108–110).
After World War II, local Big Brothers organizations once again sought to connect with one another. In 1946 Big Brothers of America was incorporated in New York state. At that time, the organizers felt that work with boys and girls was so distinct that combining work for the two genders in one organization was inappropriate. In 1948 Norman Rockwell designed a logo for the organization, which showed a white man dressed in a suit and tie pointing with his right arm and enfolding in his left the shoulders of a white boy dressed neatly but casually in a striped shirt, short jacket, and pants. The logo indicates that the organization served largely whites and that volunteers were professional men and little brothers were clean, neatly attired youths. In the 1960s, C. Randolph Taylor became the first African American to serve on the national board of Big Brothers of America. As a child in Harlem, he had been the New York agency’s first black little brother. Taylor’s appointment to the board heralded a greater commitment to recruiting big brothers and serving little brothers of various racial backgrounds. In 1970, Taylor objected to a film made by Big Brothers of America called “A Friend for Joey” because it did not depict racial diversity. The board then ordered the film remade to Taylor’s satisfaction. The women’s movement of the 1970s awakened both Big Brothers and Big Sisters organizations to the necessity of working together to improve the lot of children without strong parental role models. Big Sisters International was formed in 1970 and merged with Big Brothers of America in 1977. The new symbol of the organization was an adult embracing a child. In 1992–1993, Public/Private Ventures, an organization directed at helping
youth, completed a study comparing ten- to sixteen-year-olds who applied to Big Sisters and Big Brothers programs. The youngsters were assigned randomly to two groups—one that received assignments to Big Brothers or Big Sisters and one that did not. After eighteen months, the two groups were compared, and the study found that boys who had Big Brothers were considerably less likely to begin using drugs or to hit people and more likely to feel able to complete their schoolwork and to feel better about their parents or guardians than boys in the control group. Big Brothers do not set out to prevent drug abuse or raise grades. However, having a responsible adult male mentor provide a close, personal, and supportive relationship with a fatherless boy continues to prove successful in improving the life chances of the boy. Sadly, although Big Brothers serve thousands of boys, by the end of the twentieth century, there were nowhere near enough volunteers to serve all of the boys who requested big brothers.
Priscilla Ferguson Clement
See also Clubs
References and further reading
Beiswinger, George L. 1985. One to One: The Story of the Big Brothers/Big Sisters Movement in America. Philadelphia: Big Brothers/Big Sisters of America.
Greif, Richard S. 1997. Big Impact: Big Brothers Making a Difference. Boston: New Hat.
Black Panther Party
See African American Boys
Bodies
Physical appearance and body image play an increasingly important role in the
lives of adolescent boys at the beginning of the twenty-first century. Compared to a generation ago, boys today exhibit higher rates of body dissatisfaction and weight preoccupation and express greater concern with their physical attractiveness and body shape. The role of body image may not be as significant for boys as it is for teenage girls, but more boys worry about improving their physical appearance now than in earlier eras. One factor responsible for this trend is the growing cultural interest in the male body. Newspapers, magazines, and television now carry regular features on men’s health and boys’ fashion and reveal the latest secrets on how to keep the male body youthful, lean, and fit. As cultural interest in the male body has risen, so has the premium assigned to physical attractiveness for adolescent boys and, for that matter, the pressure on boys to invest in their appearances. Because of the increased pressures placed on boys by the media and the fashion and fitness industries, more boys have become self-conscious of their bodies and preoccupied with body-altering behavior such as weight lifting, exercising, and dieting. As with adolescent girls who experience extreme bodily concern, boys struggling with their appearance often take excessive steps to improve their body image. These include taking steroids or other growth-enhancing drugs and using plastic surgery, such as chest implants, hair removal surgery, and even penile engorgement operations. The increasing number of steroid abuse and death cases that stem from the pressure boys face to gain muscle mass illustrates the point that concern over body appearance can no longer be stereotyped as a feminine preoccupation. Today, growing numbers of boys obsess over their looks and perceive their builds as expressions of masculine identity. Most boys and young men make sense of their bodies in comparison to some culturally defined ideal body. In American culture, the ideal is to be muscular. The Charles Atlas ads that first began appearing in men’s magazines and boys’ comic books during the 1940s vividly illustrate this point. Using a comic strip format, the ads describe how a 97-pound weakling overcame physical weakness as a way to fight back against bullies. This story may be simplistic, but few stories have been more instructive in explaining the relationship between the male body and what it means to be “masculine” as that word is popularly conceived. According to the Atlas ad scenario, being masculine means having a formidable presence in the world, one that symbolizes power, control, and invulnerability, not to mention the capacity to exercise violence when required. This implies, as Alan Klein argues, that a well-developed muscular physique not only serves as “a defense against a world perceived as intimidating [but as] a means of overcoming the threat of intimidation, that is, of feeling more attractive, more accomplished, and more in control” (1994, 87). A similar relationship between muscularity and prescriptive masculinity exists in contemporary films, particularly in the action films of American cinema. Since the 1980s, heavily muscled film stars such as Arnold Schwarzenegger and Sylvester Stallone have been successful not only in increasing box office sales but also in bringing a threateningly physical understanding of masculinity to the big screen, embodying what Susan Jeffords calls the “remasculinization of America” (1989). The idea of “remasculinization” refers to the revival of representations of
A Charles Atlas body becomes the ideal for boys, 1940s. (Library of Congress)
masculinity that emphasize being aggressive, in control, and, if provoked, violent. According to Jeffords, efforts to remasculinize America are explicitly intended to redress feelings of social weakness resulting from the American defeat in the Vietnam War and implicitly intended to redress the reduction in social privileges among men and boys resulting from the political gains made by women and girls in recent decades. One outcome of the remasculinization of America has been a mild shift in the on-screen visual presentation of the male body, as bulging muscles, rock-hard physiques, and a fusion of male bodily power with machines (weapons, mostly) have become celebrated features of some of Hollywood’s most extravagant productions. Some early American folktales also emphasize the virtues of male brawn, though in contrast to post–Vietnam War
Muscles and physical strength set off young boys. (Shirley Zeiberg)
male body mythology, these tales often concern the male body’s resistance to industrialized machinery. For example, folk legends such as Paul Bunyan and John Henry are customarily described in physical terms, with an emphasis on their impressive size and strength, and are often portrayed as having assembled the edifice of America through the sheer force of their muscles. The significance of muscles for men and boys has also been a subject in male advice books as far back as the mid-1800s. For example, Daniel Eddy, author of the popular book The Young Man’s Friend, viewed physical strength as the core of male character, saying that “[what] mud sills are to a building, muscular development is to manhood” (quoted in Rotundo 1993, 223). During the same period, another popular advice giver commented that
“the development of a man’s body gives him strength of mind and self-control” (quoted in Rotundo 1993, 224). As these examples imply, the emphasis on muscularity and physical strength is a central message of boyhood socialization. At an early age, boys learn to identify muscles and physical strength as part of what makes them “different” from girls and women. By encouraging boys to read superhero comic books, watch professional wrestling, and play with GI Joe dolls, Americans are contributing to a cultural climate in which muscles and physical strength are treated as rewarding features of masculinity. An examination of contemporary magazines and other media suggests that although muscularity remains the standard against which the male body is judged, less emphasis is currently placed on raw
physical mass in the fashion of Arnold Schwarzenegger and more on being trim, fit, well defined but not excessively large, and physically agile. In other words, the hyperdeveloped muscularity of an Arnold Schwarzenegger is regarded by most people as excessive, but muscularity is still key in appraising the male body. Despite the continuity of this view, the degree to which muscularity is emphasized varies by historical period. For example, in certain periods, less emphasis was placed on muscularity, including just after the Civil War and in the 1960s. In other periods, such as during the late years of the nineteenth century and the early years of the twentieth century and from the 1970s to the early 1990s, body ideals placed an increased emphasis on muscularity and the need to be physically strong. Despite the historical variability in cultural emphasis, the muscular body prevails as a cultural ideal because as a visual presence it symbolizes the predominantly stereotypical views of masculinity. Moreover, most males not meeting the culturally ideal physique (i.e., the muscular physique) express dissatisfaction with some aspect of their bodies. Young men and adolescent boys often carry with them images of both their own body and their ideal body, and these two images are not necessarily identical. Instead, an overwhelming majority of males report that they would prefer to be muscular as opposed to skinny or fat. In Michael Mishkind’s view (1987), most males feel bodily dissatisfaction in comparison to the ideal type because it is believed that those males closest to the ideal reap certain cultural and social benefits not available to those furthest away. The literature on physical attractiveness provides support for this argument—that the muscular body serves as a privileged body—indicating that people (both males and females) who closely resemble cultural standards of beauty receive advantages and opportunities not readily open to others. For example, teachers treat attractive children more favorably and perceive them as more intelligent than less attractive children. Attractive children receive more attention from their peers and are viewed as more popular than unattractive children. Being attractive provides benefits throughout life: “An attractive person is more likely to receive help, to elicit cooperation in conflict situations, and to experience more satisfying interpersonal relationships. And attractive applicants have a better chance of getting jobs and receive higher starting salaries” (Mishkind 1987, 39). In studies asking people to rate the physical beauty of different male body builds, respondents repeatedly identified the muscular physique as more attractive than nonmuscular physiques. Other research has shown that people associate muscular males with positive stereotypes, including happiness, politeness, helpfulness, bravery, strength, masculinity, health, self-reliance, and intelligence. In contrast, respondents described fat males as sloppy, dirty, dependent on others, lazy, lonely, less good-looking, and less intelligent, while viewing skinny males as quiet, nervous, sneaky, afraid, less masculine, weak, and sickly. These stereotypes can be found among a variety of groups, including the lower and middle classes, blacks and whites, children, and young and older adults. In other words, at all levels of society, strong cultural preferences for muscular males and aversions to fat and skinny males appear to exist. Clearly, as long as muscularity serves as a key standard to assess the male body,
how a boy feels about himself will depend in part on how he measures up to this standard. Indeed, several studies have found a direct correlation between self-esteem and having a muscular body, indicating that young men and teenage boys with less athletic frames have lower levels of life satisfaction than males with muscular builds. The implication is that there are great social-psychological costs for not fitting the cultural ideal. Yet, as Barry Glassner argues (1995), there is also great variation in how males respond to the physical ideals placed upon them. Some boys actively build up their bodies to compensate for physical weakness. For these boys, building the body serves to build self-esteem and express control. Other boys learn to modify or even reject stereotypical standards of physical appearance. Rather than basing self-esteem on their looks, many boys develop other attributes to obtain this goal. For these boys, body image plays a less direct role in their lives. Concerns over body image are strongest at times when men and boys feel uncertain about their masculine identity. The late nineteenth and early twentieth century was one such period in American history. As the nation shifted from a primarily rural, agricultural base to one rooted in manufacturing in the cities, the structural foundations on which masculinity was based began to erode. Prior to this period, most men enjoyed control over domestic life and the labor process, and boys had opportunities for property ownership through inheritance. Among the characteristics attributed to masculinity were ruggedness, self-reliance, and independence, especially economic independence. However, with the rise of industrial capitalism and the end of westward expansion, men were increasingly expected to specialize as family wage laborers outside the home. Unlike self-employed farmers and businessmen, wage laborers are vulnerable to firings or periodic layoffs and have limited access to property and little control over the products of their labor. The forces of industrialization and urbanization, combined with the growth of an active women’s movement pushing for equality in public life, contributed to a crisis in masculine identity (especially among white, middle-class males). These developments made it more difficult for most men and boys to demonstrate their commitment to traditional masculinity. Although responses to these changes varied tremendously, a significant number of males sought to revitalize American manhood by creating homosocial institutions where traditional masculinity could be preserved. These included the institution of organized sports, especially “combat sports” such as boxing and football, and organizations in which traditional manliness could be instilled in boys, such as the Young Men’s Christian Association (YMCA) and the Boy Scouts of America. The effort to retrieve masculinity from the “feminizing” forces of industrial society also included the muscular Christianity movement. Central to these counterdevelopments was a new and more vigorous image of the male body, one based on physical strength, force, and aggression. By grounding masculine status in bodily performance, males were able to express and maintain qualities associated with atavistic notions of masculinity. As new social pressures have created additional barriers to the pursuit of masculine identity, the male body remains one of the few sites in which males can
redress perceived threats to their masculinity. In the 1970s, most men began to encounter serious challenges to their role as family provider. Much of this stemmed from change in the economy. With economic growth slackening in the 1970s, the earning power of the male wage steadily declined, making it more difficult for most households to maintain their standard of living on one paycheck alone. As part of an effort to supplement their husbands’ stagnant wages, married women increased their labor force participation to unprecedented levels, with the most significant increase attained by those with the youngest children. These factors, along with the reemergence of the women’s movement and the increasingly optional character of contemporary marriage, have further complicated men’s social ties to traditional masculinity. The media and consumer culture continue to promote traits associated with traditional masculinity, such as control and dominance, but for a growing number of males these traits are unattainable. In order to deal with and respond to this more recent crisis in masculine identity, many males now seek out symbolic resources with which to rebuild their identity and validate their manhood. At a time when male privileges are on the decline and women and girls have made significant social and political gains, the muscular body appears to provide a reasonably reliable foundation on which to prop up men’s and boys’ sagging identities, to preserve some semblance of traditional male-female differences, and to ward off the erosion of male power by associating males and maleness with control and dominance. How boys make sense of and cope with body image is also influenced by differences and inequalities in race, class, age, sexual orientation, and disability. Although growing numbers of boys care about the health, shape, and appearance of their bodies, not all have equal amounts of time and resources to invest in their bodies, nor do all individuals feel the same kind of social pressure to work on their bodies. Social class plays a particularly important role in how males come to terms with body image. For example, working-class men often place greater emphasis on muscularity and physical strength than middle- and upper-class men, a distinction that stems from the position working-class men occupy in the economy. Working-class occupations often involve physical labor, thus requiring strength, stamina, and toughness. As such, the bodily capacities of working-class men serve as their primary economic asset on the labor market. Boys who grow up in working-class communities often learn to place high value on heavily muscled, powerful builds. By contrast, middle- and upper-class boys often learn to emphasize bodily traits that are not directly tied to and necessary for physical labor. For working-class boys, a physically large, muscular physique is also an important means of masculine self-expression within an otherwise limited structure of social and economic opportunity. By literally embodying stereotypical masculine traits such as strength, dominance, and virility, working-class boys are able to respond to the limits that class inequality places on their other opportunities for self-expression. Middle- and upper-class boys, however, have greater access to other compensatory resources (besides the body) on which to base their masculinity. Rather than endorsing bodily attributes associated with
traditional masculinity, many support less conventional body images as a way to distinguish themselves and their class status.
Chris Wienke
See also Boxing; Boy Scouts; Comic Books; Football; Muscular Christianity; Sports; Superheroes; Young Men’s Christian Association
References and further reading
Connell, Robert. 1995. Masculinities. Berkeley: University of California Press.
Glassner, Barry. 1995. “Men and Muscles.” In Men’s Lives. Edited by Michael Kimmel and Michael Messner. Boston: Allyn and Bacon.
Jeffords, Susan. 1989. The Remasculinization of America: Gender and the Vietnam War. Bloomington: Indiana University Press.
Klein, Alan. 1993. Little Big Men: Bodybuilding Subculture and Gender Construction. Albany: State University of New York Press.
———. 1994. “The Cultural Anatomy of Competitive Women’s Bodybuilding.” In Many Mirrors: Body Image and Social Relations. Edited by Nicole Sault. New Brunswick, NJ: Rutgers University Press.
Mishkind, Michael. 1987. “The Embodiment of Masculinity: Cultural, Psychological, and Behavioral Dimensions.” In Changing Men: New Directions in Research on Men and Masculinity. Edited by Michael Kimmel. Newbury Park, CA: Sage.
Rotundo, Anthony. 1993. American Manhood: Transformations in Masculinity from the Revolution to the Modern Era. New York: Basic Books.
Shilling, Chris. 1993. The Body and Social Theory. London: Sage.
White, Phillip, and James Gillett. 1994. “Reading the Muscular Body: A Critical Decoding of Advertisements in Flex Magazine.” Sociology of Sport Journal 11: 18–39.
Wienke, Chris. 1998. “Negotiating the Male Body: Men, Masculinity, and Cultural Ideals.” The Journal of Men’s Studies 6, no. 2: 255–282.
Books and Reading, 1600s and 1700s
Seventeenth-century colonial boys had little to read besides primers and the Bible, but by the eighteenth century folktales, classical myths, poetry, books of manners, spellers, and small children’s books with pointedly moral lessons became available. Most reading material in colonial America was written by Europeans and was either exported to the colonies or reprinted (no copyright permission necessary) by the few colonial printers who were licensed by the crown. Clerics compiled conversion narratives to encourage dutiful behavior and teach Calvinist theology. Gentlemen of standing penned instructions on manners, fashion, morality, education, and travel. Elite households contained classical, theological, and contemporary European works that afforded literate boys further reading pleasure. John Newbery, who introduced small fictional stories for children about 1744, became Britain’s best-known eighteenth-century printer and author, and his books were widely distributed in the colonies. Missionaries often spread literacy to non-European Americans, and free people of color started their own schools. After the Revolution, Noah Webster standardized reading instruction and spelling and introduced some American history through his popular schoolbooks. Because of the Puritan imperative that Christians read and study the word of God, the New England colonies probably had the highest literacy rate anywhere in the world in the seventeenth century. The Massachusetts Bay Colony mandated that all children be taught to read, either at home, at school, or when apprenticed to a master. Like their European counterparts, colonial children first used a hornbook, or battledore, to master
the letters of the alphabet and numbers. The hornbook was actually a small board covered with a sheet of paper, usually containing the alphabet, numbers through ten, and the Lord’s Prayer, all covered with a thin sheet of transparent horn for permanence. After children learned these hornbook symbols, they received their first books, primers. Progressing from simple consonant and vowel sounds to three-letter words to simple maxims, children could read fluently before ever learning to write. British primers sold well in the colonies until the compilation of The New England Primer in 1690. Its use of rhyming verses like “In Adam’s Fall, We sinned All” (Ford 1897, 37) and its morally instructive biblical quotations arranged in alphabetical order by first word for easier memorization made The New England Primer the perfect instructional tool for Puritan society. Additional texts usually included a catechism, prayers, the Apostles’ Creed, and accounts of Protestant martyrs. Printers reproduced The New England Primer with slight variations well into the nineteenth century. Outside of primers, boys’ reading in colonial America remained limited. John Cotton developed a shorter Calvinist catechism than was traditionally used in England and Scotland in Spiritual Milke for Babes; in Either England: Drawn out of the Breasts of Both Testaments . . . (1646), which was the first book for children actually written in the American colonies. Instead of more than 100 theological questions, Spiritual Milke presented only 64 questions and answers to be memorized. James Janeway’s A Token for Children (1671) provided young readers with accounts of pious childhood conversions and tearful deathbed testimonies of faith.
Many colonial boys learned to read from The New England Primer. (Library of Congress)
Cotton Mather found this work so powerful a tool for religious education that he crafted similar accounts of New England children to be added to the printing of A Token for Children in America. British author John Bunyan’s The Pilgrim’s Progress (1678) was second only to primers in popularity in New England. All of these works may have been read to children as often or more often than they were read by children themselves. Those boys who had access to education beyond a grammar school would have been introduced to Latin and Greek, particularly the works of Cato, Virgil, Cicero, Erasmus, and others
from the ancient world who stressed morality and character. The colonies around the Chesapeake and farther south contained few Calvinists; Southern educational efforts targeted the sons of gentlemen rather than all children. Wealthy planters hired tutors who provided the same classical education boys might receive on the Continent. Boys studied Latin grammar and texts as well as treatises on religion and natural philosophy. Local clerics accepted the responsibility for teaching the Church of England’s catechism, and most homes contained The Book of Common Prayer and British primers. Accomplished readers enjoyed such works as Plutarch’s Parallel Lives, Christopher Marlowe’s The Tragicall History of Dr. Faustus (1604), medieval romances, or perhaps the first American captivity narrative, The Sovereignty and Goodness of God by Mary Rowlandson (1682). The eighteenth century brought a greater variety of reading material to colonial boys, most notably the children’s stories printed and often written by John Newbery of London. A Museum for Young Gentlemen and Ladies (1750), The Pretty Book for Children (1743), and The History of Little Goody Two-Shoes (1765) were among the most popular titles that made their way to the colonies. In the latter, the protagonists, an orphaned brother and sister, exemplify industriousness, kindness, thrift, and obedience to authority as they make their way in the world and are justly rewarded by achieving considerable status and comfort. Unlike the deathbed conversion stories in Janeway’s Token, Newbery’s children found that good moral character brought worldly rewards as well as spiritual ones. In addition to fiction, Newbery published informational books for children on such
topics as the natural world, historical events, and animal behavior. American printers like Isaiah Thomas (Boston) and Hugh Gaine (New York) pirated popular British works and marketed them in the colonies, sometimes altering the texts only slightly and adding their own woodcuts for illustration. Religious works for children remained popular among those few who could afford to purchase them. Isaac Watts's Divine Songs (1715) was widely reprinted intact; individual poems also found their way (uncredited) into other American children's books. Etiquette manuals were staples in elite society just as they were in England, including the 1775 work Principles of Politeness, and of Knowing the World by Philip Dormer Stanhope, fourth Earl of Chesterfield, and Daniel Defoe's The Complete English Tradesman (1725–1727), which enabled boys of the middling classes in the few seaboard cities to perfect their gentlemanly manners and business acumen. American versions of Tales of the Arabian Nights and Charles Perrault's "Cinderella" and "Little Red Riding Hood" also circulated. The survival of so few early American children's books is less a testimony to their scarcity than to the multiple readership that each book undoubtedly had.
No books designed to teach colonial children to read or to introduce them to literature were created specifically for Native American or African American children. Missionaries in New England's "praying towns" often sought to translate parts of the Bible into a written form of the native language, taught the basic English alphabet to native children, and tried to establish English schools as well. Not until the nineteenth century did Native Americans appear as characters (usually unsympathetic ones) in children's fiction.
Some African American boys, both slave and free, undoubtedly were tutored in reading, if not writing, by clergymen or slave mistresses. The Anglican Society for the Propagation of the Gospel established a school for slaves in South Carolina around 1740, but slaveholder opposition seriously hindered literacy for African Americans. Postrevolutionary concern over a literate citizenry led to numerous calls for state programs of universal education and certainly spurred literacy efforts nationwide. New Jersey and New York even mandated that masters teach slave children to read. More commonly, African Americans took it upon themselves to learn to read and founded their own schools in Newport, Boston, and Charleston, for example.
The American Revolution disrupted the importation of British books but did not lead to a stampede to produce American stories for American children. Many long-standing primers and stories were given quick cosmetic surgery before being reprinted for citizens of the new United States. Even before the Revolution, The New England Primer's verse for the letter "K" had been modified from "King Charles the Good, no man of blood" to "Kings should be Good, no men of blood." After the Revolution, the verse for "K" became "The British King lost states thirteen" (reprint edition, 1843).
The Revolution's greatest impact on children is found in the creation of new textbooks. Noah Webster published a progressive series of texts in 1783 designed to move the student from sounding out initial consonants to pronouncing rhyming words to reading sophisticated political essays. At each level the student encountered material on the development of moral character, the superiority of republican government, and the pleasure of civic duty.
Webster's American Spelling Book, commonly called "Old Blue-back" for its binding, became the principal text at the common school (a free public school) level. In his more advanced reader, The Little Reader's Assistant (1790), he included a federal catechism to teach children the benefits of participatory democracy. Webster also took pains to broaden the cultural base of his readers by including stories of Native American peoples, depicting the abuses of slavery, and describing the injustices perpetrated against those so held. By emphasizing spelling as part of the reading process, Webster achieved a standardization of American English, which he formalized in his 1828 American Dictionary of the English Language, thus also helping to democratize and broaden the American experience. Jedidiah Morse's Geography Made Easy (1784) and Nicholas Pike's Arithmetic (1788) complemented Webster's works as the basic texts for schoolboys.
For those who were apprenticed to a farmer or craftsman before they learned much more than basic reading and ciphering, the most commonly read books were the Bible and the Farmer's Almanac. Although the latter may have been the farmers' guide to planting and weather forecasting, it, like primers and children's stories, also contained moral advice, pithy sayings, Bible verses, and character-building tales. Poor Richard's Almanac, which Benjamin Franklin began publishing in 1733, was but one of many popular versions.
Perhaps no book thrilled the young American boy as much as The Life and Memorable Actions of George Washington (1800) by Mason Locke Weems, better known by his pen name of "Parson Weems." Capitalizing on the tremendous popularity of the just-deceased first president, Weems created a biography more fiction than fact.
He invented the incident of young George and the cherry tree, as well as one in which hurriedly planted cabbage plants spell out "George Washington" when they mature. These myths appeared repeatedly in nineteenth-century biographies and histories of the nation's first president.
Gail S. Murray
References and further reading
Avery, Gillian. 1994. Behold the Child: American Children and Their Books 1621–1922. Baltimore: Johns Hopkins University Press.
Ford, Paul Leicester, ed. 1897. The "New England Primer": A History of Its Origin and Development. New York: Dodd, Mead.
Halsey, Rosalie V. 1969. Forgotten Books of the American Nursery: A History of the Development of the American Story-Book. Boston: Charles Goodspeed, 1911; reprint, Detroit: Singing Tree Press.
Kiefer, Monica. 1948. American Children through Their Books, 1700–1835. Philadelphia: University of Pennsylvania Press.
Murray, Gail Schmunk. 1998. American Children's Literature and the Construction of Childhood. New York: Twayne Publishers, Prentice-Hall International.
The New England Primer Improved for the More Easy Attaining the True Reading of English. 1843. I. Webster, publisher.
Pickering, Samuel F., Jr. 1993. Moral Instruction and Fiction for Children, 1747–1820. Athens: University of Georgia Press.
Books and Reading, 1800s
A distinctive boys' literature began to develop in the United States in the mid-nineteenth century. Instead of reading
the didactic moral tales imported from England in the first part of the century, boys read about the frontier, foreign travel, war exploits, urban adventures, and boyhood pranks. Although plot development often overshadowed complexity of character, many stories still encouraged cultivating gendered character traits. In the common schools (free public schools established in the early nineteenth century) in the Midwest, young boys memorized passages from William McGuffey's readers that were heavily laced with moral guidance. Middle-class and affluent boys could subscribe to various magazines, ranging from the wholly religious to the eclectic, and working-class adolescents found reading pleasure in dime novels. American adventure stories and fantasy competed with European classics. Juvenile periodicals flourished in an increasingly child-centered culture. Little writing for children tackled the major social problems of the century: slavery, intemperance, ethnic discrimination, and urban poverty.
British children's stories appeared more frequently in the early nineteenth century than did works by American authors. Heavily didactic, these stories featured unnatural children with such names as Jack Idle and Mary Ann Selfish, whose transparent behaviors and moral choices were designed to impress the child reader with those virtues needed to build strong republican citizens. Some stories disparaged inordinate wealth, which often led to slovenliness, whereas hard work and thrift brought pecuniary as well as spiritual rewards. Others stressed honesty, responsibility, kindness, steadfastness, and obedience to authority. These didactic stories also stressed the importance of rational thought over fantasy, of factual information over imagination.
Society's attachment to didactic literature thus initially discouraged imaginative and creative stories for young boys. The transition from moral didacticism to pure adventure came gradually and is best seen in the works of "Peter Parley," pen name of Samuel Goodrich, the first American to make a career out of writing for children. The Tales of Peter Parley about America (1827), Stories about Captain John Smith (1829), and the dozens of books that followed use Parley as a grandfatherly storyteller who comments on the narrative he unfolds. Some 107 titles provided young boys with incentives to read for pleasure while instructing them about such diverse topics as astronomy, music, geography, mythology, manufacturing, and history. In addition, Goodrich impressed upon the young reader those same moral codes trumpeted in earlier British books. Goodrich founded and briefly edited Peter Parley's Magazine before he joined the staff of the popular Robert Merry's Museum (1851–1854). These periodicals were founded in the early republic period to meet growing middle-class tastes for self-improvement and character building.
Some adult fiction that featured frontier heroes became a staple of early nineteenth-century boys' reading as well. James Fenimore Cooper's vivid portraits of European–Native American encounters in The Last of the Mohicans (1826) and other frontier adventures in The Pioneers (1823), The Prairie (1827), and The Deerslayer (1841) were erroneously believed to be historically accurate and were often assigned by teachers. Regardless, boys enthusiastically followed the adventures of Natty Bumppo. Cooper's contemporary, Washington Irving, recast European folktales to an upstate New York setting in The Sketch Book (1819), with its classic stories "Rip Van Winkle" and "The Legend of Sleepy Hollow."
Leatherstocking leading an advance in the forest. Illustration from The Last of the Mohicans by James Fenimore Cooper. (Library of Congress)
Boston headmaster Jacob Abbott used his experience in the classroom, combined with a pious nondoctrinal Christianity, to publish children's books based on child-development theory. His first Rollo series (1834–1843) followed the physical, intellectual, and moral development of young Rollo from age three to maturity. A second series (1853–1858) took Rollo on Atlantic voyages and to European cities. Abbott's
other children's books included the Franconia Stories, the Jonas series, and the Lucy books. In all these books, young readers could readily identify with the protagonist's mistakes and learn from them without the intrusion of the didacticism of earlier stories or the authorial voice of the Peter Parley books. The Oliver Optic series by William Taylor Adams (116 books in all) also featured boys' adventure tales, but with less didacticism and fewer moral messages. Like Peter Parley, "Oliver Optic" used an older male figure to guide the young protagonist, but not all Adams's heroes came to a good end or profited by their mistakes. Some critics deplored the sensationalism of the Oliver Optic plots and criticized their subtle moral vagueness. But young boys found them engaging, particularly because the main character made his own way in the world without constraining authority or punishment. Adams incorporated school, work, and travel adventures in his vast production of boys' books.
As popular as Goodrich, Abbott, and Adams were, their fame paled next to that of the most famous boys' author of the century, Horatio Alger, who wrote more than 100 titles for young adults. Ragged Dick (1867) follows the adventures of fourteen-year-old Dick Hunter, who lives on his own as a bootblack in New York City. Seeking to better his circumstances, he forgoes gambling and smoking in order to save his earnings, attend night classes, and assist friend and stranger alike. All along the way, good luck and chance meetings with well-placed gentlemen enable Dick to achieve his goal by landing an office position. Even though the name Horatio Alger is often associated with a "rags-to-riches" motif, none of his heroes actually became wealthy. Rather, they utilized nineteenth-century values like thrift, perseverance, and honesty to become modestly self-sufficient and respectable members of middle-class society.
An assortment of "bad boy" stories—full of mischief, pranks, and misbehavior—began with Thomas Bailey Aldrich's semiautobiographical story set in New Hampshire, The Story of a Bad Boy (1870). Newspaper columnist George W. Peck added an urban version of this theme with several collections of tales under the title of Peck's Bad Boy and His Pa (1883). But the most famous literary pranksters of the nineteenth century were undoubtedly Tom Sawyer and Huck Finn. These creations of Mark Twain (Samuel Clemens) reflect sheer devilment, which no amount of punishment or practical experience could stifle. Critics at the time found The Adventures of Tom Sawyer (1876) and The Adventures of Huckleberry Finn (1884) wholly unsuitable for young readers, for Clemens turned the formula of bad-behavior-brings-punishment on its head as he celebrated the pranks and crimes that the youngsters perpetrated on their befuddled elders. The moral certainty found in most nineteenth-century children's fiction eludes the reader of Twain's classics.
Schoolbooks constituted the bulk of reading material for young men in the first half of the century. Those boys who attended academies and colleges found the curriculum and reading material similar to those from a century earlier. By midcentury a new series of schoolbooks was available, specifically designed to streamline literacy acquisition for the busy farm boy while also inculcating the moral standards and democratic ideals necessary for the expanding United States. McGuffey's Eclectic Readers (1836–1837) taught millions of Americans to read and spread McGuffey's unique combination of moral certitude, religiosity, capitalism, and democracy.
The series came out in a new edition in 1879 and remained a bestseller well into the twentieth century. As abundant as schoolbooks were, the thousands of children's books distributed inexpensively by the American Sunday School Union (ASSU) during the evangelical Protestant resurgence early in the century were just as common. An interdenominational effort, the ASSU published explicitly Christian stories and tracts for young people. At a time when towns usually had no public libraries, schools or churches could afford to buy a set of books called a "library" from the ASSU for as little as $10. By 1830, the ASSU claimed sales of some 6 million books. Although some authors returned to earlier Puritan themes, others like Jacob Abbott displayed a more civic religion based on moral certainty. All stories contained warnings of dire consequences for the disobedient, lazy, or improvident reader. The ASSU joined the growing market in juvenile magazines with the publication of the Infant's Magazine for young children and the Youth's Friend and Scholar's Magazine for older children. Like other Sunday school publications, these periodicals stressed biblical knowledge, character building, attendance at worship, and proper decorum and manners. The Youth's Companion, the longest-running juvenile periodical (1827–1929), was an inexpensive four-page weekly that rivaled the popular Our Young Folks magazine. Oliver Optic's Magazine, launched shortly after the Civil War, introduced more secular themes, a greater variety of heroes, and lots of boyhood adventures. The popularity of these periodicals attests to the far greater number of boys who had the financial means and the leisure time to enjoy regular recreational reading.
Working-class boys and recent immigrants were not ignored by the flourishing publishing industry. Hundreds of dime novels poured forth from the presses of Beadle and Adams, George Munro, and Street and Smith. Cheaply produced in a small format (4 inches by 6 inches), these books brought crime and adventure stories to a mostly male, working-class readership. The novels were often condensed and sold as a series of soft-covered pamphlets called "yellow-backs." Edward Stratemeyer, who developed his own publishing empire in the twentieth century, wrote early boys' detective stories for the Nick Carter series. Teachers and librarians deplored these dime novels for their lurid illustrations, formulaic plots, settings in gambling parlors and dance halls, and unrepentant heroes. The novels introduced boys to a pseudo-adult world that middle-class boys, in particular, found intriguing and alluring. Many a young boy chafed under the velvet knickers and lace shirts popularized by Frances Hodgson Burnett in Little Lord Fauntleroy (1886), yet boys avidly consumed the fairy-tale story of a poor American child who found himself the heir to a British fortune.
Boys' reading also included some fantasy and imaginative fiction, with the most entertaining being The Brownies: Their Book (1887) and its twelve sequels by Palmer Cox. These rhyming stories recounted the nighttime adventures of these fantastical creatures, as when they played in the schoolroom after dark and parodied the teacher. Cox used pen and ink to sketch his brownies among the printed verses, in the margins, and peeking around the corners of the page. In addition to these newly penned fantasies, versions of fairy tales like Cinderella and Tom Thumb circulated widely along with classics like Robinson Crusoe by Daniel Defoe (1719) and Robin Hood.
Early fantasy reading for boys: the brownies taking banners from a fort. Illustration from The Brownies: Many More Nights by Palmer Cox. (Library of Congress)
Artist Howard Pyle wrote and illustrated particularly popular versions of The Merry Adventures of Robin Hood (1883) and The Story of King Arthur (1903).
The Civil War provided setting and theme for dozens of boys' books. Frank's Campaign (1864) launched Horatio Alger's career in boys' literature, and Adams produced a set of books, beginning with The Soldier Boy (1863), about various boys and their wartime adventures. It was the children's magazines, however, that succeeded in projecting patriotism, suffering, self-denial, and war fervor directly into American homes.
Rare was the child who escaped knowing about the war and feeling obligated to contribute in some way. Well after the conflict ended, southern authors kept romanticized war motifs alive. Thomas Nelson Page's Two Little Confederates (1888) describes a highly sentimentalized war and plantation life complete with happy slaves. Postbellum authors often heralded the Old South and portrayed slavery as a benign economic system. Joel Chandler Harris's book Uncle Remus: His Songs and His Sayings (1880)
used a stereotypical former slave to recount authentic African American folktales. Originally character sketches for the Atlanta Constitution, Harris's stories were popular with adults in both North and South, for they implied that no discord existed between races or regions following the war. The several Uncle Remus books were probably purchased mostly by adults and perhaps read or told to children because the dialect Harris employed is difficult to decipher. After the war, African Americans flocked to any and all schools established, but they encountered no literature written especially for them or about their experiences.
The major social concerns of the nineteenth century—slavery, intemperance, ethnic discrimination, and urban poverty—made few appearances in children's literature. Lydia Maria Child carried some abolitionist stories in the periodical Juvenile Miscellany (1826–1834), and the American Sunday School Union advocated temperance in such tracts as The Child's Book on Intemperance (1832) and The Glass of Whiskey (1825). Boys also saw live performances of the most popular temperance work ever produced, Ten Nights in a Bar Room (1854) by T. S. Arthur. Urban poverty permeates the Horatio Alger books, but its elimination rests largely on the improvement of character. Ethnic minorities, especially the Irish, appear in Peck's Bad Boy as intemperate and argumentative; the Polish grocer is an often dishonest "Polacker."
Gail S. Murray
References and further reading
Avery, Gillian. 1994. Behold the Child: American Children and Their Books 1621–1922. Baltimore: Johns Hopkins University Press.
Denning, Michael. 1987. Mechanic Accents: Dime Novels and Working-Class Culture in America. New York: Verso Press.
Elson, Ruth Miller. 1964. Guardians of Tradition: American Schoolbooks of the Nineteenth Century. Lincoln: University of Nebraska Press.
Gorn, Elliott J., ed. 1998. The McGuffey Readers: Selections from the 1879 Edition. Bedford Series in History and Culture. Boston: Bedford/St. Martin's Press.
MacLeod, Anne Scott. 1975. A Moral Tale: Children's Fiction and American Culture, 1820–1860. Hamden, CT: Archon Books.
Marten, James. 1998. The Children's Civil War. Chapel Hill: University of North Carolina Press.
Murray, Gail Schmunk. 1998. American Children's Literature and the Construction of Childhood. New York: Twayne Publishers, Prentice-Hall International.
Nackenoff, Carol. 1994. The Fictional Republic: Horatio Alger and American Political Discourse. New York: Oxford University Press.
Books and Reading, 1900–1960
The first half of the twentieth century saw a lucrative market develop in boys' books, featuring literature distinct from the multigenerational reading of the Victorian era. Fantasy and science fiction became popular genres during this period, and books for very young readers proliferated. Publishers brought out books on boys' travel adventures, boarding school escapades, and sports and mysteries in series formats with continuing characters. Series books were sometimes set in foreign locations, especially during wartime, but many centered on "All-American" towns, schools, and sports teams. Racial and ethnic stereotyping permeated many of the series books. The popularity of juvenile periodicals gave way to series books and comic books,
although Boys' Life and Highlights for Children were twentieth-century introductions. Many midcentury books presented realistic family stories from white, middle-class America. Racial and ethnic minorities found little literature that represented life as they knew it. Theodor S. Geisel, writing as Dr. Seuss, introduced young readers to amusing plots and memorable characters based on his own whimsical drawings.
American children's fiction had been largely bereft of fantasy, but the dawn of the twentieth century saw the creation of one of the greatest children's fantasies of all time, The Wonderful Wizard of Oz by L. Frank Baum (1900). Dorothy Gale's whirlwind journey from Kansas to Oz brought out those attributes familiar to readers of boys' nineteenth-century fiction: courage, single-mindedness, fearlessness, and perseverance. Her companions on the yellow brick road—the Tin Woodman, the Cowardly Lion, and the Scarecrow—all sought to become better or stronger men. Besides the quest-for-wholeness theme, scholars have read the Oz fantasy as a symbolic depiction of late-nineteenth-century political struggles, with populism warring against big business and political opportunism. However, Baum himself contended that the novel simply grew out of bedtime stories he told his sons. The publisher sold 100,000 copies of Baum's fantasy before the year was out, and the book has remained in print ever since. Metro-Goldwyn-Mayer produced a film version starring Judy Garland in 1939. Although Baum wrote numerous sequels set in Oz, none measured up to the original in creativity or popularity.
Much nineteenth-century fiction had appealed to adults and children alike, but the twentieth century saw the canonization of a distinctive children's literature,
one promoted by publishers, librarians, and schoolteachers alike. National Children's Book Week was launched in 1919 to promote "wholesome reading" over dime novels and to increase the audience for the new children's divisions found in all major publishing houses. The annual Newbery Medal was created in 1922 to honor the best work in children's literature, and the Horn Book Magazine was founded in 1924 to review children's books for parents and teachers. Many of the juvenile periodicals launched in the late nineteenth century continued to attract boy readers in the early twentieth century, especially St. Nicholas magazine, which featured favorite authors like Mary Mapes Dodge (who also edited the magazine), Robert Louis Stevenson, Howard Pyle, and even Teddy Roosevelt. Boys' Life, featuring outdoor activities as well as inspiring stories, was created in 1912 as an outreach of the Boy Scouts of America movement. Highlights for Children appeared in 1946 as an interactive educational and entertainment periodical.
The first boy hero of the twentieth century was probably Frank Merriwell, the creation of Gilbert Patten, who wrote under the pseudonym of Burt L. Standish. Frank Merriwell's adventures occurred at boarding school, a setting made popular by the British book Tom Brown's School Days (1857) and its many English imitators. Popularity, sports, bullies, and antagonistic teachers made up the main issues and cast of characters in Patten's stories. After exploiting all the school adventures and sporting activities imaginable, Patten filled subsequent books with the activities of Frank's brother Dick and eventually his son Frank, Jr. The novels emphasized forging strong peer relationships, overcoming adversity, and being transformed by team spirit.
Illustration showing Dorothy scolding the Cowardly Lion, from The Wonderful Wizard of Oz, one of the most popular fantasy stories of all time. (Library of Congress)
Between 1901 and 1916, Patten created 208 Merriwell books. Edward L. Stratemeyer, writing as Arthur M. Winfield, also used the basic premise of boys at school, but he expanded the format into tales of high adventure in his Rover Boys Series, which began in 1899. The far-flung adventures of Dick, Tom, and Sam Rover were less about school and more about summer vacations and holiday escapades. After taking the three boys through adolescence, marrying them off, and settling them in the same wealthy neighborhood, Stratemeyer wrote a second series following the adventures of the boys' four sons (and ignoring the two daughters). By 1926, the series contained thirty titles and had spun off additional stories of their boarding school, the Putnam Hall Series. Stratemeyer's business success lay in the founding of the Stratemeyer Literary Syndicate in about 1906, a corporation that employed a pool of authors to flesh out plots for stock characters. The authors wrote under pseudonyms and for a flat fee, receiving none of the royalties from books that sometimes stayed in print for three generations.
Closely related to the school stories were a host of sports sagas in which life lessons were mastered on the playing fields. Like British fiction, American sports heroes and student leaders originally came from privileged backgrounds and attended private schools. However, by World War I, boys' stories came to be set principally in public schools. Ralph Henry Barbour's The Half-back: A Story of School, Football, and Golf (1899) launched a career of sports-story authorship that lasted until 1943 and included 160 titles. His characters followed an unwritten "sportsman's code" that taught
courage, fairness, equality, and leadership both on and off the playing field.
Authors trumpeted the new "modern age" of the early twentieth century most prominently in their emphasis on technology, invention, and aviation. The Tom Swift Series (1910–1941) featured young Tom as a mechanical genius. By the end of the forty-volume series, he had moved from fixing engines to building televisions and observing outer space through his telescopes. The authors of the Tom Swift books not only taught the reader about the fascinations of science and technology but also engaged the reader in foreign travel, marketing and business decisions, and patriotic Americanism. Spin-offs from this series included the Moving Picture Boys Series (1913–1922) and the Tom Swift, Jr., Series (1954–1971), which found Tom's son engaged in space labs and international science activities. Aviation series became popular with boys during World War I; after Charles Lindbergh's heroic flight across the Atlantic, the Ted Scott Flying Series (1927–1943) created a similar fictional hero.
Whatever information one could learn from these travel and technology series books, teachers and librarians did not consider them authentic children's literature, regardless of the thousands of titles sold. They were not reviewed in The Horn Book, nor did their titles appear on recommended reading lists. The same is true for the most popular boys' series of the 1900–1960 period, the Hardy Boys Mysteries, which began in 1927 with Stratemeyer Syndicate author Leslie McFarlane and The Tower Treasure. Like other syndicate series, this one featured prolific travel in fast cars, motorcycles, and airplanes as the teenage brothers Frank and Joe Hardy sought to
solve mysteries and apprehend criminals, for which they were handsomely rewarded. The plotting of this series was far more complex than previous boys' adventures, with multiple story lines woven together in the final crime solution. Unlike the Rover Boys, the Hardy brothers did not age but continued as adolescent crime solvers in Bayport for thirty-some years. The syndicate began revising the series in 1959, updating scenes, removing racial and ethnic slurs, and often shortening the text. An entirely new Hardy Boys series was launched in 1987.
Children's authors did not often write directly about the Depression of the 1930s, but the theme of survival over adversity certainly permeates the Little House books by Laura Ingalls Wilder, the first of which, Little House in the Big Woods, appeared in 1932. Although the Ingalls family consisted of daughters only, boy readers enjoyed the frontier adventures, peer relationships, and the basic patriotic themes of the Wilder sagas. Other family stories popular in the first half of the twentieth century include Elizabeth Enright's stories of the Melendy family, like The Saturdays (1941), and Eleanor Estes's Moffat family series. Both authors represented middle-class Americans making the best of diminished circumstances in close-knit neighborhoods with interesting and clever siblings and pals. Rather than follow the same family through various adventures, Lois Lenski's numerous family-centered books describe families living in different historical periods and in various parts of the United States, including rural Florida, Appalachia, South Dakota, and the Oklahoma oilfields.
Theodor S. Geisel, better known as Dr. Seuss, had his first book, And to Think That I Saw It on Mulberry Street (1937),
rejected by twenty-eight publishers before it was accepted by Vanguard. By the time of his death in 1991, Geisel had sold over 200 million books, making him the most popular author of the century. Whether using prose or poetry, his stories emerged as both iconoclastic and nonsensical, and he illustrated them himself with imagination and whimsy. The Cat in the Hat (1957) was created to give Random House a more lively beginning reader, Yertle the Turtle (1958) indirectly took up the problem of dictatorship, and The Lorax (1971) was an impassioned plea for environmentalism. Yet all Seuss books remain, first and foremost, enormously entertaining. More realistic animal fables also attracted young readers at midcentury. Essayist E. B. White charmed children with the antics of a mouse (Stuart Little, 1945), a pig and spider (Charlotte's Web, 1952), and a mute swan (The Trumpet of the Swan, 1970).
Authors noted the United States' entry into both world wars by taking characters from an established series book and placing them near the war (although not usually in it) or in Red Cross or other war-support efforts. Well after World War II, some prominent authors like John Tunis developed moving war novels like Silence over Dunkerque (1962). The war novel Johnny Tremain (1943) by historian Esther Forbes was actually about the American Revolution, but its purpose was to encourage patriotism and the defense of liberty. For actual war-themed fiction during World War II itself, boys had to turn to the superheroes of the comic books, a genre both ignored and condemned by children's literature specialists. Comic hero Superman made his appearance in Action Comics in 1938 and was soon followed by Batman and Captain Marvel. Spy Smasher appeared
in 1942 as numerous superheroes took up war themes. Technology, war, and good-over-evil themes also characterized the science fiction genre, whose popularity soared with Robert Heinlein's ventures into young adult fiction, beginning with Rocket Ship Galileo (1947).
One characteristic shared by series adventures, family stories, detective series, and science fiction was their almost complete lack of interest in portraying racial and ethnic minorities in any but the most disparaging and stereotypical ways. To eliminate the most offensive abuses, the Stratemeyer syndicate rewrote most of the Hardy Boys and Nancy Drew series in the late 1950s. African Americans, Native Americans, and Latinos rarely found their lives and aspirations accurately portrayed in children's fiction. A few African American writers produced some children's books, such as You Can't Pet a Possum (1934) and Lonesome Boy (1955) by Arna Bontemps and biographies by Carter Woodson. W. E. B. Du Bois started a short-lived magazine for African American children called the Brownies' Book in 1920. Florence Crannell Means wrote sensitively about Mexican American children in Rafael and Consuelo (1929), Native Americans in Our Cup Is Broken (1969), and Japanese Americans in The Moved-Outers (1945). But basically, children's fiction privileged white, middle-class life.
Gail S. Murray
References and further reading
Donelson, Kenneth L., and Alleen Pace Nilsen. 1996. Literature for Today's Young Adults. 5th ed. Reading, MA: Addison-Wesley.
Evans, Walter. 1972. "The All-American Boys: A Study of Boys' Sports Fiction." Journal of Popular Culture 6: 104–121.
Johnson, Deidre. 1993. Edward Stratemeyer and the Stratemeyer Syndicate. Twayne United States Authors Series. New York: Twayne Publishers.
MacCann, Donnarae, and Gloria Woodard, eds. 1989. The Black American in Books for Children: Readings in Racism. 2d ed. Metuchen, NJ: Scarecrow Press.
Murray, Gail Schmunk. 1998. American Children's Literature and the Construction of Childhood. New York: Twayne Publishers and Prentice-Hall International.
Townsend, John Rowe. 1971. A Sense of Story: Essays on Contemporary Writers for Children. Philadelphia: J. B. Lippincott.
Books since 1960
Boys' books since 1960 are linked by one major thematic thread: boys as problem solvers. Societal factors strongly influence the plot, setting, characterization, and style of the books, but the thread that runs through fiction and nonfiction alike is characters—scientists, adventurers, explorers, entrepreneurs, and fact finders—who solve problems. The popularity of fiction has increased since the 1960s as authors began to address real-life situations for boys in their environments, family structures, and peer relations. Whether the plot of the book focuses on home situations, school settings, relations with peers, or struggles with personal obstacles, authors apply current themes and events to shape plots. Boys also eagerly read informational books, nonfiction books, and books involving humor.
Prior to 1960, books written for boys were typically series books. Familiar characters and settings proved popular. The books featured one or two male characters, who were mechanically inclined and showed respect for their elders. The boys did not have superpowers, but they
were very cunning and imaginative. Several series, such as Franklin W. Dixon's The Hardy Boys (1927), Bruce Campbell's The Ken Holt Series (1949), and Andy Adams's The Biff Brewster Series (1960), were based on that popular format. The problem was presented at the beginning of the book, and by the end, the young men had saved the day. The characters were suburban, economically stable young men who faced challenges with courage. Series books for boys continued to be popular in the 1960s, but the more turbulent times and technological advancements made character development and attention to detail essential.
During the 1960s, boys were exposed to many societal changes. On television, during dinner, families watched the civil rights marches and Vietnam War protests. Violence was brought into the home and became personal with the assassinations of John F. Kennedy, Robert Kennedy, Martin Luther King Jr., and Malcolm X. Children were exposed to great scientific achievements with the space program, John Glenn's historic orbit of the Earth, and Neil Armstrong's walk on the moon. The boom of rock and roll promoted rebellion against the morals and ethics of the 1950s. Literature changed with the times. Characters became more sophisticated and the descriptions more vivid to compete with the images and the detail presented in the media and on television.
The civil rights movement and global television opened the door for authors to introduce more multicultural characters and settings into popular books. Diverse characters, such as Peter in Ezra Jack Keats's story The Snowy Day, were introduced. Peter was a young African American boy curious about his surroundings. Keats traced Peter's experiences through several picture books, developing situations involving his community and family and his relationships with peers.
Madeleine L'Engle, Maia Wojciechowska, Scott O'Dell, and Lloyd Alexander wrote award-winning novels that attracted boys to diverse characters, far-off settings, or even another world to rule over monsters, as in Maurice Sendak's Where the Wild Things Are. Again, the connecting thread for their characters remained boys as problem solvers. Boys were fascinated by characters who confronted the same challenges real boys faced in the real world. Therefore, boys wanted the ending of the book to be successful, one in which the lead character saved the day or solved the problem. One fantasy book that crossed gender lines was A Wrinkle in Time by L'Engle, published in 1962 and featuring a young heroine, Meg, and her younger brother, Charles Wallace. They and their friend set off into space to rescue their father from mystic evils. The book contained a great deal of science fiction, and Charles Wallace loved to participate in mathematical activities and challenges. Maia Wojciechowska also explored the relations between a father and son in the book Shadow of a Bull. In 1965 she won the Newbery Medal, the highest honor in American children's literature, for her intense descriptions of characters and the situations they encountered. Scott O'Dell investigated family tradition and the relationship between a father and son in the 1967 novel The Black Pearl. The son's quest for the black pearl against the great manta became a challenge of good versus evil. In Lloyd Alexander's fantasy series The Chronicles of Prydain, the central character, Taran, continually faced obstacles that he overcame with a group of unlikely, mystical supporters. Taran set forth on a noble quest in the
book The High King, in which he struggled to overcome his lowly status as an assistant pig keeper. Even though the books were high fantasy as opposed to reality-based novels, they dealt with relationships among young people, family structures, and the struggle between good and evil.
Publishers and authors apparently believe that boys prefer to read male authors, and as a result, female authors often use their first initials to hide their gender and attract male readers. S. E. Hinton wrote The Outsiders in 1967 about a young man torn between loyalty to his family and his own morality. The book From the Mixed-Up Files of Mrs. Basil E. Frankweiler (1968) by E. L. Konigsburg, a female author, won the Newbery Medal.
Among Americans in the 1970s, awareness of global issues grew, the divorce rate increased, families became more diverse in structure, and the problem of drug abuse moved onto the front burner. The decade saw President Richard Nixon resign and President Jimmy Carter ask people to commit to volunteerism and contribute to the greater world. The Vietnam War continued to play on television during dinner, and the economic gap between classes continued to grow. The Cold War raged on in Eastern Europe, and racial conflict continued in the United States. As authors sought to satisfy a more mature and curious audience, boys' books reflected these changes in society by continuing to increase their sophistication, character and language development, and attention to detail. Book publishing became big business, and publishers expanded their offerings for young male readers. Not only fiction, especially historical fiction, but also books on science and technology were popular with boys.
During the 1970s, several types of books that boys had long enjoyed continued to be popular. Books that focused on partnerships or relationships with ethnically diverse and physically challenged characters, such as Marlene Fanta Shyer's novel Welcome Home, Jellybean (1978) and Nikki Grimes's book Growin' (1977), remained favorites among boys. In addition, series books such as Edward Packard's Choose Your Own Adventure Series (1979), written in various genres, were popular. Other books of the 1970s dealt with family conflict and spoke of real-life occurrences. The struggle to resist peer pressure appeared in sports books and in books that took place in school settings.
Perhaps one of the most controversial books of the 1970s was Robert Cormier's The Chocolate War (1974). The book struck a nerve with parents and school officials, and critics believed it to be too negative. The Chocolate War was aimed directly at the emotions of young people and the significance of peer pressure in schools. It acknowledged the need for children to stand up for themselves but did not simplify the consequences that might follow. The book was also controversial because it called attention to the hierarchical abuse that peers inflicted on each other in school. Other themes that predominated in books from the 1970s were sexuality, disco, and drugs.
The importance of detail within a story increased. Television opened up the visual world; therefore, books had to become just as vivid. Fantasy writers such as William Steig and Laurence Yep developed worlds that the reader could visualize. In Steig's book Abel's Island, the main character Abel was a civilized and sophisticated mouse who was transported to an isolated island by a hurricane. At first, Abel spent his time contemplating his life and how he had arrived at his current predicament.
Finally Abel realized that he must survive and developed the resourcefulness to take care of himself. The character was eventually transported back to his home with a new perspective and positive plans for the future. In 1971, Steig wrote Amos and Boris, which focused on a beached whale named Boris and Amos's attempts to rescue him. Amos enlisted the whole town and used various imaginative strategies to complete the task. Young male characters overcame adversity in Laurence Yep's 1975 Newbery Honor book Dragonwings. A young Chinese boy growing up in the 1900s in America had to learn quickly and take on grownup responsibilities to survive in San Francisco.
The 1980s produced the "Me Generation," but it was also the decade of great advancements in technology, serious natural disasters, man-made disasters, and global triumphs. Hurricane Andrew ravaged the United States, and the Challenger explosion tested faith in technology. Chernobyl and terrorism fostered Americans' stereotypes about foreigners, whereas the end of the Cold War and the fall of the Berlin Wall helped to restore faith in humanity. The needs of a fast-paced society were increasing. Young men were growing up on fast food and drive-thru conveniences. Violence and sex on television were there for nightly viewing, and professional sports became the nationwide obsession. Heroes for young men were consistently saving the day in either sports or the movies. Superman was in the theaters, and Michael Jordan was ruling the boards.
Books for boys became sleeker, fast-paced, and filled with humor. Jerry Spinelli wrote humorous books set at school that dealt with peer relations and
social embarrassment. His first book, Space Station Seventh Grade, shared the experiences of a boy trying to survive in seventh grade. Gary Paulsen brought survivalist books to the bookshelves of households in Tracker (1984), Dogsong (1985), Hatchet (1987), and The Winter Room (1989). Paulsen’s leading male characters confronted difficult situations on the farm or alone in the woods, possessed amazing problem-solving abilities, and always succeeded in the end. In Harris and Me, Paulsen dealt with other prominent 1980s issues: he explored family structures and alcoholism through a patient, independent male character who went to live with members of his extended family. In the 1980s, fantasy writer Brian Jacques published the first novel in his Redwall series. Jacques’s characters were mice, rats, and other animals, like Steig’s characters in the 1970s. The main character was a young male mouse, timid and courageous all at the same time. Another author, Walter Dean Myers, created realistic situations and focused on his readers’ development of historical knowledge. Myers explored drugs, street life, family structures, and peer pressure in his book Scorpions. He wrote about a young male struggling to survive while everything around him was falling apart. The 1990s opened the doors to all genres of literature for boys. Publishers produced hundreds of informational texts, nonfiction, realistic fiction, historical fiction, and humorous folktales and fairy tales geared for boys. Throughout the decade, the economy grew, and children became major consumers. President William Jefferson Clinton reached out to young voters in his campaigns but became embroiled in a public scandal with a young intern at the White House. Violence was in the media every day. Talk
shows, shock radio, sensational television, and violent computer games became common. School shootings shocked Americans and people around the world, and pressure grew for a morals/ethics curriculum in schools. In 1991 the Gulf War began, which put the greatest number of American troops on foreign soil since Vietnam. Once again, families observed a war over dinner as "smart bombs" found their targets. Ethnic cleansing became a common practice in Europe and Africa, while other countries hesitated to react. Congress witnessed major battles over the tobacco industry. Clarence Thomas became the second African American Supreme Court justice, even though he was accused of sexual harassment. Tiger Woods epitomized a champion. The centennial Olympics in Atlanta captivated the world, but the bombings there and in Oklahoma City promoted fear of terrorism. The school shootings at Columbine High School in Colorado were a tragic episode in the decade that affected everyone.
The shortened global attention span in the 1990s added to the production of mass-market paperbacks, which focused more on appealing, familiar characters than on an author's style or theme. Seemingly every television show and movie produced a tie-in book. Boys' books continued to emphasize problem solving and realistic situations. Amid the turbulence, humor was strong as a genre. Louis Sachar brought the contrast of humor and serious themes to his 1999 Newbery Medal winner Holes, in which the main character, Stanley Yelnats, was falsely incarcerated and had to survive his new life at a work camp. Jon Scieszka and Lane Smith developed an award-winning formula with their humor and illustrations. The authors applied contemporary
and humorous twists to classic fairy tales. In their book The Stinky Cheese Man and Other Fairly Stupid Tales, the duo took familiar tales such as Jack and the Beanstalk and added comedy. They also delighted readers with their version of the three little pigs, entitled The True Story of the Three Little Pigs as Told by A. Wolf (1989). Lane Smith enhanced the stories by illustrating the books with contemporary characters and pictures. Scieszka and Smith also collaborated on a series called the Time Warp Trio, in which three young boys with access to a time machine set off on various adventures. In each short book the boys found themselves in a predicament and became courageous enough to overcome it.
R. L. Stine's popular 1990s series Goosebumps reflected the increasing presence of violence in the media, video games, and television. Goosebumps books were suspense novels for the young that took the character into various situations facing villains, monsters, and evil beings. The characters typically lacked confidence or were shy in the beginning of the book, but by the end they found the courage to persevere. Stine also wrote a young adult series called Fear Street (1995). Both series' clear point of view and fantasy situations made them two of the most popular series in the 1990s.
Another popular 1990s series, written by J. K. Rowling, took boys as well as girls to Harry Potter's world of wizardry. The series focused on the years that the main character, Harry Potter, spent at the Hogwarts school of wizardry. Harry's aunt and uncle raised him after his parents were murdered by an evil villain who continued to stalk Harry at school. The main character possessed many of the attributes familiar in boys' books. Harry Potter was an unlikely hero, but through peer relationships and his personal sense of justice, he reluctantly solved all the mysteries and saved the day.
A twelve-year-old boy reads one of the 1990s' most popular fantasy series, the Harry Potter books. (James Marshall/Corbis)
Jennifer Clement
References and further reading
Alexander, Lloyd. 1968. The High King. New York: Bantam Doubleday Dell.
Cormier, Robert. 1974. The Chocolate War. New York: Laureleaf.
Daniel, Clifton, ed. 1987. Chronicle of the 20th Century. New York: Prentice Hall.
Hinton, S. E. 1967. The Outsiders. Boston: G. K. Hall.
Huck, Charlotte. 1997. Children's Literature in the Elementary School. 6th ed. Boston: McGraw-Hill.
Jacques, Brian. 1986. Redwall. New York: Putnam.
Keats, Ezra Jack. 1962. The Snowy Day. New York: Penguin.
L'Engle, Madeleine. 1962. A Wrinkle in Time. New York: Farrar, Straus and Giroux.
Murray, Gail S. 1998. American Children's Literature and the Construction of Childhood. New York: Twayne Publishers.
Myers, Walter Dean. 1988. Scorpions. New York: HarperCollins.
O'Dell, Scott. 1967. The Black Pearl. New York: Bantam Doubleday Dell.
Paulsen, Gary. 1993. Harris and Me. New York: Bantam Doubleday Dell.
Rowling, J. K. 1997. Harry Potter and the Sorcerer's Stone. New York: Scholastic.
Sachar, Louis. 1998. Holes. New York: Farrar, Straus and Giroux.
Scieszka, Jon. 1992. The Stinky Cheese Man and Other Fairly Stupid Tales. New York: Penguin.
———. 1996. The Time Warp Trio Series. New York: Penguin.
Sendak, Maurice. 1963. Where the Wild Things Are. New York: HarperCollins.
Spinelli, Jerry. 1982. Space Station Seventh Grade. Toronto: Little, Brown.
Steig, William. 1971. Amos and Boris. New York: Farrar, Straus and Giroux.
———. 1976. Abel's Island. Toronto: Collins Publishing.
Stine, R. L. 1995. Goosebumps. New York: Apple/Scholastic.
Wojciechowska, Maia. 1964. Shadow of a Bull. New York: Simon and Schuster.
Yep, Laurence. 1975. Dragonwings. New York: HarperCollins.
Boxing
An ancient sport, boxing served as a form of military training. By the eighteenth century, English working-class males engaged in bare-knuckle fights for money prizes (prizefighting), and American slaves were forced to engage in the practice for the amusement of their masters. In the generation after the Civil War, boxing took on added significance as a symbol of manhood. The perceived feminization of culture and the lack of a war in which to demonstrate their bravery and courage brought an increased interest in boxing among men. Upper-class males took up the art of self-defense as a mark of their masculinity and rewrote the rules to bring greater regulation to the sport. Padded gloves came into use in the 1880s, and the Amateur Athletic Union, the governing body for sports in the United States at that time, initiated its first boxing championships in 1888. African American and immigrant youths, in particular, saw boxing as a means to social mobility, and athletic clubs, park districts, newspapers, and religious groups sponsored amateur boxing teams throughout the twentieth century. Youths who proved successful in the local, regional, and national events often persevered as professional boxers in the hope of attaining a measure of wealth and fame.
Youths engaged in boxing despite its reputation as a vile and brutal sport. Throughout the nineteenth and early twentieth centuries, middle-class reformers attempted to ban prizefighting and curb the gambling that accompanied it. Nevertheless, the sport survived in private clubs, in clandestine bouts, and in its supposedly purer amateur form. Among working-class youth, fighting proved a necessity in urban areas where ethnic and racial rivalries fueled hostile encounters. Jack Johnson emerged from the "battles royal" of the neighborhoods to become the first African American heavyweight champion of the world in 1908. By 1913 the Chicago Hebrew Institute provided boxing lessons for its members and soon fielded a team to counter notions of Jewish debility. The Democratic Party, champion of the working class, won legalization for boxing in New York in 1920. In 1923 the Chicago Tribune, a Republican newspaper, challenged the boxing laws in Illinois by organizing an amateur tournament, since 1927 known as the Golden Gloves for the gold pendants awarded to champions in each weight class. Boxing was formally legalized in Illinois in 1927. New York and Chicago soon became national boxing centers and rivals as the Tribune's team of champions faced their counterparts from the New York Daily News in annual tournaments. Many of boxing's greatest champions, including Joe Louis and Sugar Ray Robinson, gained fame as youths in the Golden Gloves competitions.
In 1930 the Chicago archdiocese initiated its Catholic Youth Organization (CYO) under the leadership of Bishop Bernard J. Sheil. Boxing proved the centerpiece of, and the major fund-raiser for, the CYO's comprehensive athletic program. The 1931 CYO tournament
Howard Williams (L) and Edward Smith (R) face off during a boxing match in St. Paul, Minnesota. (Underwood & Underwood/Corbis)
drew 18,000 fans, and champions earned a trip to California and college scholarships. In subsequent years the CYO fielded international boxing teams and managed the professional careers of its boxers. By 1934 CYO fights were broadcast over the radio, and more than 2,200 youths entered the Chicago tournament in 1935. Under the direction of Arch Ward, sports editor of the Chicago Tribune, newspapers collaborated in promoting a national Golden Gloves tournament, which, like the CYO, produced a team for international bouts that filled the sports pages and sold newspapers. By 1938 the Golden Gloves tournament drew 23,000 entries from twenty-six states.
World War II depleted the boxing ranks as fighters joined the war effort. But youths resumed their pugilistic interests in the 1950s. By 1960 the Golden Gloves tournaments had produced nearly twenty world champions in the professional arena. Although greatly diminished from its heyday in the 1930s, boxing continues to be an important activity of youth, who engage in the sport in the annual Golden Gloves tournaments, private gyms, park district programs, police athletic associations, and other amateur athletic organizations. USA Boxing, the national governing body for amateur boxing, including the Golden Gloves and other tournaments, continues to sponsor training camps and competitions for both males and females in twelve weight classifications.
Gerald R. Gems
References and further reading
Gorn, Elliott J. 1986. The Manly Art: Bare-Knuckle Prize Fighting in America. Ithaca: Cornell University Press.
Isenberg, Michael T. 1988. John L. Sullivan and His America. Urbana: University of Illinois Press. Sammons, Jeffrey T. 1990. Beyond the Ring: The Role of Boxing in American Society. Urbana: University of Illinois Press.
Boy Scouts Boy Scouts is a large nonprofit organization furnishing outdoor recreation and instruction intended to build character and instill good citizenship. Robert Baden-Powell drew upon his experience in military scouting and on existing programs for boys to formulate the basic plan for Boy Scouting, published in Britain in 1908. Uniformed and organized in patrols and larger troops, boys learn woodcraft and other outdoor skills, earn promotions in rank and badges for specific skills, perform intermittent individual and group service to others, and train for citizenship. The Boy Scouts of America (BSA), founded in 1910, soon became the largest independent character-building program for boys in the United States. Though originally intended for adolescents, Boy Scouting in the United States served mainly preadolescents and in 1930 launched Cub Scouting for younger boys, which eventually outgrew the regular Boy Scout program. At first slow to develop distinctive activities for older youths, the Boy Scouts of America enjoyed considerable success in the late decades of the twentieth century with a vocationally oriented Explorer program for both boys and girls. All branches of scouting grew rapidly after World War II; all except Exploring declined significantly in the 1970s but recovered membership by the 1990s. The American organization’s political and cultural conservatism led to controversy and legal battles over exclusion of
Boy Scouts dressed in uniforms resembling U.S. military uniforms display their patriotism by holding U.S. flags in Arlington National Cemetery, 1930. (Library of Congress)
girls (from all but Exploring), nontheists, and gays. British general Robert S. S. Baden-Powell (1857–1941) launched Boy Scouting in 1908 with the serial publication of Scouting for Boys. Loosely based on military scouting but free of the tedious close-order drill that characterized explicitly military cadet training, Baden-Powell’s program promised outdoor adventure and companionship in patrols with boy leaders under the supervision of an adult scoutmaster. Boy Scouts pledged adherence to nine (later ten) scout laws mandating virtues such as loyalty and cheerfulness. By passing tests in first aid,
signaling, knot tying, and similar skills, scouts could advance to second-class and then first-class rank. Further badges recognized proficiency in pursuits such as ambulance work and seamanship. To British elites chastened by the recent and inglorious Boer War (1899–1902) and fearful of imperial decline, scouting promised to strengthen the rising generation by instilling character and patriotism. Promoted by the publisher Arthur Pearson and exploiting Baden-Powell’s popularity as a rare hero of the Boer War, scouting mushroomed. Despite some challenges to his authority and defections over the years, Baden-Powell maintained control
of scouting through a central office. Boy Scouting was an organization, not just an activity open to all. William D. Boyce, a newspaper publisher, incorporated the Boy Scouts of America (BSA) on February 8, 1910, but did little more to develop scouting. Instead, workers for the Young Men’s Christian Association (YMCA), which already ran extensive recreational and character-building programs for boys, supervised the transplanting of scouting to the United States and persuaded Boyce to cede control to an independent organizing committee. This group hired James E. West (1876–1948) as executive secretary. West, whose title soon changed to chief scout executive, remained in power from 1911 to 1943 and dominated the early development of the BSA. Under West, the BSA retained all major features of British scouting but expanded and formalized rules, further standardizing the program and strengthening control by national headquarters. Scout insignia added an American eagle; the scout’s promise became an oath; and three new laws (bringing the total to twelve) declared that American Scouts were brave, clean, and reverent. Merit badges (open only to first-class scouts) proliferated, and a hierarchy of ranks extended beyond first class: a Life Scout had won five specified badges, a Star Scout ten, and an Eagle Scout twenty-one. By 1914 the BSA required annual registration of all scouts and scoutmasters with national headquarters. Unlike the British movement, which relied on retired military officers and other unpaid gentlemen, the BSA supervised local councils using paid scout executives who received brief, in-house training on scouting procedures and owed their careers to headquarters. Similarly, within
troops and at summer camps, the BSA favored close adult supervision and downplayed the boyish initiative implicit in the patrol system. A congressional charter, secured in 1916, strengthened the Boy Scouts of America’s identification with national purposes, secured its exclusive right to Boy Scout terminology, and helped the BSA eliminate unauthorized, often overtly militaristic Boy Scout groups. The BSA, whose uniform closely resembled the U.S. Army’s, received specific exemption from laws against private citizens wearing the uniform. Highly publicized service during World War I, especially in liberty loan drives, displayed Boy Scout patriotism; and the BSA entered the 1920s as an agency of conservative Americanism. The BSA relied on local institutions to recruit scoutmasters and scouts, pay troop expenses, and provide meeting places. Protestant churches of the middle class sponsored about half of all troops from 1911 into the 1920s, and churches still sponsored nearly half the scout units in the 1980s and 1990s. By then Mormons, to whom scouting’s conservative Americanism proved congenial, were hugely overrepresented compared to their percentage of the U.S. population; United Methodists and Presbyterians increased their representation substantially but still lagged behind Mormons. Roman Catholics initially thought Boy Scouting too Protestant but responded to BSA overtures until Catholic churches became leading sponsors of scouting units, though the Catholic presence among Boy Scouts still did not match their percentage of the U.S. population. Public schools and parent-teacher associations assumed a larger role with the advent of Cub Scouting but still sponsored fewer than
one-quarter of all scouting units by the 1980s. Other civic and community organizations, notably service clubs and veterans’ groups, expanded their involvement so that by the 1980s they backed more than one-quarter of all scouting units. In keeping with troop sponsorship, early scoutmasters were almost exclusively white-collar workers, young businessmen, and professionals; their troops recruited mainly sons of the middle class and of skilled workers. Although recruitment broadened over time, the BSA has retained a disproportionately middle-class constituency. Despite the power of headquarters, local racism held scouting hostage. Southern whites fiercely resisted black boys wearing scout uniforms. By 1929, 3 percent of scout troops were black, but mostly in the midwestern and mid-Atlantic states (Wagner 1979, 253–254). As late as the 1940s, some southern scout officials still fought having African American boys in uniform. In the 1960s southern councils began to integrate and admit black scouts to council camps, but integration of troops began only later as schools finally started to desegregate. Although outreach efforts to target poor African American communities suffered from high costs and intermittent funding, by the late 1960s nonwhite boys were almost as likely as whites to be Boy Scouts. Initially setting age limits at twelve to eighteen, the BSA promoted scouting as a cure for adolescence, distracting boys from sexual urges and antisocial behavior by preoccupying them with instruction in useful skills and outdoor recreation. It promised to control boys at an age when most quit Sunday school and some lost interest in regular schooling. Simultaneously, scouting’s outdoor program and
Since the 1960s the Boy Scouts have attracted boys of all races. (Skjold Photographs)
masculine leadership assuaged contemporary fears that middle-class boys were growing soft, raised by mothers and ignored by busy fathers, schooled by women teachers, weakened by sedentary urban living, and enfeebled by popular entertainment and surrender to sexual urges. As depicted by biographer Tim Jeal (1990), Baden-Powell, the movement’s founder, may have been a repressed homosexual; he married only late and reluctantly at age fifty-five. He and other leaders in character building for boys, many of whom also married late or not at all, favored a style of masculinity that was distinctly boyish and preadolescent, strenuously suppressing interest in girls or sexuality of any
sort. Thus scouting’s answer to adolescence was in large measure to avoid it by prolonging preadolescence. To compound the problem, Boy Scout officials strongly discouraged troops from playing team sports, lest these displace scouting. In so doing, they swam against strong currents in American culture, for success in baseball, football, or basketball, new sports whose popularity mushroomed in the late nineteenth and early twentieth centuries, had by the 1910s become the paramount badge of masculinity for urban boys. Scouting’s roam-the-woods boyhood reflected nostalgia for an earlier era, the world of Tom Sawyer. From the start, accordingly, scouting had trouble keeping underage boys out and older ones in. The program proved so appealing to ten- and eleven-year-olds that headquarters struggled to enforce the age twelve minimum. And down through the years the movement has offered boys a refuge from the combative, winner-take-all competitiveness of team sports. But by high school age, fear that the uniform looked juvenile (as if one were playing soldier), impatience with younger “kids” who flooded into troops, the vogue of school-sponsored sports, and growing interest in girls all combined against continued involvement in scouting. A related problem was how well the program could sustain boys’ interest. Camping, scout leaders assumed, furnished the strongest attraction; but running camps, especially safe and sanitary ones, taxed the resources of early scoutmasters. Increasingly, the BSA organized mass camps with instructional programs centered on preparation for meeting promotion requirements and winning badges, to which individual troops could bring their scouts. Yet large camps risked undercutting troop initiative, and in 1940 fewer
than one-third of all Boy Scouts camped for a week or more, although two-fifths camped for one to six days. By 1981, 48 percent of Boy Scouts camped for six or more days, and by 1998, 57 percent of scouts did so (Macleod 1983, 298; Boy Scouts of America 1998). Advancement through the ranks furnished another test of engagement. This proceeded slowly at first: in 1920 just one scout in 600 earned his Eagle Scout badge; about one in eight had reached first-class rank. As the BSA’s instructional program solidified, these numbers rose. In 1981, almost half of all Boy Scouts won some advancement in rank, and more than 2 percent earned Eagle Scout badges; in 1998, more than 4 percent reached Eagle Scout rank (Macleod 1983, 248–249; Boy Scouts of America 1998). Commitment as measured by retention rates also rose over time. From 1914 (when national registration began) through 1922, fewer than half the boys enrolled at the start of a given year remained members the following year. By 1980 the proportion for scouts and Explorers approached two-thirds, quite high for a program appealing to a narrow age range (Macleod 1983, 154, 297). That restricted appeal was evident early in the BSA’s history. Despite James West’s ambition to hold boys through adolescence, the Boy Scouts’ median age from the 1910s through 1934 hovered around 13.9, and the majority quit before their fifteenth birthday. This problem would persist; in 1967 the median age of all Boy Scouts and Explorers over the age of twelve was just 13.7 (Macleod 1983, 282, 292). Bowing to the inevitable, the BSA began Cubbing (later Cub Scouting) in 1930 for boys aged nine through eleven. Anxious not to duplicate regular scouting, the organizers designed a more home-centered program with roles for women
Boy Scouts as leaders. As enrollment surged through the 1940s, the BSA in 1949 lowered the age for Cub Scouts to eight through ten and for Boy Scouts to eleven and older and opened Explorer Scouting, a program for older boys begun in 1935, to boys at age fourteen. Cub Scouting slightly surpassed the two senior branches’ combined membership in 1956, ran neck and neck through the 1960s, and then edged ahead again in the late 1970s. In 1982 the BSA added Tiger Cubs for seven-year-olds; by 1990 it served six-year-olds and boys could become Cub Scouts at age seven. Explorer Scouting at first differed little from regular scouting and grew slowly, but in 1958 the program shifted to center on career interests. In 1968 Explorer Scouting admitted girls as members; substantial growth ensued, and by 1981 girls comprised more than 40 percent of the 500,000 Explorers. By then the majority of Explorer posts were sponsored by business firms, hospitals, police and fire departments, and various civic organizations, enabling teenagers to explore possible vocations firsthand. By 1990 Explorer membership exceeded 1 million. In 1998 the BSA affiliated Exploring with Learning for Life, its new school-based subsidiary, and continued more traditional scouting activities for older youths in its Venturing program. By 1999 the preponderance of younger boys was clear. Cub Scouting ended the year reporting 2,181,013 boy members; Boy Scouting had 1,028,353 boys; Venturing had 202,486 boys and girls; and Exploring reported 330,489 boys and girls (Baker 2000). The BSA’s national leadership reacted to the youth culture and antiwar protests of the late 1960s with shock and outrage and took a conservative position in the culture wars that followed. In a drive to
expand membership (Boypower ’76) and demonstrate the movement’s social relevance, the BSA revised Boy Scout requirements in 1972 to address urban concerns. Drug abuse avoidance, first aid for rat bites, and urban hiking were in; tracking, semaphore signaling, and canoeing were out. Urban outreach was expensive, however, and strained the loyalty of Boy Scouting’s traditional constituencies. Overall BSA membership fell by one-third during the 1970s, reduced in part by declining birthrates but still more by faltering recruitment. Only Explorer Scouting grew; Cub Scouting declined, and Boy Scout enrollment dropped sharply. In 1979 a revised handbook turned the Boy Scout program firmly back toward the woods, and national headquarters moved from northern New Jersey to a more financially and politically congenial setting in Irving, Texas. In the 1980s total membership stabilized and then began to rebound, until by the late 1990s it stood approximately where it had in 1970. Meanwhile, beginning mostly in the 1980s, widely publicized legal battles advertised and may have reinforced the cultural conservatism of the BSA’s national leadership. Although girls could be Explorers and in 1988 the BSA opened all leadership positions to women, the organization held firm against admitting girls to branches of scouting open only to boys under age fourteen. Despite some losses in lower courts, in higher state courts the BSA generally prevailed against legal challenges to this and other exclusions. Typically, courts ruled that as a private nonprofit organization, the Boy Scouts of America was exempt from state civil rights laws that governed businesses and other public facilities. The BSA’s Declaration of Religious Principles holds that “no member can grow into the best
kind of citizen without recognizing an obligation to God” (U.S. Scouting Service Project 2000). Adults unwilling to sign could not lead Boy Scout troops, and children refusing to affirm belief in God were denied scout membership. In a ruling characteristic of the ultimate disposition of such cases, the California Supreme Court in April 1998 unanimously upheld the barring of twin brothers from a Cub Scout den in 1990 for refusing to declare belief in God. (Readmitted by order of a lower court, the boys had by 1998 qualified to become Eagle Scouts.) The most heated debates, also pitting liberal against conservative church leaders, erupted over exclusion of gays, especially from leadership positions. In this regard as well the BSA’s legal position prevailed in state courts until August 1998, when the New Jersey Supreme Court ruled that state antidiscrimination law governing public accommodation applied to Boy Scouting. The BSA appealed, citing rights of free association and expression. In a 5–4 vote, the U.S. Supreme Court ruled on June 28, 2000, in Boy Scouts of America v. Dale, that compelling the Boy Scouts to accept gay troop leaders violated the organization’s First Amendment right of expressive association. The ruling enables the BSA to bar gays from leadership positions and may support the BSA policy of denying membership to avowedly homosexual boys. David I. Macleod See also Clubs References and further reading Baker, Karen (external relations, Boy Scouts of America). 2000. Telephone conversation, May 30. Boy Scouts of America. 1998. “Annual Report,” http://www.scouting.org/excomm/98annual/yir1998.html (accessed May 28, 2000).
Dean, John I. 1992. “Scouting in America, 1910–1990.” Ed.D. diss., University of South Carolina. Jeal, Tim. 1990. The Boy-Man: The Life of Lord Baden-Powell. New York: William Morrow. Macleod, David I. 1983. Building Character in the American Boy: The Boy Scouts, YMCA, and Their Forerunners, 1870–1920. Madison: University of Wisconsin Press. Peterson, Robert W. 1985. The Boy Scouts: An American Adventure. New York: American Heritage. U.S. Scouting Service Project. 2000. “BSA Declaration of Religious Principle,” http://www.usscouts.org/aboutbsa/rp.html (accessed May 29, 2000). Wagner, Carolyn Ditte. 1979. “The Boy Scouts of America: A Model and a Mirror of American Society.” Ph.D. diss., Johns Hopkins University.
Boys’ Choruses In a country saturated with sports and electronic toys, each generation produces youngsters whose desire to sing overcomes the inherent obstacles, even the likely ridicule from their peers. Singing in a chorus, a boy learns a gentle form of self-discipline, focus, concentration, awareness, and a sense of his place in history. He discovers that many great men of the past were once choirboys. He further experiences mastering challenges, meeting impossible deadlines, and summoning the courage to learn and perform solos. He shares in the common bond of cultural community and teamwork with his fellow choirboys, forging friendships for life. Through the trial of voice change, he learns about letting go of childhood and accepting responsibility for himself as a youth and an adult. Later in life, he will look back upon his choirboy days as a crucial stage in launching himself into success as an adult. And he will always love music.
The Boys’ Choir of Harlem accompanies singer James Taylor, Boston, 2000. (Reuters NewMedia Inc./Corbis)
Groups of singing boys are a European concept, but records of individual singing boys extend back into antiquity and may be found in the history of Egypt. A choir of boys was honored in the pre-Christian era with a monument that still stands below the Acropolis in Athens. In the early Christian church, choir schools for boys were established and served not only to provide music for the Mass and other liturgies but also to supply the church with future priests. In the late fifteenth century, court chapels vied with one another for the best boy singers. Flemish boys were exported
to sing in Vienna. Choirboys in England admired for the beauty of their voices suffered kidnapping by rival choirmasters. In what may be mere legend, Orlando di Lasso (1532–1594), the famous composer, was said to have been kidnapped three times because of the beauty of his voice. Boyhood in earliest America did not include much organized singing. In Protestant churches boys did engage in the singing of hymns and psalms with the entire congregation. A few choirs of men and boys carried on the Anglican cathedral and collegiate traditions that were brought to America by immigrants from
Great Britain. Almost all of these choirs were located along the East Coast of the area that became the United States, clustered around major universities and cathedrals, until as late as the early twentieth century. Many notable Episcopal choirs of men and boys were held in high esteem throughout New England at least through the 1960s, when ecclesiastical traditions began to be undermined. In the Roman Catholic tradition, choirs of men and boys could be found throughout the eighteenth and nineteenth centuries in urban areas in which that faith flourished. Most prominent of the Roman Catholic choirs of boys and men were the Paulist Choristers, founded in Chicago in 1904 and in New York in the 1920s. These choirs were directed by Father William J. Finn, who in his later retirement was also a prolific author on the subject of choir development and voice building. When Father Finn moved to New York in 1928, Father Eugene O’Malley succeeded him in Chicago until his own retirement in 1967. In Pasadena, California, John Henry Lyons started a boys’ choir in 1925 in his capacity as music administrator for the Pasadena public school district. After Lyons’s retirement in 1953, the choir was reinvigorated by John R. Barron in 1970 as a community boys’ choir. The Pasadena Boys Choir remains one of the few treble-only community boys’ choirs in the United States. Community boys’ choirs as such were unknown in the United States until the early 1930s. Their growth and development were influenced primarily by those famous touring boys from Vienna, the Wiener Sängerknaben (Vienna Choir Boys). Father Josef Schnitt organized the touring program to provide care and lodging
for the many war orphans roaming the streets of the former empire capital. In the wake of Vienna Choir Boys concert tours, community boys’ choirs sprouted all across the United States. Most prominent among the early boys’ choirs was the Apollo Boys’ Choir founded by Coleman Cooper in 1935, which was a unique American expression of the Viennese concept. The Apollo Boys’ Choir toured the country from coast to coast in limousines and lived and worked together in princely splendor, even during the Great Depression. But most American community boys’ choirs struggled to survive. After initial enthusiasm and local success, competition with sports activities overwhelmed many. One such early community boys’ choir exists today under a different name. In 1937, Herbert Huffman founded the Columbus Boychoir under the auspices of the local Kiwanis Club. By 1950, the choir had earned a national reputation for excellence, and the decision was made to move the choir to its present choir school home in Princeton, New Jersey, called “Albemarle.” Arguably the best U.S. all-boy choir in the year 2001, the American Boychoir was so renamed in 1980. Not to be left out, in 1984, Columbus, Ohio, reorganized its own Columbus Boychoir, serving a largely African American community. During the decade of the 1930s, Father Edward Flanagan’s Boys Town Choir was organized, but not until 1941 did Father Flanagan charge Father Francis P. Schmitt with the task of providing the sort of training that would launch the choir into national tours. In the late 1970s, however, the music stopped. Boys Town Choir was disbanded as archaic and no longer needed. In 1934, Bob Mitchell was organist and choirmaster at
California Boys’ Choir (Courtesy of Douglas Neslund)
St. Brendan’s Church in Hollywood at a time when directors were making films that often contained a religious or family theme in which a boys’ choir could be utilized. Bob Mitchell’s Singing Boys made more than 100 such movies, including Going My Way with Bing Crosby. A rival choir was located in nearby Long Beach, California, the Choristers of St. Luke’s Episcopal Church. Under the direction of William Ripley Dorr (1891–1968), the St. Luke’s Choristers also appeared in many movies, including Christmas Holiday (1944) with Deanna Durbin and Let’s Sing Again (1936) with boy singer Bobby Breen. Today, however, St. Luke’s choir includes men and women. After he moved to the American desert because of health problems and became homesick for the traditional music of Europe, British citizen and former Metropolitan Opera singer Eduardo Caso
founded the Tucson Arizona Boys Chorus in 1939, adapting its literature to his new homeland. From its beginning to the present, rope tricks and coyote howls punctuate every concert given by this choir, which is currently under the direction of Julian Ackerley. In Denton, Texas, former soloist of the Apollo Boys’ Choir George Bragg desired to establish a signature boys’ choir in the American Southwest and founded the Denton Civic Boys Choir in 1946. Through hard work and steady application of high ideals and principles, Bragg was able to move the choir to Fort Worth in 1957, where it took the name the Texas Boys’ Choir. Through the decades of the 1960s and 1970s, the Texas Boys’ Choir represented the finest boys’ choir singing in the United States, appearing on national television, making popular recordings, and touring throughout the
nation. Perhaps the high point was the awarding of two Grammy Awards to Bragg for recordings made in Europe in the mid-1960s. Bragg has published a series of books on the subject of boys’ choirs. An outstanding Texas Boys’ Choir soloist, Donald Collup, may still be heard and appreciated via the Internet. In the twenty-first century, however, the current administration of the Texas Boys’ Choir has chosen to transform the boys’ choir into a mixed choir with both boys and girls. Another Grammy-winning U.S. boys’ choir is the Atlanta Boy Choir, founded in 1957 and still directed by Fletcher Wolfe, which has traveled and performed extensively in international venues. The Atlanta Boy Choir functions as the resident boys’ choir of the Atlanta Symphony Orchestra. In 1960, Harvey K. Smith assumed the directorship of the Phoenix Boys Choir, which had been in existence since 1948. Through Smith’s leadership, the Phoenix Boys Choir has succeeded where many have failed, having obtained its own choir school. The Phoenix group is directed today by Viennese-born Georg Stangelberger. In 1968, the California Boys’ Choir (CBC) was formed as part of the author’s master’s degree program at the University of Southern California in Los Angeles but quickly grew in esteem to become the unofficial resident boys’ choir of the Music Center. The CBC, at its apex, was the boys’ choir chosen by Columbia Artists Management in New York to succeed the Texas Boys’ Choir as their touring boys’ choir, following the retirement of Bragg in 1974. Although the CBC ceased operations in 1984, its success may be measured by its many former choirboys who, as adults, perform as professional singers and musicians.
Three relatively new boys’ choirs that have prospered and found nurturing roots in their respective communities are the Madison Boychoir, founded by Carrel Pray in 1971 in Wisconsin and still flourishing under Dan Krunnfusz; Florida’s Singing Sons Boychoir in Fort Lauderdale, founded in 1975 by Jeffri Bantz and currently operating under the direction of David R. White; and the Fort Bend Boys Choir of Texas, founded in Stafford in 1982 by William R. Adams, who continues as the choir’s artistic director. New community boys’ choirs continue to be formed, despite societal changes and American boys’ preference for sports over singing. One such new choir is the Pacific Boychoir Academy, founded in 1998 and located in Oakland, California. Kevin P. Fox is currently directing a choir of fifty boys, aided by a staff of three associates. The Pacific Boychoir Academy has competition in the San Francisco Bay Area from the Golden Gate Boys Choir, directed by Steven Meyer; the Ragazzi, directed by Joyce Keil; and the San Francisco Boys Chorus, directed by Ian Robertson. A number of solo boy singers have developed into “stars.” Bobby Breen was featured in movies of the 1930s, including Hawaii Calls (1938). Years later, Wayne Newton started his singing career as a boy soloist. Donny Osmond and his brothers enjoyed their own television show, and the Jackson 5 made Michael a star for Motown. In 1983, Bejun Mehta recorded an entire compact disc of classical works that set the bar exceedingly high for boy sopranos everywhere, and Billy Gilman is making his mark as a soloist in the realm of country music. Douglas Neslund
References and further reading Bragg, George W. 1999. The Big Book. Privately published. Cf. http://216.147.109.215/bragg.html (accessed March 11, 2001). Finn, William J. 1939. The Art of the Choral Conductor. Evanston, IL: Summy-Birchard Publishing. Ford, Larry. 2001. “Boychoir—Past, Present and Future,” http://www.boychoirs.org (accessed March 11, 2001). ———. 2001. “Donald Collup Singing Alleluja by Wolfgang Amadeus Mozart,” http://www.boychoirs.org/collup.html (accessed March 11, 2001). Ford, Larry, Gene Bitner, and Lindsay Emery. 2001. “The World of Treble Voices,” http://216.147.109.215/contents.html (accessed March 11, 2001). Lundstrom, Linden J. 1957. The Choir School. Minneapolis, MN: Augsburg Publishing House. Neslund, Douglas. 2001. “Voices of Angels” bookmarks, http://groups.yahoo.com/group/Voices_of_Angels/links (accessed March 11, 2001). Sadie, Stanley, ed. 1980. The New Grove Dictionary of Music and Musicians. London: Macmillan.
Boys Town First established as an orphanage for boys by Father Edward Flanagan in 1917, made famous by the movie of the same name in 1938, Boys Town today is a large residential treatment center for troubled boys and girls near Omaha, Nebraska, with many branches in other states. By 1994 Boys Town had assets of $514 million, 400 acres of buildings and grounds, and 900 acres of cultivated farmland (Weisman 1994, 53). In 2000 it changed its name to Girls and Boys Town. In Omaha in 1915, Flanagan was a young Roman Catholic priest who operated a “working men’s hotel” for homeless men. His work with the men convinced him that most had been ill treated
as children. They became delinquents as teens and criminals as adults. Flanagan decided to concentrate on helping boys avoid becoming the impoverished, unemployed men who spent the night at his welfare “hotel.” With the help of a friend who was a probation officer in Omaha, Flanagan took charge of five boys with whom he lived in a rented Omaha house. Probation officers, the police, and ordinary citizens soon began sending more boys to him, so by their first Christmas, there were two dozen boys in the home. Later they moved to a larger house in Omaha, where Flanagan and others provided the boys an education and introduced them to playing sports and band instruments. Flanagan’s policy of admitting boys of all races and religions as well as those whom the courts labeled juvenile delinquents disturbed his Omaha neighbors. They criticized the boys and Flanagan himself to the point that the young priest decided to find a farm outside town to which he could move his little community. In 1921 he bought 160 acres 10 miles west of Omaha that eventually became Boys Town. Of course, Flanagan had not only detractors but supporters as well. Volunteer groups donated their time and money to his program. When the local schools objected to Boys Town boys attending classes, Flanagan set up his own school at the institution with nuns loaned him by the local Catholic bishop. In the early days on the farm, boys lived in makeshift wooden barracks, farmed the fields, and tended cows and chickens. In 1922, with the assistance of several wealthy Omaha supporters, Boys Town began construction of the first five-story permanent building to house the school, dormitory, gym, and workshop.
Father Flanagan, the founder of Boys Town, talking to an orphan boy (Bettmann/Corbis)
Flanagan’s credo was “There’s no such thing as a bad boy—only bad environment, bad thinking, and bad training” (Jendryka 1994, 5). There were no walls around Boys Town, and boys were free to choose to come and live there. Although Flanagan called it a home, the boys lived in large dormitories. All received religious training, Catholic or Protestant. Jewish boys could attend synagogue in town. And Flanagan kept the boys very busy—with schooling, vocational training, sports, music, and farmwork. When they left Boys Town, many returned to their families or entered foster care
homes. In 1936 Boys Town was incorporated as a separate city with its own police force, fire department, and post office. Flanagan was very successful at publicizing Boys Town. He invited celebrities who visited Omaha to come to his home and meet the boys. Over time his celebrity visitors included Franklin D. Roosevelt, Bing Crosby, Bob Hope, and Babe Ruth. He conducted weekly radio broadcasts in Omaha and traveled with his boys throughout Nebraska and neighboring states. The boys marched in parades and performed band concerts at fairs. In 1938 the movie Boys Town
Boys Town starred Mickey Rooney as an orphan boy and Spencer Tracy as Father Flanagan. Tracey won an academy award for his performance in the film and donated the award to Boys Town. Thanks to the movie, Boys Town became a household word, and Flanagan took advantage of the favorable publicity to raise money for the school. In the 1940s, with the help of Ted Miller, a fund-raiser attracted to the school by the movie, Boys Town launched a nationwide mailing system at Christmas and Easter that solicited contributions with a picture of one boy carrying another and the caption: “He ain’t heavy, Father. . . . He’s M’ Brother.” In 1948, Father Flanagan died of a heart attack in Berlin while serving as a U.S. government emissary to help European countries deal with youth problems in the aftermath of World War II. By this time the school was sending out millions of letters and collecting extraordinary sums of money. In 1972 Sun Newspapers of Omaha won a Pulitzer Prize for articles revealing that Boys Town had assets of more than $200 million yet continued to raise $26 million a year in donations. In addition, the school admitted fewer boys than it had in the 1950s and continued to accommodate them in a large custodial setting at a time when child welfare experts recommended foster care or small group home care for troubled youths. After the Sun Newspapers exposé, Boys Town hired a new administrator, Father Robert P. Hupp, who transformed the school into what it remains today, a family-style, residential treatment center for troubled boys and some girls that range in age from eight to eighteen years. (The school began to admit a small number of girls in 1979.) The courts commit about 70 percent, parents send others, and some boys come on their own. The dormitories
were abandoned and replaced with single-family homes in which trained married couples acted as parental figures to eight to ten boys or girls. The homes are large and comfortable. There are two boys to a room. They eat breakfast and dinner together; one boy helps with the cooking and another with the dishes. After dinner, the family meets to discuss what happened during the day and to plan weekend excursions to parks and the movies. Each family group also takes a one-week summer vacation together. By the 1990s, the Boys Town program was very structured. The goal is to teach boys and girls, who come largely from broken homes where many have been emotionally, physically, and sexually abused, social skills to help them get along with others and vocational skills to help them get and keep a job. The most basic social skills taught at the school are learning to follow instructions, greet someone, get the teacher’s attention, make a request, disagree inoffensively, and accept criticism and the answer “no.” Family teachers and schoolteachers emphasize these skills by praising appropriate behavior and providing constructive criticism. They do not use corporal punishment or force boys to do sit-ups or run laps for behaving incorrectly. To reinforce good behavior, the boys receive “empowerment cards” when they enter Boys Town. They receive points for good behavior and lose points for inappropriate actions (such as fighting or failing to clean their rooms). To keep basic privileges, a boy or girl must earn 10,000 points a day. The aim is to teach boys and girls which behaviors are appropriate by showing them which ones pay off in points. At the Boys Town school, the average class size is between eight and twelve students. Most boys are one or two years
behind academically when they arrive, so the teachers have to work intensively with the boys to teach them basic reading and math skills. Boys must also take vocational courses in keyboarding, computers, career development, and home management. Many take additional vocational courses as well, and some attend the local public high school to take courses to prepare them for college. Religion remains an important part of the Boys Town program. School officials argue that believing in a higher power serves to strengthen boys. Religion (of the boy’s choice) is a required part of the curriculum, and family teachers also encourage boys to meditate and read the Bible or other religious books. The cost for one year of education at Boys Town in 1994 was $47,000, about average for residential treatment programs across the country. Boys usually spend eighteen to twenty-two months at the school and then leave to return to their families or to manage on their own in jobs or in college. According to Boys Town, it has about an 80 percent success rate. That means that 80 percent of those who enter the school complete the program, graduate from high school, get and keep jobs, and avoid arrest and dependency on welfare. By the 1980s Boys Town had expanded its program beyond the Nebraska campus. In 1995 there were seventeen Boys Towns in several states. In 2000, when girls in the school constituted nearly half the population, administrators asked the 900 young residents to vote on a name change. The youngsters overwhelmingly approved the new name: Girls and Boys Town. Priscilla Ferguson Clement See also Orphanages; Poverty
References and further reading Bosc, Michael. 1987. “Boys Town: New Ways but Respect for the Past.” U.S. News and World Report, March 20: 38–39. Girls and Boys Town. 2000. “About Boys Town, History,” http://www.boystown.org/home.htm (accessed September 5, 2000). Hupp, Father Robert P. 1985. The New Boys Town. New York: Newcomen Society of the United States. Jendryka, Brian. 1994. “Flanagan’s Island: Boys Town 1994.” Current (November): 4–10. Weisman, Mary Lou. 1994. “When Parents Are Not in the Best Interests of the Child.” Atlantic Monthly 274, no. 1 (July): 42–63. Williams, Paul N. 1972. “Boys Town, America’s Wealthiest City?” Sun Newspapers of Omaha. Special report, March 30.
Braces See Orthodontics
Bullying In the last decades of the twentieth century, a plethora of research on bullying among children appeared. What began as a movement of inquiry among Europeans in the 1970s and grew to include researchers in Canada, Australia, and Japan took hold of the American imagination in the late 1990s. By the twenty-first century, the United States was awash in studies about bullying, aggression, peer-inflicted trauma, contributing parental styles, and, more recently, explosions of gun violence. This new framing of the problem of bullying contrasts sharply with earlier discussions of the topic, which saw bullying as an unfortunate but inevitable stage of boyhood. For example, in a popular educational psychology textbook from the 1930s, bullying was viewed as a normative social experience for boys:
Boys bully one another in a kindergarten class. (Shirley Zeiberg)
Ordinary teasing, especially on the part of boys, is a common expression of aggressiveness. When it is a source of difficulty in school situations, the victims of the teasing can usually be taught to laugh their way to freedom, or effectively to ignore the teaser. It should be explained to them that teasing is done for its effect and lacking the effect, it will cease (Wheeler and Perkins 1932, 433). There is no way to discuss the history of research on bullying without acknowledging the contributions of Dan Olweus, a professor of psychology at the University of Bergen, Norway, who has spent more than twenty-five years researching
the topic. Although his research has largely focused on Scandinavia, the survey that he developed has been used in other settings, including the United States. Before importing wholesale the findings from Olweus’s research to children and schools in the United States, two major differences need to be articulated. First, the Scandinavian countries are largely homogeneous, without much diversity in race, ethnicity, language, or religion, factors that often serve as triggers for bullying. Second, those countries, as is the case with most countries in Europe and those that they colonized, have a standardized, nationalized curriculum, allowing for comparisons to be made within and across each country,
down to the level of particular classrooms. Olweus’s research and suggestions for intervention, then, are predicated upon national regulation of the whole school environment, including classroom content, pedagogy, informal and formal activities, guidance and counseling services, and activities directed at parents. Nonetheless, owing to Olweus’s primacy in undertaking and popularizing research on bullying, attention must be paid to his contributions to the field and to the definition that he developed. To Olweus, bullying occurs when someone is “exposed, repeatedly and over time, to negative actions on the part of one or more other students” (Olweus 1993, 9). According to Olweus, “negative actions” are intentional inflictions of injury or discomfort, or attempts at such, including threatening, taunting, teasing, name-calling, hitting, pushing, kicking, pinching, and restraining. He also acknowledges that it is possible to carry out negative actions “without the use of words or physical contact, such as by making faces or dirty gestures, intentionally excluding someone from a group, or refusing to comply with another person’s wishes” (9). Bullying implies an imbalance of strength and can include a single incident of a serious nature, yet not all acts of meanness, pestering, or picking on someone constitute bullying. Olweus emphasizes that the salient feature of his definition of bullying is that the negative actions are repeated and carried out over time. Beyond Olweus’s work, British scholars have contributed to the development of bullying research. Several (Whitney and Smith 1993; Ahmad and Smith 1994) have offered a gendered look at the problem, no doubt because of the long history that the British have with single-sex boarding schools and the decades, even generations, of harassment and bullying
that have gone on in those environments (Keise 1992; Tattum and Lane 1988; Tattum 1993; Smith 1991). In the United States, most research on bullying has taken a psychological/psychopathological point of view. Bullying has been pathologized and made person-centered, which places the focus on victims or bullies. For example, the pathologizing approach toward victims asks what a victim can do to deflect the bullying (smile, talk in a loud voice, avoid contact, find new friends, etc.) and, in some cases, blames the victim for having been victimized. When the focus is on the bullies, this approach proposes treatment programs to rid them of their bullying tendencies and offers guideposts by which to judge potential bullies (Stein 1999; Swearer and Doll, in press). The problem of bullying is individualized and pathologized, perhaps because psychologists have dominated the field of bullying research (Stein, in press). Multiple studies in the United States have demolished myths that have long dominated popular thinking about the problem of bullying. These myths include the ideas that bullying only exists in certain environments, such as those marked by rural boredom or urban overcrowding; that bullies are isolated, lonely, marginal children; that as children age, they outgrow bullying behaviors; that certain personalities or characteristics make children more susceptible to bullying (for example, being overweight, wearing glasses, having facial mannerisms, or not participating in athletics); and that only the bully and the victim and not the bystanders are affected by the problem. New research findings have put these myths to rest. Bullying presents itself in all sorts of environments: rural, suburban, and urban, regardless of socioeconomic
and racial or ethnic backgrounds. Bullies are popular and command attention and respect from their peers and even from some adults. Bullying is not simply a behavior that exists while children are very young; it can continue in middle school and beyond. Boys and girls have different repertoires for bullying others. The bullying dynamic has an impact on bystanders as well as on the victim or victims, and bullies and victims change and exchange places and roles. Their relationship is not fixed and static. In other words, today’s bully can be tomorrow’s victim. Studies indicate that bullying is present in all sorts of communities, across race, ethnicity, and socioeconomic status (SES). However, most studies have been conducted in communities with lower SES and in communities that are predominately segregated by race. A study of children in grades seven to twelve (Oliver, Hazler, and Hoover 1994) found that even in small-town, allegedly safe environments, 81 percent of the males and 72 percent of the females reported being bullied by their peers, with ridicule and verbal and social harassment the most common forms. More recent studies in rural communities, including one of mostly Caucasian children from low-SES households in the Midwest (Espelage and Holt, in press) and another in rural South Carolina with a largely African American population, found that 14.5 percent of middle school children in the Midwest study met the criteria for bullying frequently, and 25 percent of students in grades four through six in the southern study admitted to bullying another student with some regularity in the previous three months (Limber et al. 1997). The myth of the marginal, loner, unpopular bully seems to have disappeared
with more research. In a study of 452 boys in grades four to six (54 percent were European American, 40 percent African American, and 6 percent Latino and were either from the Chicago area, including inner-city to suburban settings, or from North Carolina, including a rural county and small city), bullies were found to be popular students, whether ranked by their peers or by their teachers. This study suggests that highly aggressive boys can be among the most popular and socially connected children in elementary classrooms (Rodkin et al. 2000). In another study of 400 middle school students in a largely rural community with a significant number of low-SES households (the sample was 51 percent female and 49 percent male, with a racial breakdown of 93 percent Caucasian, 1 percent African American, 2 percent biracial, and 4 percent other racial backgrounds), results indicated that students who bully their peers on a regular basis have the same amount of popularity or peer acceptance (i.e., number of friends) as those students who do not bully their peers (Espelage and Asidao, in press). This finding suggests that students who bully others are not necessarily socially rejected and do have friends. It appears that most of the taunting and teasing by boys is seen as an effective and attractive means of interpersonal interaction. The authors propose that boys use these bullying tactics to obtain status within the social structure, but over time other characteristics, such as athleticism, become more predictive of popularity. When athleticism ascends in importance, however, hazing plays a larger role in young men’s lives (Walsh 2000). The phenomenon of bullying typically has been framed as both inevitable and one that children outgrow with age. Most
of the research has focused on elementary-age children and has predicted that most children outgrow bullying sometime in middle school. However, these age and grade distinctions are quite arbitrary, and recent studies have begun to look at children in the later middle school years and beyond. Part of the problem with suggesting a cutoff age for bullying behavior has been the definition of bullying as a set of behaviors that exist solely among younger children. When bullying is defined to include gender-based behaviors, including some of the features of sexual harassment, the conventional boundaries of age that encompass bullying begin to expand. Donna Eder, Catherine Colleen Evans, and Stephen Parker (1995; see also Eder 1997) have studied sexual aggression within the culture of middle school students, focusing on language and informal talk such as gossiping, teasing, insulting, and storytelling. Eder found that boys and girls alike used sexual putdowns toward girls and that girls’ use of words like sluts or whores helped to maintain a hierarchy with male-oriented, tough, and sexually aggressive boys at the top. Girls also tormented boys who were friendly toward them by casting doubt on their heterosexuality. Eder points out these and other ways in which girls contribute indirectly to sexual aggression. Several other psychologists have begun to look at the manifestations of bullying among middle school children. Anthony Pellegrini (in press) and Dorothy Espelage, Kris Bosworth, and Thomas R. Simon (1999, 2000) have probed the older age limits of bullying. Each has discovered that the saliency of sexual harassment as bullying is transformed with age. Eder, Nancy Lesko (2000), and Michael Kimmel (1996, 2000) may be among a minority
of researchers in the United States who recognize the role that homophobia plays in gender relations, especially among boys. Popular books by psychologists William Pollack (1998) and Dan Kindlon and Michael Thompson (1999) also acknowledge the potent force that the label of “weak” or “girl-like” can have in boys’ lives. Most of the research on sex differences in bullying has been done by European scholars who have found that boys bully with direct physical behaviors, whereas girls specialize in indirect bullying, such as slander, spreading rumors, exclusion or shunning, and manipulation of friendships (Olweus 1993). In addition, British researchers have discovered that bullying is carried out by the following perpetrators in descending order of frequency: one boy, several boys, boys and girls, several girls, and one girl. They found this pattern to be consistent in both middle schools and secondary schools. Their research has confirmed the previous findings on gender differences: girls are equally likely to be bullied but are only about half as likely to be involved in bullying others as are boys; and boys are bullied almost entirely by other boys, whereas girls are bullied by both sexes (Whitney and Smith 1993). Yvette Ahmad and Peter K. Smith (1994) found that by secondary school, physical bullying has largely decreased for girls, but there can be an escalation of their involvement in indirect bullying, especially by spreading rumors about others. They conclude that although there are both qualitative and quantitative research findings that confirm the existence of male and female forms of bullying, this finding does not mean that these forms are exclusive to each sex. In the United States, the research of Barrie Thorne has implications for information
on bullying. Although not explicitly about this topic, Gender Play: Girls and Boys in School (1993) raises questions about the nature of gendered play and interactions and offers many insights into the development of gender relations in elementary school. She found that boys use sexual insults against girls and that they regard girls as a group as a source of contamination. Boys and girls who do not conform to this prototype, especially those who desire to be friends with one another, are at risk of being teased or ostracized. One girl, speaking of her friendship with a boy at church and in their neighborhood, poignantly offers the reason why the two friends do not speak to each other at school: “We pretend not to know each other so we won’t get teased” (50). The threat of heterosexual teasing may act as a deterrent to cross-gender friendships or may drive those friendships underground. Espelage and Holt (in press) have also found gender differences in bullying. Their research shows that males are more likely to bully their peers than females are, even in situations when bullying is confined to verbal interactions. In summary, it is clear that more work on sex differences in bullying needs to be undertaken by researchers in the United States. When the problem is defined as sexual harassment, however, guidance and findings already exist on the subject (Stein 1999). Researchers on bullying would be well served by becoming familiar with the two decades of research findings that already exist on sexual harassment in schools. An increasing number of researchers have focused on bystanders to bullying, studying their role on the sidelines as well as the impact of their witnessing such behaviors. Much of this new research
builds upon the work of British researchers (Smith 1991), one of whom referred to bullying as a “silent nightmare” because of the enforced code of secrecy on the part of both victims and witnesses. Canadian researchers (Atlas and Pepler 1998) found that even though peers were present in 85 percent of bullying episodes, they intervened in only 10 percent of the incidents. Since many bystanders fear for their own personal safety, they may choose to ignore bullying episodes. In a study of 470 boys and girls, in almost equal numbers, in grades five to eight in an ethnically diverse suburban New Jersey community, bullying was found to be a common feature of daily school life. Close to one-half of the students, both male and female, had been bullied. The researchers found that gender and grade effects were evident. For example, twice as many male as female students reported the belief that victims deserved the behaviors to which they were subjected. Male students reported feeling excitement and fear while witnessing bullying, but female students reported feeling fear and helplessness (Jeffrey, Miller, and Linn, in press). Eighth graders were significantly more indifferent to the bullying and to the victim than fifth graders, who reported feelings of anger toward the bully. More eighth graders indicated that they were outsiders to the bullying episode, compared to the fifth graders. What is so troubling about the results of this study is that the students’ indifference further contributes to the eroding of the school culture that allegedly promises safety and equal educational opportunity. More hopeful results, however, emerged from a three-year research project with fifth graders in Austin, Texas. In
this project students of teachers who used the prevention curriculum Bullyproof (Sjostrom and Stein 1996) saw themselves as active interveners, reporting that they took personal action to interrupt bullying. Among those students who received the curriculum, bullying behaviors were reduced, and students intervened in episodes of peer bullying without relying on teachers to do so (Sanchez et al., in press). Perhaps this discussion of the experience of bullying in children’s lives has pointed out areas to which attention needs to be paid, especially the policing of masculinity and the imposition of compulsive heterosexuality. Not to factor in or even to name these potent elements is to deny a central and operating feature of boy culture, which can be driven by tireless efforts to define oneself as “not gay.” In addition, the liberal usage of the term bullying may indeed be part of a general trend to overlabel children, particularly in a culture that tends to psychopathologize behaviors. As children age, the labels that are applied to them tend to become criminalized. Finally, the definition of bullying has become very elastic. Behaviors that have the potential to be called bullying could be the culturally constructed raising of one’s eyebrow, giving “the evil eye,” making faces, or even expressing preference toward particular people over others. There may be a tyranny of sameness in the subjective pursuit of eradicating bullying behaviors. However, very egregious behaviors can be minimized when labeled merely as bullying, when in fact they may constitute hazing or sexual and gender harassment. Bullying, unlike hazing or sexual or gender-based harassment, is not against the law. Yet illegal behaviors that are labeled “bullying” may cause confusion and contribute
contribute to a climate of permission for illegal conduct. In this era of zero tolerance and concern with school safety, including bullying in the list of behaviors that will not be tolerated in school settings may run the risk of suspending or expelling large numbers of children from school. Since those suspended from schools are mostly boys, predominantly African American boys, civil rights and gender matters are involved. It may be time to rethink the definition and consequences of bullying.
Nan D. Stein
References and further reading
Ahmad, Yvette, and Peter K. Smith. 1994. "Bullying in Schools and the Issue of Sex Differences." Pp. 70–83 in Male Violence. Edited by John Archer. New York: Routledge.
Atlas, Rona, and Debra Pepler. 1998. "Observations of Bullying in the Classroom." Journal of Educational Research 92: 86–99.
Bosworth, Kris, Dorothy L. Espelage, and Thomas R. Simon. 1999. "Factors Associated with Bullying Behavior in Middle School Students." Journal of Early Adolescence 19: 341–362.
Eder, Donna. 1997. "Sexual Aggression within the School Culture." In Gender, Equity, and Schooling: Policy and Practice. Edited by Barbara J. Bank and Peter M. Hall. New York: Garland.
Eder, Donna, with Catherine Colleen Evans and Stephen Parker. 1995. School Talk: Gender and Adolescent Culture. New Brunswick, NJ: Rutgers University Press.
Espelage, Dorothy, and Christine Asidao. In press. "Conversations with Middle School Students about Bullying and Victimization: Should We Be Concerned?" Journal of Emotional Abuse.
Espelage, Dorothy L., Kris Bosworth, and Thomas R. Simon. 2000. "Examining the Social Environment of Middle School Students Who Bully." Journal of Counseling and Development 78: 326–333.
Espelage, Dorothy L., and Melissa K. Holt. In press. "Bullying and Victimization during Early Adolescence: Peer Influences and Psychosocial Correlates." Journal of Emotional Abuse.
Jeffrey, Linda, Demond Miller, and Margaret Linn. In press. "Middle School Bullying as a Context for the Development of Passive Observers to the Victimization of Others." Journal of Emotional Abuse.
Keise, Celestine. 1992. Sugar and Spice? Bullying in Single-Sex Schools. Stoke-on-Trent, Staffordshire, UK: Trentham Books.
Kimmel, Michael. 1996. Manhood in America: A Cultural History. New York: Free Press.
———. 2000. The Gendered Society Reader. New York: Oxford University Press.
Kindlon, Dan, and Michael Thompson. 1999. Raising Cain: Protecting the Emotional Life of Boys. New York: Ballantine.
Lesko, Nancy, ed. 2000. Masculinities at School. Thousand Oaks, CA: Sage.
Limber, Susan P., P. Cunningham, V. Flerx, J. Ivey, M. Nation, S. Chai, and G. Melton. 1997. "Bullying among School Children: Preliminary Findings from a School-Based Intervention Program." Paper presented at the Fifth International Family Violence Research Conference, Durham, NH, June–July.
Oliver, Ronald, Richard Hazler, and John Hoover. 1994. "The Perceived Role of Bullying in Small-Town Midwestern Schools." Journal of Counseling and Development 72, no. 4: 416–419.
Olweus, Dan. 1993. Bullying at School: What We Know and What We Can Do. Oxford: Blackwell.
Pellegrini, Anthony D. In press. "The Roles of Dominance and Bullying in the Development of Early Heterosexual Relationships." Journal of Emotional Abuse.
Pollack, William. 1998. Real Boys: Rescuing Our Sons from the Myths of Boyhood. New York: Henry Holt.
Rodkin, Philip C., Thomas W. Farmer, Ruth Pearl, and Richard Van Acker. 2000. "Heterogeneity of Popular Boys: Antisocial and Prosocial Configurations." Developmental Psychology 36, no. 1 (January): 14–24.
Sanchez, Ellen, Trina Reed Robertson, Carol Marie Lewis, and Barri Rosenbluth. In press. "Preventing Bullying and Sexual Harassment in Elementary Schools: The Expect Respect Model." Journal of Emotional Abuse.
Sjostrom, Lisa, and Nan D. Stein. 1996. Bullyproof: A Teacher's Guide on Teasing and Bullying for Use with Fourth and Fifth Grade Students. Wellesley, MA: Wellesley College Center for Research on Women.
Smith, Peter K. 1991. "The Silent Nightmare: Bullying and Victimization in School Peer Groups." The Psychologist: Bulletin of the British Psychological Society 4: 243–248.
Stein, Nan D. 1999. Classrooms and Courtrooms: Facing Sexual Harassment in K–12 Schools. New York: Teachers College Press.
———. In press. "What a Difference a Discipline Makes." Journal of Emotional Abuse.
Swearer, Susan, and Beth Doll. In press. "Bullying in Schools: An Ecological Framework." Journal of Emotional Abuse.
Tattum, Delwyn P., ed. 1993. Understanding and Managing Bullying. Oxford: Heinemann.
Tattum, Delwyn P., and David A. Lane, eds. 1988. Bullying in Schools. Stoke-on-Trent, Staffordshire, UK: Trentham Books.
Thorne, Barrie. 1993. Gender Play: Girls and Boys in School. New Brunswick, NJ: Rutgers University Press.
Walsh, Mark. 2000. "Hazing Is Widespread, Student Survey Shows." Education Week 20, no. 1 (September 6): 14.
Wheeler, Raymond, and Francis Perkins. 1932. Principles of Mental Development. New York: Thomas Y. Crowell.
Whitney, Irene, and Peter K. Smith. 1993. "A Survey of the Nature and Extent of Bullying in Junior/Middle and Secondary Schools." Educational Research 35, no. 1: 3–25.
[Reprinted by permission of the publisher from Nan Stein. 1999. Classrooms and Courtrooms, chap. 3. New York: Teachers College Press. © 1999 by Teachers College, Columbia University. All rights reserved.]
C

California Missions
Native American boys played a significant role in the California missions from their foundation in 1769 to the emancipation of mission Indians in 1834. Boys who entered the missions alone or with their families and those born at the missions formed part of the most linguistically and ethnically diverse tribal societies within the current boundaries of the United States. They lived during a time of great change due to the Spanish conquest. The process of conquest began when missionaries and Spanish soldiers explored the lands near the coast of California. They decided on a mission's location according to the geography and organization of tribal society. Once the missionaries and soldiers selected the site, they attempted to negotiate their settlement with the tribal group who claimed the land, generally for hundreds of years prior to Spanish conquest. Sometimes the missionaries reached an agreement. Other times they established a mission within a region overtly hostile to their presence. They relied on negotiation, gift giving, and military force to remain there. Boys would have witnessed male and female tribal elders divided over their response to the Spanish presence. Some saw their fathers, uncles, brothers, sisters, and aunts defeated in revolt and conspiracy to oust the missionaries and soldiers. Others
suffered the increase in intertribal warfare that often occurred in areas where the missionaries settled. Children grew up in a time of little peace, yet it proved impossible to expel the Spanish. A major revolt of the Kumeyaay in San Diego in 1775, for example, lasted for a year. Most of the soldiers in the province went to San Diego to quell the revolt. There, as elsewhere, the guns and horses possessed by the Spanish gave them military superiority far out of proportion to their numbers. Antagonisms among tribal groups made the unity achieved in the San Diego rebellion an exception and further enabled the Spanish to maintain their presence. Children frequently comprised the first group baptized. A boy named Hanajiban became the first convert at Mission San Juan Capistrano. The missionaries immediately changed his name to Juan Bautista after the biblical figure who announced the coming of Christ. This name gave Hanajiban, though still a child, the symbolic role of encouraging the conversion of others (Haas 1995). The missionaries encouraged the baptism of children because they felt the young formed the most malleable group in which to inculcate Spanish religious and cultural practices. Parents might have initially viewed this as a fairly benign act. They received annual gifts from the missionary after a child's baptism, and children younger than eight remained with their parents.
Drawing by Pablo Tac from Indian Life at Mission San Luis Rey. Edited and translated by M. Hewes and G. Hewes. 1958. Old Mission San Luis Rey, CA. (Photograph collection, San Diego Historical Society)
For Spanish missionaries and indigenous families, these baptisms helped forge alliances to curtail antagonisms. But the very presence of the missions changed the tribal world in which the boys grew up. Tribal society rested on a careful definition of land and resource rights. Missionaries' claims to the land of converts challenged the political order. Equally devastating, children saw and experienced the effects of diseases such as syphilis, which became endemic. They died in large numbers from European diseases and lost their parents and relatives to prolonged and debilitating illness or to swift death when a plague of measles or smallpox swept the missions and tribal villages. Cattle and other livestock altered the natural environment. The tribal world began to collapse, especially because the military and spiritual power of native people seemed incapable of curtailing these developments. Mass baptisms of whole families occurred at each mission anywhere from ten to twenty-five years after its founding. Once inside the missions, it proved difficult for boys to survive to adulthood. People lived an average of eight years after conversion. High rates of infant mortality meant that each mission population began to decline after most Indian people in a region had been converted or had left the region. Before conquest, the nature of boyhood differed among indigenous groups, and these differences persisted to varying degrees within the missions. In the tribal world, boys from highly stratified societies like the Chumash could expect to inherit political power and wealth if they formed part of the nobility. In less stratified societies, the elders selected boys who demonstrated particular kinds of
skills and aptitudes and prepared them to assume positions of leadership. Some boys and girls would have seen women in their tribes assume political power, and all knew the forms of authority and knowledge that women possessed. However, the patrilineal societies of most native Californians meant that girls prepared to move to their husbands’ village or clan upon marriage, and most forms of wealth passed through the male lineage. In each tribal society, a strict division existed between the roles and responsibilities of boys and girls and men and women. Boys underwent ceremonies of initiation into manhood. Many would be tattooed. Almost all learned to fight and hunt for deer, other game, and fowl. Older men taught them to fish and to make bows, arrows, fishing nets, and other goods. They might have learned more than one language or dialect because their mothers often came from a dialect group different from that of their fathers. In many societies, a few young men assumed the social and cultural role of a woman. In the missions, parents still taught their children traditional skills, practices, and beliefs, but the missionaries sought to restrict their ability to do so. The youngest boys and girls lived with their families and worked in mission production. Boys scared birds from the fields and orchards, gathered wood, and hauled water. They occasionally left the missions on passes to cultivate and gather seeds and acorns with their mothers. The missionaries yielded to indigenous demands to practice their culture but prohibited those things they deemed most adverse to Catholicism. Hence, native elders took initiation rites, other ceremonies, and tattooing underground. They ceased to pass on particular kinds
of knowledge to those youth they perceived to be too close to the missionaries. The missionaries sought to fortify their ties to the young by removing the children from their parents at the age of eight. Made to live in dormitories, they remained separated from their families until they married, died, or fled to find refuge among tribal people who lived in mountains and valleys far from the coast. Once in the dormitories, boys began to work as men. They prepared fields, sowed and harvested crops, worked in the orchards, built and maintained irrigation ditches, and learned particular skills. Some were apprenticed into jobs such as cook, cowboy, blacksmith, or specialist in a particular crop or in the production of wine. Though the seasons defined work rhythms, the entire population tended to rise at dawn, attend mass, and work until sunset, breaking for midday meals. Boys held a more privileged place in the missions than did girls. They were prepared to assume positions of authority over the native population. Sometimes the priest chose them for that training, but other times boys derived the position from their standing in tribal society. Boys, not girls, became the personal assistants of the priests. Those deemed worthy assumed responsibility for the religious instruction of others. Though older women and men acted as translators and interpreters, boys rather than girls were trained for that work. Though formal education rarely existed, when it did the missionaries taught only boys to read. Living between contending cultures at the missions created painful realities for children, and their parents and relatives had little ability to protect them. Events surrounding the deaths of two mission
youth offer a sense of the anguish that potentially existed. Both boys, raised at the side of the missionaries and thoroughly instructed in Catholicism, spoke excellent Spanish and interpreted for the missionaries. Yet when one took terribly ill, he refused the medicines and advice of the priest. Instead, he called for a native healer, who told the boy that he had angered his people’s god by going into the mission. The god sent him an incurable illness because he always believed in the missionaries. The boy died a turbulent death. The other boy also refused the missionaries’ advice as he lay dying. Rejecting the final sacrament while shouting blasphemies, he angrily explained that having lived deceived, he did not want to die deceived. In 1832, shortly before the emancipation of mission Indians, a missionary from San Luis Rey took two mission boys named Pablo Tac and Agapito Amamix to Rome. They left San Luis Rey at the ages of ten and twelve, respectively, to study to become missionaries. Agapito Amamix died a few years after arriving in Rome, but Pablo Tac survived smallpox, lived to the age of nineteen, and studied rhetoric, humanities, and philosophy. Probably at the age of thirteen or fourteen, he wrote a manuscript for the Vatican librarian entitled “Conversión de los San Luiseños de la Alta California.” The manuscript, which offers the only contemporaneous writing by an indigenous Californian, illustrates how a boy might comprehend his experience at the mission. Though a steadfast student of Catholicism, Tac nonetheless expressed his love and affinity for his people and distinguished between the perspectives held by the missionaries toward indigenous society and his people’s identity and worldview. Tac wrote, for example, that the
Franciscans called the territory San Luis Rey but reflected, "we call it Quechla in our language. Thus our grandparents called it." Though he frequently referred to the mission population as neophytes and indios, names used by the Spanish, he also made it clear in the same passage that "we inhabitants of Quechla call ourselves Quechnajuichom" (Hewes and Hewes 1952, 98). Tac described the structure of work, authority, and punishment that existed at San Luis Rey and recorded the tremendous losses his relatives experienced with conquest. "In Quechla not long ago," he wrote, "there were 5,000 souls, with all their neighboring lands. Through a sickness that came to California 2,000 souls died, and 3,000 were left" (98). In mapping the mission, he identified the dormitories, depicting a large room for the boys with a patio and two gardens and a room for the girls. He wrote briefly about children's lives. "The sons," he stated, would go to "school to learn the alphabet, and if they already know it, to learn the catechism, and if this also, to the choir of singers, and if he was a singer, to work, because all the musical singers work the day of work and Sunday to the choir to sing. . . . The daughter joins with the single girls who all spin for blankets for the San Luiseños and for the robe of the Fernandino Father" (101). In describing the dances performed frequently at the missions, Tac illustrated the kinds of connections between elders and mission youth. Writing about a specific dance, he noted: "No one can dance without permission of the elders, and he must be of the same people, a youth of ten and more years. The elders, before doing the dances publicly, teach them the song and make them learn perfectly" (102). He frequently asserted the
importance of elders and wrote that Quechnajuichom danced in memory of grandparents, aunts and uncles, and parents who had already died. Tac paid careful attention to the kind of authority native men held and the extreme limitations on that authority. Though Tac never expressed remorse about the conquest of his people, in the last paragraph he took pleasure in the way one indigenous leader contested Spanish authority. In that passage he described a fight that occurred between ball players from San Luis Rey and San Juan Capistrano missions. Spanish soldiers arrived at the conclusion of the fight, armed and ready to intervene. The Indian responsible for the Quechnajuichom men, who spoke like a Spaniard, said to the soldiers: "Raise your saber, and then I will eat you." But he said this, Tac concluded, "in his language, and afterwards there was no trouble" (106). Pablo Tac, the most privileged and scholarly of mission boys, gave the last words of his extraordinary manuscript to a Quechnajuichom elder. This typifies the quiet resistance many young men demonstrated to the humiliations of conquest. Each boy knew those humiliations intimately, as he witnessed the diminished strength and power of his parents and other elders and lived between contending cultures during a time of substantial loss.
Lisbeth Haas
See also Native American Boys
References and further reading
Haas, Lisbeth. 1995. Conquests and Historical Identities in California, 1769–1936. Berkeley: University of California Press.
Hewes, Minna, and Gordon Hewes. 1952. "Indian Life and Customs at Mission San Luis Rey: A Record of California
Indian Life Written by Pablo Tac, an Indian Neophyte." The Americas 9: 87–106.
Jackson, Robert, and Edward Castillo. 1995. Indians, Franciscans, and Spanish Colonization: The Impact of the Mission System on California Indians. Albuquerque: University of New Mexico Press.
Milliken, Randall. 1995. A Time of Little Choice: The Disintegration of Tribal Culture in the San Francisco Bay Area, 1769–1810. Menlo Park: Ballena Press.
Shoup, Laurence, and Randall Milliken. 1999. Inigo of Rancho Posolmi: The Life and Times of a Mission Indian. Menlo Park: Ballena Press.
Camping
Broadly defined, a camp is a temporary outdoor residence, commonly using tents or simple buildings. In the context of boyhood, it has come to refer to a temporary community of young people living in an outdoor setting for some days or weeks for purposes of personal and social development. A variety of boys' camps developed in the decades around 1900: private camps to shield well-to-do boys from idle summers in cities or resorts; Young Men's Christian Association (YMCA) camps to make manly, muscular, and converted Christians of their middle-class campers; fresh-air and boys' club schemes to give poor city children a taste of country life and middle-class mores; and eventually Boy Scout camps to teach woodcraft and citizenship. Over time, camping standards converged, with safety and a measure of comfort triumphing over rough adventure, and structured instructional programs filling most days. Only a small minority of American children attended in any given year, but campers generally found the experience impressive and enjoyable, though researchers have struggled to specify the
children's gains. Over the course of the century gender distinctions weakened, ages dropped, and programming specialties proliferated, but woods and water remained central. As a temporary expedient—living in makeshift, mainly outdoor quarters—camping was not new in the late nineteenth century: soldiers, backwoods travelers and hunters, and worshipers at mass religious meetings all camped out. As an extension of outdoor play, moreover, nineteenth-century boys camped out fairly often. Authors of the late nineteenth century who celebrated mischievous "bad" boys and published directions for outdoor play described camping out as something small groups of boys might do on their own. Tom Sawyer and his friends camped out, and Daniel Carter Beard, later a symbol of outdoor life in Boy Scouting, promoted boys' daydreams of camping with his American Boy's Handy Book (1882), full of designs for the sort of bivouac that required two trees spaced just right and forty straight poles. Thus a 1915 survey of Cleveland, Ohio, reported that even in this big city 25 percent of grammar school boys camped and 45 percent hiked, though relatively few were members of organized youth groups. Organized camping—that is, group camping under adult leadership undertaken for its own sake in the hope of improving boys—grew from diverse origins in the decades around 1900. Pioneer camps for sons of the well-to-do appeared early on New Hampshire lakes, notably Chocorua (1881–1889) and Harvard, renamed Asquam in 1887, which lasted from 1885 to 1909. By 1900 about twenty such camps operated, all but one in the Northeast.
Youth playing in the lake at Camp Tunkhannock, Pocono Lake, Pennsylvania (Library of Congress)
Believing that boys regressed between spring and fall, corrupted by city life or the idleness of resort hotels, clergymen, physicians, and educators within the ambit of the northeastern prep schools began constructing environments to keep prep school boys vigorously active but morally sheltered. Some camps ran almost alongside resorts and colonies of summer homes, offering a safer alternative for boys. Since late-nineteenth-century upper-middle-class men worried that they and especially their sons were growing soft and unmanly, the camps laid on baseball, water sports, swimming, and at least one backwoods camping trip per session. Just how far to take the toughening was debated from the start, however. Ernest Balch, the founder of Camp Chocorua, imposed a full morning of chores and decried Asquam's hiring of a cook. Yet boys at most private camps lived in rustic buildings rather than tents, and the trend
ran toward modest chores and considerable comfort. As Balch's views suggest, complaints that camps have grown soft are almost as old as organized camping itself. Thus an Asquam camper from the 1890s felt entitled by 1907 to complain, "Now two-cycle engines flush the camp washroom from the lake, and that handpump and that well . . . are alas no more. . . . Educators . . . are getting very busy with boys' camps, and YMCA's too, mercy on us" (Eells 1986, 34). For their part, YMCA camps traded on muscular Christianity in order to fend off such criticisms and convince boys that they could be both sincerely religious and securely masculine. Yet in 1910 the YMCA's Edgar M. Robinson welcomed the new woodcraft emphasis that came with scouting, complaining of camps "where almost everything is done for the boy
except the eating of his meals" (Macleod 1983, 238). Having promised boys outdoor adventure, however, the new movement's handbook warned overzealous scoutmasters: "There are a lot of false notions about courage and bravery and grit . . . and long hikes for boys is one of the most glaring of these notions" (Macleod 1983, 241). The first YMCA camps in the 1880s were outings to hold boys' interest in summer—rather like elongated picnics with swimming, games, and big meals. Camping out kept costs down but was not an end in itself. Soon, however, terms like "boys' camp" and "camper," considered neologisms in the 1880s, shed their quotation marks as leaders in YMCA boys' work such as Sumner Dudley promoted summer camps as places to convert boys. Though cheaper than the private camps, YMCA camps imitated their facilities, moving fairly quickly from tents into buildings, and YMCA workers became recognized experts in organized camping after 1900. Yet as committed as YMCA men were to providing relatively affordable camping for large numbers of boys, they could accommodate only 16,690 boys in 1915, fewer than one in six of their national membership. Because of the need for adult supervision and potentially high cost, camp organizers had to balance a desire for close guidance of campers, which required small numbers, against hopes for broad social influence, which mandated larger enrollments. Founders of private camps for sons of the northeastern elite could have it both ways, since they believed they were training future social leaders. To a degree, YMCA workers convinced themselves likewise that they were building up the leading boys among their membership. Despite a shared belief
among campers and camp leaders that the experience was beneficial, later studies could not conclusively demonstrate long-term effects from camping. Given the brevity of most summer camps, even immediate effects upon children’s return to daily life are hard to prove with systematic evidence. Through the early 1900s, YMCA men sought religious conversions as evidence of success. Two leaders from Camp Tuxis in Brooklyn, New York, described the climactic Sunday evening campfire in 1904: “The boys have been away from home for some time. They are unusually thoughtful and tender. The stars twinkling overhead, the sighing of the breeze in the tree tops, the breaking of the waves on the rocks all tend towards turning the mind of the boy towards the God of nature. . . . It is the critical hour that settles a boy’s destiny and many a spot on old Tuxis has witnessed the surrender of a boy’s life” (Macleod 1983, 237–238). The atmosphere lingered for decades at some YMCA camps, and evening campfires at almost all camps closed on a note of moral seriousness; but conversion rates of 90 percent risked massive backsliding once YMCA campers returned home. As a result, YMCA workers increasingly turned toward less pressured forms of religious education and claimed to make their values implicit in the entire camp program. About the same time, in the 1920s, various denominations began to run their own church camps. By the 1940s they were in contact with the broader camping movement, though many of its leaders thought church camping too sectarian. Fresh air funds and outings and camps sponsored by social settlements and boys’ clubs began in a few instances in the 1870s and proliferated in the 1880s and 1890s. Intended to give poor children
of the central cities—mostly children of white immigrants—a respite from crime-ridden neighborhoods and a taste for natural beauty and middle-class mores, these camps were often deprecated by educationally minded camp leaders as mere escapism. "As recently as the 1950s," camping historian Eleanor Eells writes, "many camp brochures assumed that fresh air, sunshine, and good food could, in ten days, overcome the evils of life in the inner-city, including delinquency" (1986, 44). Fairly frequently, early organizers presumed that lower-class children were unruly and unsuited to the wilderness, placing them instead in farm settings where they could learn the virtues of traditional, native-stock rural communities and perhaps do some farmwork. The Boy Scouts of America (BSA), which became the largest provider of camping for American boys, drew upon several traditions. One, exemplified by the nature writer Ernest Thompson Seton, who served briefly as the BSA's chief scout and whose ideas influenced the core Boy Scout program, exalted woodcraft and blended it with the emphasis of scouting's founder, Robert Baden-Powell, on scouting (loosely based on military scouting) as an outdoor pursuit for boys grouped in small patrols and troops. Seton gave to American scouting an enduring fascination with quasi-Indian campfire ritual and a respect for nature study. But troop camps under untutored scoutmasters risked frequent food poisoning and occasional drownings, so the BSA soon favored larger, supervised camps. Church-oriented youth groups such as the United Boys' Brigades of America that tried mass camping in the early 1900s followed military models, with tents in straight rows around a parade square, and early Boy Scout camps
often borrowed military styles. With YMCA men providing most of the expertise, though, the trend ran toward YMCA-style camps with a fully planned daily program and paid staff. Ideally, each scoutmaster brought his troop to a camp run by his local council, but scouts whose scoutmaster could not sacrifice so much vacation could come on their own. Troops camped at separate sites; boys swam and enjoyed evening campfires; but central camp staff provided the lifeguards, most meals, extensive instructional programs, and often the evening campfire program. As advancement in rank became the test of success in scouting, the programming at Boy Scout camps became—and has remained—heavily oriented toward training boys to earn first-class rank and then merit badges; thus success in earning badges became the evidence that camping accomplished something. Still, even the Boy Scouts could not get every member to camp. Estimates of the percentage of Boy Scouts nationwide attending summer camp have varied: in 1920 almost 45 percent managed a week at camp, in 1940 fewer than one-third did so, and again in 1973 about 45 percent attended (Macleod 1983, 242, 245, 297–298). Standards of camp management tended to converge over the years. Beginning in 1910, the Camp Directors Association of America brought private camp operators and YMCA men together; in 1924 it merged with the National Association of Directors of Girls' Camps and in 1935 reorganized as the American Camping Association (ACA), which gradually drew in other organizational camps and became by midcentury the main accrediting agency for camps, reporting more than 2,200 accredited camps in 1999. In broadly developmental terms reminiscent of the YMCA's
traditional fourfold program, the ACA defined camp as "a sustained experience which provides a creative, recreational and educational opportunity in group living in the out-of-doors. It utilizes trained leadership and the resources of the natural surroundings to contribute to each camper's mental, physical, social, and spiritual growth" (American Camping Association 1998, 3). These broad but necessarily imprecise goals have generated a considerable literature dedicated to showing that camping benefits the young. Camps have continued to thrive but have diversified over the years. The beginning of girls' camping in the early 1900s and the growing trend by midcentury toward coed camps—at least outside scouting—undercut perfervid rhetoric about camping as a toughening experience fit only for building masculinity. Indeed, by 1999 some 55 percent of campers at ACA-accredited camps were girls; about 53 percent of ACA camps were coed, with 28 percent female only and just 19 percent male only (American Camping Association 2000). Campers' average ages declined, especially in the 1960s and 1970s as school activities and perhaps a feeling that camp was juvenile distracted adolescents and as older counselors grew hard to hire. By the late 1990s the median age of campers was 11.2, and many residential camps enrolled children down into the primary grades—far younger than in the early 1900s. Day camps have grown common, and summer activity programs teaching sports, music, and other skills or simply offering daytime recreation have appropriated the term "camp," although few register with the ACA. By the ACA's 1999 estimate, 8,500 camps (5,500 overnight, 2,200 day camps, and 750 that
offer both) enrolled nearly 9 million children and youths. The ACA estimated that 25 percent of camps in 1999 were privately operated, with the rest run by nonprofit organizations, including some 19 percent with religious affiliations. With the nationwide expansion of organizational camps (YMCA, Young Women's Christian Association, Boy Scouts, Camp Fire, etc.), New Englanders lost their numerical predominance among campers as camping followed the population. By 1999, New York, Pennsylvania, New Jersey, Michigan, and California had the largest number of residential camps, followed by other large-population states such as Texas, Illinois, Florida, Ohio, and Massachusetts. Programs changed and diversified as camp directors catered to specialized interests such as fitness in the 1950s, the environment in the 1960s and 1970s, and computers in the 1980s. Yet much that was traditional remained. When asked to name their three leading activities for 1999 campers, ACA camps most often listed the following (in descending order of popularity): horseback riding, swimming, camping skills and outdoor living, challenge and rope courses, arts and crafts, and nature and environmental studies.
David I. Macleod
See also Boy Scouts; Young Men's Christian Association; Young Men's Hebrew Association
References and further reading
American Camping Association. 1998. Accreditation Standards for Camp Programs and Services. Martinsville, IN: ACA.
———. 1999. Guide to ACA-Accredited Camps. Martinsville, IN: ACA.
———. 2000. "ACA Fact Sheet," http://www.acacamps.org/media (accessed June 25, 2000).
"Camping Then and Now." 1999. Camping Magazine 72 (November–December): 18–31.
Chenery, Mary Faeth. 1991. I Am Somebody: The Messages and Methods of Organized Camping for Youth Development. Martinsville, IN: ACA.
Eells, Eleanor. 1986. Eleanor Eells' History of Organized Camping: The First Hundred Years. Martinsville, IN: ACA.
Fetto, John. 1999. "Happy Campers." American Demographics 21, no. 7 (July): 46–47.
Macleod, David I. 1983. Building Character in the American Boy: The Boy Scouts, YMCA, and Their Forerunners, 1870–1920. Madison: University of Wisconsin Press.
Maynard, W. Barksdale. 1999. "'An Ideal Life in the Woods for Boys': Architecture and Culture in the Earliest Summer Camps." Winterthur Portfolio 34, no. 1 (Spring): 3–29.
Cars
There is something about a car that especially appeals to boys. Boys who like cars learn to identify every make and model at an early age. They long for the day they can drive and never forget the experience of owning their first car. Some boys learn to take cars apart and put them together again to improve their performance and looks. These interests may lead to a career or a passionate hobby for life. The automobile came into prominence in the United States in the early 1900s, as the noisy, smelly, and dusty "horseless carriage" began to replace the horse and buggy. Boys chased these early cars on their bicycles, hitched rides on their massive fenders and bulky running boards, and sometimes mastered the operation of the new family automobile before their fathers did. Cars were early associated with masculinity, and driving the new motorcar was a man's job. These early cars were difficult to operate. They did not have electric starters, and the driver
was required to make several adjustments for spark and gas and then turn the engine over using a hand crank that protruded from the front of the car. Once the car started, the driver, now seated behind the steering wheel, made additional adjustments and set off at 10 to 20 miles per hour. Because the early cars seldom had tops and some had no windshields, the driver and passengers had to wear goggles, a coat called a "duster," and gloves on their journey. Bicycles preceded cars in the nineteenth century, a dream of personal transportation ultimately realized by several inventors in Europe. Starting out as a fad among wealthy young men, the bicycle, or velocipede (Latin for "quick foot"), gained public acceptance as Europe's Industrial Revolution advanced technology. Inventors and manufacturers of machines ranging from guns to sewing machines began creating human-powered vehicles, some resembling bicycles and others with three, four, or more wheels. Henry Ford and the Wright brothers were bicycle mechanics whose backyard tinkering led to the mass-produced automobile and the airplane. By 1900, the bicycle's golden age, thousands of brands and types of bicycles were being produced for sale, with riding lessons included. Boys, already well acquainted with the bicycle, trained their attention on the automobile. Their fascination led them to the shops and garages where automobiles were sold and repaired. Often these shops were former blacksmith shops or livery stables where horses were kept. Many boys who excelled in learning subjects in which they were interested became proficient mechanics.
Jerry Van Dyke in My Mother the Car reflects youth’s fascination with cars. (Photofest)
Their skills were called upon during World War I, when motor-driven trucks, cars, and ambulances replaced the horse on the battlefield. Returning from the war in Europe, young men took jobs in automobile factories and industries such as steel, oil, glass, and electricity that supported automobile manufacturing. Additional advances in technology and public demand led to safer, more powerful, and better-crafted cars suited to all stations in life, from inexpensive family cars to luxurious, chauffeur-driven cars and sports cars. From the earliest days of automobiles, racing had been an exciting and popular pastime. By the 1920s, racing cars were quite sophisticated and very fast. Races were conducted on public roads and on closed tracks, some of which were
steeply banked and made of hardwoods. Boys were attracted to these events, and some aspired to drive a racer. Toy cars, model kits, and automobile-like bicycle accessories were created. Downhill racers were built by boys and their fathers, who competed in local "soapbox derbies" and hoped to win a trip to Akron, Ohio, for the final heats. From 1939 to 1945, World War II demanded the efforts of all engineers and builders to produce aircraft, tanks, trucks, and guns. Automobile factories were converted to serve the U.S. and European need to win the war. In Southern California, high school boys were recruited to participate in a "4 by 4 program,"
in which four hours were spent in a classroom and another four hours at a local aircraft plant studying engineering. These students were then given jobs designing and building new vehicles and weapons to further the war effort. The advanced skills they learned during military service could be applied to the hobby of automobile racing. Using their postwar jalopies as a basis, boys rearranged the bodies and rebuilt the engines. To prove their work, they raced their cars on the dry lake beds of the Southern California desert. They called this type of car a "hot rod." When driven on the street, hot rods came to be considered symbols of youthful rebellion. Although nobody knows the origin of the term hot rod, it suitably described these cars, stripped down to the essentials, with fenders, lights, and windshields removed. The drivers raced on the dry lakes for top speed over a 3- to 5-mile straight-line course. Other types of racing were developed as well, including drag racing, which took place on abandoned airstrips. Drag racers raced on a straight, quarter-mile paved strip, competing for the highest speed and elapsed time. This form of racing inspired highly creative solutions, often requiring huge, powerful engines using exotic fuels and special tires to obtain traction at the start of these very quick, extremely noisy contests. World War II had another impact on cars in the United States when U.S. soldiers returned with a love for European automobiles. Many young men admired the British, German, and Italian automobiles of the 1930s, which they saw during military service there, and great numbers of these cars were shipped to the United States. The British cars had steering wheels on the right side. They were very small compared to U.S. cars and had
smaller engines; large, spindly wire wheels; and exotic mechanical features. They were termed "sports cars" because they offered a different, sportier, racier, more fun driving experience on the road compared to the heavier, less nimble U.S. cars. Soon sports car enthusiasts organized races on public roads, in parks, and on airstrips. The racecourses were lined with hay bales to prevent an out-of-control racer from hitting spectators. By the middle of the twentieth century, nearly every American family had a car or two. Cars were such an integral part of American life that television programs like My Mother the Car and movies like The Love Bug (1969) and American Graffiti featured cars in starring roles. American Graffiti, directed by George Lucas and released in 1973, was a coming-of-age film set in his hometown of Modesto, California, circa 1960. The film portrayed the automotive rituals of youths in this Central Valley town, including the weekend motorcade up and down the town's main street, looking for girls, sampling milkshakes and deep-dish berry pies at the local drive-in restaurant, and experiencing the thrills and dangers of street racing. The film received accolades for its incisive view of American life, and for men (and women) who spent their youths in similar circumstances, the film sustains nostalgic feelings. There is some concern today among car enthusiasts about the future of their hobby. The old cars that inspired passion among earlier generations are rare and expensive; newer cars are far more complicated, having computerized systems that are too complex for the backyard do-it-yourself mechanic.
Boys’ fascination with cars begins at an early age. (Joseph Sohm; ChromoSohm Inc./Corbis)
And there are many more modern inventions that appeal to boys and invite their creative involvement, including computers, skateboards, mountain bikes, roller blades, and video games. High school curricula have also changed in recent years, with classes in computer skills, art, music, and sports replacing the machine shop, auto mechanics, and auto body repair courses of an earlier era. Finally, ecological concerns are forcing people to reconsider transportation choices. People living in densely populated cities are learning to live without a car when necessary. Electric cars are also being considered as an alternative to the polluting petroleum-based, internal-combustion-engine cars people currently drive. Although electric cars may be the wave of the future, current automotive enthusiasts have little love for them. Perhaps they cannot give up the nostalgic
thrill of the sound and smell of their cars and find it hard to believe that boys will feel the attraction to electric models in any way comparable to the excitement of having and working on an old car. Some manufacturers are producing steel and fiberglass replicas of old cars to fill the current demand, but this may be just a temporary measure to satisfy the desires of an older generation. Future generations may have to visit museums to see the cars that generated such passion among boys and men in the twentieth century. Indeed, there is much discussion today about whether the automobile should be considered art, and many museums, including the Museum of Modern Art and the Oakland Museum of California, have automobiles in their collections. Granted, these are vehicles of special
design and engineering significance, but their presence in the museum pays tribute to the passion and creativity of the men who, in their boyhood, chased "horseless carriages" on their bicycles.
Philip E. Linhares
See also Bicycles; Drag Racing; World War II
References and further reading
Batchelor, Dean. 1995. The American Hot Rod. Osceola, WI: Motorbooks.
Dobrin, Michael, and Philip Linhares. 1996. Hot Rods and Customs: The Men and Machines of California's Car Culture. Oakland: Oakland Museum of California.
Gagne, Luc. 1995. Moving Beauty. Montreal, Quebec: Montreal Museum of Fine Arts.
Road and Track. New York: Hachette Filipacchi Magazines.
Rod and Custom. Los Angeles: EMAP USA.
The Rodder's Journal. Huntington Beach, CA: Rodder's Journal.
Street Rodder. Anaheim, CA: McMullen Argus/PRIMEDIA Publishers.
Chinese American Boys
The experience of Chinese American boys during the past 150 years can be described as a gradual transition from the margin to the mainstream of American society and as a balance between two worlds. Although they have participated in both Chinese and American cultures, over time they have become more identified with the latter, living much like other American boys. Their experiences can be framed in three periods of history: 1850 to about 1910, 1910 to World War II, and the postwar period to the present. In each period the lives of Chinese American boys have reflected institutions and practices of their ancestral homeland and of the United States.
Very little is known about Chinese American boys during the period from 1850 to 1910. Their numbers were quite small because few boys immigrated to or were born in the United States. Early Chinese immigrants were mostly men who worked to earn money to send home to their families. After years of sojourning, they planned to return to China to live a comfortable life with their savings. Life in the United States was difficult for the Chinese in the West Coast region to which most of them immigrated. Not only were there acts of physical violence against them, but also anti-immigration laws such as the Chinese Exclusion Act of 1882 were passed to keep them from entering the United States. Federal law also excluded the Chinese from naturalization: the Naturalization Act of 1870 limited naturalized citizenship to whites and persons of African descent, preventing Chinese immigrants from enjoying the rights of citizenship. It was not until the repeal of the exclusion acts in 1943 that the Chinese were permitted to immigrate and naturalize. Because Chinese tradition also discouraged wives and children from joining husbands and fathers who went away to work, there was little reason for women and children to immigrate to the United States. The earliest Chinese American boys were mostly sons of merchants who settled in the Chinatowns of large cities such as San Francisco. Merchants were among the few classes of Chinese allowed to immigrate to the United States and to bring their wives and children. Sons of merchants were often treated to a very unkind welcome and could be stoned and taunted by Euro-American youths while traveling from the landing docks to their fathers' stores. Because of this hostility, a boy learned to stay in the confines of his father's business, which usually also served as the family's residence.
A boy in Chinatown, San Francisco, is shown a new toy, ca. 1905. (Library of Congress)
Although the space could be sparse and cramped, the family slept, cooked, and ate their meals at the back of the store. A small altar or shrine served the family's religious needs, which were a mixture of
Confucianism, Buddhism, Taoism, and ancestral worship. The boy received his education in Chinese language, history, and culture at home from his parents. If his parents decided to settle permanently
in the United States, he would be enrolled in a public school to get an American education. Sometimes parents took legal recourse to force the local school to enroll him. Some schools were segregated, but others were integrated. While in public school, the boy undertook the study of the English language, his first step toward becoming an American. Yet Chinese American boys in this period did not become acculturated because they were not welcomed in the United States. They continued to speak the Chinese language, dress in Chinese clothes, and eat Chinese food. Although some families provided many siblings with whom the Chinese American boy could play, others had only one or two children. Parents tended to favor their sons because of the Chinese tradition that valued boys more than girls. Although one parent could favor a daughter, a boy could easily be spoiled with all the attention. The mother gave primary care to her son, but as he grew older, he became closer to his father. Beyond childhood, a boy's relationship with his family was close, but not in the sense that he expressed his feelings openly and directly. In Chinese families the show of mutual concern and responsibility was equivalent to the show of love in contemporary American families. Chinese mothers expressed their love, for example, by feeding and caring for their children. Children reciprocated by respecting and obeying their parents, two of the virtues admired by Chinese culture. The Chinese American boy also learned to be polite and reserved toward others outside his family. He rarely ventured from home, except when he accompanied his parents on visits to other families of merchants. In time, some merchants returned to China with their families, but
others decided to settle permanently in the United States. During the nineteenth century, some Chinese boys immigrated to join their fathers and uncles already in the United States. Even boys as young as thirteen were old enough to earn money to help support the families who sent them. It was the first time they had left their homes, and they were usually bewildered and frightened. Most Chinese laborers lived a migratory life, following agricultural work wherever their labor was needed and saving as much money as possible to send back home. These Chinese American boys also lived a migratory, austere life, working long hours with very little or no opportunity to settle down long enough to have a social life or an education. Some Chinese American boys lived in cities and towns, working as apprentices in Chinese laundries, restaurants, and factories, or in homes as domestic servants for Euro-American families. After about ten years, they returned to China to marry a Chinese girl selected by their families. They either remained in China to raise their families or returned to the United States without their families to earn additional money. From 1910 to World War II, the lives of Chinese American boys reflected the development of the Chinese American community. An increase in immigration of Chinese women allowed more families to be formed, and the population of American-born children gradually rose. Families settled in and outside Chinatowns, in close-knit communities in which everyone knew each other. Some families operated such businesses as laundries, restaurants, and grocery stores, and other families farmed. Children were expected to help with the work after school and during their summer break for
no pay. They began by doing small chores and progressed to bigger responsibilities as they grew older. If children got such jobs as picking fruit on farms, their earnings would go to their parents, who would fulfill their requests for money if they deemed the expense necessary. The traditional Chinese value of collective responsibility was still very strong. Everything was shared, as exemplified during mealtime, when food was served family-style. As in the past, most families lived on the premises of their businesses. Some families rented houses, but it was very difficult to find property owners outside Chinatowns who would rent to the Chinese. Whether behind a store or in a house, the living quarters tended to be crowded. A family with nine or ten children was not uncommon, and children usually lived with their families until they married. Boys and girls attended public schools, some of which were still segregated, although others were integrated or attended predominantly by Asian American and other minority children. After school and on Saturdays, the children attended Chinese school for two hours or more to learn the Chinese language (reading and writing), culture, and history. Most were required to do so by their parents, who wanted them to retain a sense of ethnic and cultural identity. Most children attended Chinese school during their elementary school years but ceased to do so when they reached high school. In public schools, boys often faced racial teasing and hostility. Some boys fought back physically, but most ignored the taunting, reassured by their parents that they were better than their ignorant and uncivilized tormentors. Chinese American boys were taught to be proud of their superior
heritage dating back thousands of years. After elementary school, some sons were sent back to China for more years of Chinese education. A few remained in China and were joined later by their parents, but most returned because they had become accustomed to the American lifestyle and culture. Although the United States was their home, Chinese American boys lived in two worlds. There were many examples of this situation. A boy usually had a Chinese name as well as an American name. Boys spoke Chinese to their parents at home and English to their friends outside the home and at school. They ate Chinese food off plates with forks. They celebrated Chinese and American holidays but loved Chinese holidays the most. Chinese New Year brought firecrackers and lucky money wrapped inside red envelopes called lai see, which adults gave to children to buy sweets and other treats. There were lion dances, acrobatics, parades, and large family banquets. Food has always been emphasized at Chinese holidays. The Dragon Boat Festival features large rice dumplings wrapped in bamboo leaves called jung. The Mid-Autumn Festival, a harvest festival similar to Thanksgiving, is known for its sweet moon cakes. In nearly all celebrations, firecrackers were exploded to scare away evil spirits and to ward off bad luck and illness, although boys did it because it was fun. While practicing traditional values, Chinese American boys also tried to fit into mainstream society, which they learned about in school and from the media. As youngsters, they read comic books and imagined themselves as cowboys and superheroes. They went swimming in the public pools (if they were allowed to) and to movies that cost 5 cents.
As teenagers, they formed their own social and sport clubs, which were sponsored by schools and churches, and participated in such activities as choir, Boy Scouts, bowling, baseball, football, basketball, and tennis. They joined the local Chinese drum and bugle corps and danced to the big band sounds of the 1940s, stopping at the local soda fountain afterward. Families did not encourage their sons to pursue education after high school. Before World War II, there were still limited opportunities for Chinese American college graduates to secure professional or clerical jobs outside their own communities because of lingering racial discrimination and the Depression. Parents advised their sons to find whatever jobs were available in their community or to prepare to take over the family business. Some parents sent their sons to college in the hope that their educated sons would return to China to help the country modernize. During World War II and afterward, the situation changed for the better. Because China was an ally of the United States against Japan, Americans began to treat Chinese Americans more favorably, for example, by repealing the exclusion acts and granting the right to naturalize. Soon after World War II, the numbers of Chinese American children increased greatly because more Chinese women were permitted to immigrate and more families were formed. Increasing numbers of Chinese families were also allowed to immigrate after the mid-1960s. Conflicts arose between immigrant parents and their second-generation children who were embracing American culture in order to gain acceptance.
A Chinese American teenage boy in California plays a game of Go, 1994. (Jim Sugar Photography/Corbis)
Immigrant parents were dismayed at their children's abandonment of Chinese culture and adoption of such American virtues as independence and progressive thinking, which were contrary to traditional conservatism. The greatest conflict occurred over making friends and dating. Teenagers wanted to make their own choices, but their parents wanted to control whom they saw. Second-generation children usually understood the Chinese spoken at home but were unable to speak it well, and it was even rarer for a third-generation Chinese American to understand or speak the Chinese language. The American-born generations also knew little about traditional Chinese culture, an unfortunate consequence of Americanization.
Chinese American families began moving out of the Chinatowns and older urban neighborhoods and into the suburbs, becoming part of middle-class America. Families also were becoming smaller, and consequently, a child was more likely to have his or her own bedroom. Adolescent boys watched television and the latest movies, identifying with their Chinese and Chinese American heroes. Martial arts films from Hong Kong and Taiwan were popular, giving boys a sense of pride. Teenagers cruised in automobiles, trying to impress their peers and girls by looking “cool.” With increasing opportunities, high school graduates planned to attend colleges and universities to prepare for careers in such professions as medicine, dentistry, pharmacy, optometry, accounting, architecture, and engineering. Parents urged their children to study hard in order to take advantage of opportunities that they did not have. Parents also reminded their children that their success reflected on their families. By and large, the children did not disappoint their parents. Since the 1960s, Chinese American youths have discovered their cultural heritage and their forebears’ historical place in the United States. Today, Chinese American boyhood is basically the same as that of any other ethnic group. Although all follow the flow of popular culture, each contributes a current of ethnic and cultural diversity to the mainstream. Like other ethnic groups, however, Chinese American youths still have their problems. New immigrants with few skills and lacking English-language proficiency still work in sweatshops and restaurants because of economic necessity. Boys are still left alone in Chinatown apartments without supervision because both parents work. Family members want
money to buy the things that others have. Boys have difficulty in school and in adjusting to American life and spend most of their time with Chinese-speaking friends. Because many American-born Chinese (ABC) do not speak Chinese, it is difficult for them to communicate with new immigrants and become friends. There are cultural differences, too, which lead to friction between the ABC and recent Chinese immigrants. Not part of mainstream society, some immigrant boys feel angry, frustrated, and alienated. Some form clubs and gangs for a sense of belonging in Chinatown neighborhoods. Gang members drop out of school and engage in fights, robbery, and even murder. But gang members also have a constant fear of being robbed, beaten, or killed themselves. With little hope of employment or education, they find themselves at the margins of mainstream American society. Alfred Yee See also Asian American Boys; Gangs References and further reading Hoobler, Dorothy, and Thomas Hoobler. 1994. The Chinese American Family Album. New York: Oxford University Press. Mark, Diane Mei Lin, and Ginger Chih. 1993. A Place Called Chinese America. Dubuque, IA: Kendall/Hunt Publishing. Nee, Victor G., and Brett de Bary Nee. 1986. Longtime Californ’: A Documentary Study of an American Chinatown. Stanford: Stanford University Press. Sung, Betty Lee. 1967. Mountain of Gold: The Story of the Chinese in America. New York: Macmillan.
Circumcision
Circumcision, the surgical removal of all or part of the foreskin or prepuce of the penis, is the most common surgical oper-
An Orthodox Jewish circumcision ceremony, 1995 (David Turnley/Corbis)
ation carried out in the United States. The procedure has been performed ritually for thousands of years. Stone Age cave paintings that depict circumcision have been found, showing that the use of the operation as part of religious or cultural ritual predates its more widely known ritual use in Judaism and Islam. Since 1850 medical, or nonritual, circumcision, the removal of the foreskin for health reasons, has become common practice in the United States: this form of circumcision will be discussed here. In the United States and Europe nonritual circumcision was first promoted as beneficial to health during the mid-nineteenth century, being seen as having advantages for the physical and mental well-being of boys and men. In the United States this view has remained
strong. Even though male circumcision is now the subject of debate and strong opposition to the practice exists, the majority of boys born in the United States are circumcised, with over 1 million circumcisions being performed annually (Circumcision Information and Resource Pages 2001). Ritual circumcision has been carried out on girls as well as boys in many cultures across the world. Ritual male circumcision continues to be part of different religions and cultures today, although the majority of boys across the world are not circumcised. In the Americas, ritual male circumcision was practiced by some cultures in pre-Columbian times. The physicians and surgeons of the United States did not practice or encourage nonritual circumcision, except in
specific cases of damage or injury to the penis, until after the Civil War. Nonritual circumcision first became popular in North America and western Europe in the last half of the nineteenth century, beginning in February 1870 when Lewis A. Sayre performed a circumcision in the United States on a five-year-old boy whose legs were paralyzed. The boy’s condition improved markedly, and Sayre deduced that the circumcision had relieved the paralysis by ending severe irritation of the penis. Further circumcisions, by Sayre and others, appeared to produce similar improvements in other boys, and the curative power of circumcision in a variety of disorders was soon being publicized by Sayre, who was already a surgeon of reputation and status (Gollaher 1994). Soon after Sayre began to promote circumcision as a curative operation, other doctors began to extol its preventive qualities. In the early 1880s Norman H. Chapman, a professor of nervous diseases at the University of Kansas City, wrote that circumcision should become a standard preventive operation (Gollaher 1994). Peter C. Remondino, a Californian physician and vice president of the California Medical Society, wrote in his 1891 text History of Circumcision that the foreskin was an “outlaw,” a piece of tissue that had held an important protective function in prehistoric times but was now actually harmful to health. The foreskin, Remondino wrote, had “outlived its usefulness” (Remondino 1891, 208). He emphasized that uncircumcised boys and men were in greater danger from masturbation, bedwetting, cancer, syphilis, nervous disorders, and epilepsy than their circumcised counterparts. The move toward preventive circumcision also led to a downward shift in the
age of those boys undergoing the operation. The supporters of medical circumcision suggested that to gain the most benefits from the operation, it needed to be carried out soon after birth, and by the early twentieth century infant circumcision was common. Families who arranged to have their baby boys circumcised showed that they were wealthy enough to afford the surgery, as well as fashionable and up-to-date. Their boys were being given a healthy start to life, the act of circumcision conferring upon them both physical and moral cleanliness. Circumcision’s popularity in the United States continued to grow throughout the interwar years and particularly during the two world wars themselves, when it was encouraged for its supposed hygienic benefits in preventing the spread of sexually transmitted disease. By the mid-1950s it was common among middle-class families and was supported by Benjamin Spock, the doyen of baby care experts, although he would later change his opinion. The medical community accepted the benefits of circumcision almost without question, and so too did the organizations that financed medical treatment. In the 1960s the practice became widespread among working-class families as well. The act of circumcising male children had become a standard practice in American families, and noncircumcision of baby boys came to be seen as an indication that their families were too poor or too unconcerned with their children’s welfare to have the procedure performed. Further benefits were claimed for circumcision in succeeding decades. Supporters continued to claim that the procedure could protect against cancer of the penis and also suggested that it offered protection against urinary tract infec-
tions and sexually transmitted diseases, including acquired immunodeficiency syndrome (AIDS). It was also claimed that women whose sexual partners were circumcised were less at risk from cervical cancer than women with uncircumcised partners. The rate of circumcision peaked around 1980. Although accurate figures are difficult to obtain, it is likely that at least 80 percent of boys born in the United States during the late 1970s and early 1980s underwent the operation (Zoske 1998). Opponents of nonritual circumcision could always be found, but in the late nineteenth and early twentieth centuries they had little influence. In 1949 the British physician Douglas Gairdner published a critical paper on the subject, and the United Kingdom’s new National Health Service refused to include routine medical circumcision on its list of available operations because it could not find sufficient evidence for its therapeutic value. Circumcision rates fell markedly in Britain, but the procedure remained common in the United States. In 1970 Captain Noel Preston, a U.S. Air Force doctor, published a paper that reviewed the evidence for circumcision and concluded that routine medical circumcision was unnecessary. It was not until 1971 that a major American medical organization openly questioned the practice of nonritual circumcision. In that year the American Academy of Pediatrics (AAP) established a task force on circumcision, which declared that there was no medical indication for the removal of the foreskin of a healthy boy, a view that the American College of Obstetricians and Gynecologists supported in 1983. The AAP reviewed its position in the late 1980s, following the publication of research studies that appeared to sup-
port the use of circumcision, and declared a more neutral stance. In a further revision in 1999, the AAP acknowledged that routine circumcision may have some medical benefits but that the evidence was not strong enough to recommend the procedure. According to the AAP, the decision to circumcise should be left to individual families. As the AAP became less publicly opposed to circumcision, a number of consumer groups began to form with the intention of raising the public’s awareness of and opposition to nonritual use of the procedure. The National Organization to Halt the Abuse and Routine Mutilation of Males (NOHARMM) and the National Organization of Circumcision Information Resource Centers (NOCIRC) are two examples. Issues of concern include the detrimental effects circumcision may have on boys’ and men’s mental health and the continued performance of circumcision without the use of anesthetic. A move to promote “uncircumcision,” the restoration of the foreskin, also developed, a seemingly new idea that is actually as old as circumcision itself. In the last two decades of the twentieth century, a new ethical dimension was brought to the debate. The vast majority of nonritual circumcisions are performed on young boys who have no understanding of what is being done to them or why it is being done and have no opportunity to refuse the operation. For many opponents, carrying out a procedure without proven medical benefits but with medical risks constitutes a form of child abuse and is ethically indefensible. Nonritual male circumcision remains popular in the United States, in stark contrast to other Western societies, although the incidence has fallen from its peak. It is estimated that about 1.1 mil-
lion circumcisions were performed in the United States in 1998 and that approximately 62 percent of boys are circumcised, usually in the first few months of life (Circumcision Information and Resource Pages 2001). Research findings are regularly published and cited in favor of and in opposition to the continuation of routine circumcision. Much of the research lacks validity and reliability, however, as it has done since the early years of medical circumcision, and incontrovertible evidence for or against nonritual circumcision remains elusive. Although circumcision’s popularity may have peaked, it is likely to remain one of the commonest experiences of American boyhood for many years to come. Bruce Lindsay References and further reading Bigelow, Jim. 1994. The Joy of Uncircumcising! Exploring Circumcision: History, Myths, Psychology, Restoration, Sexual Pleasure and Human Rights. 2d ed. Aptos, CA: Hourglass Books. Boyd, Billy Ray. 1998. Circumcision Exposed: Rethinking a Medical and Cultural Tradition. Freedom, CA: Crossing Press. Circumcision Information and Resource Pages. 2001. “United States Circumcision Incidence,” http://www.cirp.org/library/statistics/USA (accessed March 9, 2001). Denniston, George C. 1999. Male and Female Circumcision: Medical, Legal and Ethical Considerations in Pediatric Practice. Norwell, MA: Kluwer Academic. Gairdner, Douglas. 1949. “The Fate of the Foreskin.” British Medical Journal 2: 1433–1437. Gollaher, David. 1994. “From Ritual to Science: The Medical Transformation of Circumcision in America.” Journal of Social History 28, no. 1: 5–36. Kessler, Christina. 2000. No Condition Is Permanent. New York: Philomel Books.
Morris, Brian. 1999. In Favour of Circumcision. Sydney, Australia: University of New South Wales Press. Remondino, Peter C. 1891. History of Circumcision. Philadelphia: F. A. Davis. Ritter, Thomas J., and George C. Denniston. 1996. Say No to Circumcision: 40 Compelling Reasons. Aptos, CA: Hourglass Books. Whitfield, H. N., J. D. Frank, G. Williams, and J. A. Vale, eds. 1999. “Circumcision: BJU Supplement 1.” BJU International 83, no. 1 (January). Zoske, J. 1998. “Male Circumcision: A Gender Perspective.” Journal of Men’s Studies 6, no. 2: 189–208.
Civil Rights Movement
See African American Boys
Civil War
Few boys who grew up during the Civil War, whether white or black, northern or southern, escaped its influence. Significant differences existed, however, in the ways that this generation experienced the war. Northern youngsters, at least those who remained civilians, generally viewed the war from a distance, whereas southern children were much more likely to have direct contact with the horrors of combat. Boys in the North celebrated the Union Army’s eventual triumph, but white boys in the former Confederacy tasted the bitterness of loss and defeat. African American boys experienced both the joys of liberation and the disillusionment of the post-emancipation years. Of all these groups, young slaves felt the war’s impact most dramatically. African American parents in bondage did not have custody rights to their children. Black mothers and fathers stood by helplessly as masters separated
their families at will, put their children to work, and whipped them. The war, especially the recruitment of African American soldiers after 1863, began to change this situation. Joining the Union Army transformed black men into liberators and defenders of their families, although it also created stresses for those they left behind. The newly acquired stature of slave fathers following their enlistment made a lasting impression on their sons, who looked to them as role models. Although violence was an everyday affair under slavery, when the war broke out, African American youngsters encountered an unprecedented level of death, injury, and physical destruction. After one battle, James Goings told an interviewer many years later, “de dead wuz laying all long de road an’ dey stayed dere, too.” Goings, who was six years old at the start of the war, observed, “In dem days it wuzn’t nuthin’ to fin’ a dead man in de woods” (Bardaglio 1992, 223). One of the most dramatic moments for young slaves involved the arrival of the federal troops on the plantation or farm where they lived and worked. Anticipating the northern advance, white southerners tried to frighten the children with stories about the atrocities that Union soldiers would commit. Although not the monsters portrayed by southern whites, the Yankees engaged in looting and destruction, and for more than a few of the younger slave boys it was an upsetting experience, especially when they went hungry in the wake of the soldiers’ pillaging. Given the stories of the slaveholders and the abuse that they actually suffered at the hands of the white Union troops, it is not surprising that many black children feared their presence or at least experienced ambivalent feelings about them.
Many boys too young to enlist as soldiers in the Civil War enlisted as drummer boys. (Library of Congress)
But with the northern army also came the hope for liberty. Thousands of African Americans took advantage of the Yankees’ advance to escape from their masters. Those who ended up behind Union lines became “contraband of war,” living in crowded camps under primitive conditions, suspended in a kind of twilight zone between freedom and slavery. Nonetheless, many slave children in these camps had an opportunity to get some schooling from northern missionaries and teachers. On the Sea Islands, black refugee children worked in the cotton fields during the morning and attended class in the afternoon. By the time the war ended, more than 1,400 instructors taught in 975 schools throughout the occupied South.
With emancipation, African American fathers and mothers throughout the South sought to reunite their families scattered by the slave trade and the war. Boys who had grown up in slavery experienced for the first time the security of living in a family that could not be broken up by whites at a moment’s notice. Although the clash of armies eventually brought freedom to slave youth, white children in the Confederacy endured little else but chaos, deprivation, and destruction. The war disrupted every aspect of young whites’ lives in the South. Planters, fleeing from the invading army, took their families and slaves into the interior and beyond. Boys took on increased workloads and heightened responsibilities to help their families deal with the challenges of war, including the loss of their homes. As with young slaves, the war was a formative event in the lives of southern white boys, profoundly shaping their understanding of what it meant to be a man. Letters from their fathers in the Confederate Army offered advice and guidance, and the examples of these soldiers provided valuable lessons to the younger generation about sacrifice and dedication. Years later the ideal of the brave Confederate soldier fighting against all odds continued to influence the beliefs and attitudes of white southern males who had grown up during the Civil War. The ritual of men going off to war fascinated boys in both the South and North. Their intense interest in the progress of the conflict led to a discernible shift in children’s magazines from promoting religious concerns to providing accounts of battles, brief biographies of military and political leaders, and stories about life in the army. Toys and a variety of public performances, including plays, concerts, and
“magic lantern” shows, sought to encourage patriotism and tap into the growing martial spirit. Although traditional games such as marbles and horseshoes remained favorites, the play activities of young whites and blacks also mirrored the coming of war. Forming their own mock militia units and playing army, boys throughout the country mimicked their elders, learning the rudiments of drill and holding parades. Sometimes play spilled over into politics. In Baltimore, after military authorities prohibited the display of Confederate flags and decorations, several dozen youngsters observed the Fourth of July by marching into a Union encampment bearing a crepe-paper Confederate flag and wearing red shirts and caps. Many boys in the North and South refused to wait until they were old enough to become real soldiers. Each of the two armies at the outset of the war had recruitment policies that prohibited boys under eighteen from joining and fighting. A tall fourteen- or fifteen-year-old, however, could easily sneak by the recruiting sergeant in the rush to form a unit. It is difficult to know precisely how many underage boys joined the Union and Confederate Armies. Some historians claim that between 10 and 20 percent of all soldiers in the North and South were under eighteen when they signed up, but others suggest that 5 percent is a more accurate number (Marten 1998, 244, n. 6). Loyalty to the Union drove more boys from the North to sign up with the federal army than did a commitment to abolish slavery. Young white southerners entered the Confederate Army out of a desire to defend their homes and families. But for most boys, Confederate or Yankee, the main lure was the opportunity to take part in what looked like an exciting escapade. Even if obviously too
young to fight as a soldier, a boy could enter the ranks of the army as a musician, especially as a drummer or bugler. These were considered to be noncombatant positions, so recruiters often allowed a boy to join without inquiring too closely about his age. As with the older soldiers, the initial enthusiasm of the young recruits soon gave way to the harsh reality of serving in wartime. An awareness of what lay ahead sometimes dawned as soon as a boy put on his uniform and discovered how poorly it fit him. Besides getting used to military discipline, boys in the army often had to deal with harassment and abuse from other soldiers. Like most Americans at the outset of the Civil War, boys who enlisted expected to fight the enemy and settle the dispute quickly. Of course, they were sadly mistaken. They had little choice but to cope with the death and destruction around them and try to stay alive. Private Elisha Stockwell of the Fourteenth Wisconsin Volunteers found himself facedown on the ground at the Battle of Shiloh, with shells exploding overhead and soldiers screaming for help: “As we lay there and the shells were flying over us, my thoughts went back to my home, and I thought what a foolish boy I was to run away to get into such a mess as I was in” (Werner 1998, 23). No longer the naive youths who had signed up to take part in a great crusade, boys in the Union and Confederate Armies found themselves immersed in a senseless carnage that no civilian, young or old, could possibly imagine. Because most of the Civil War was fought on southern soil, few northern boys besides those who entered the Union Army came into direct contact with the conflict. But like their southern counterparts, boys in the North had fam-
ily members who went off to war and wrote home about their lives in the military. Correspondence with their fathers and older brothers allowed northern boys to experience vicariously the faraway battles and helped them make sense out of the changes brought about by the conflict and the departure of loved ones. Children on the home front were not just spectators; they participated in efforts to maintain the morale of the troops and to bring about victory. One of the most common activities was producing lint to pack into wounds, a task some boys complained was more suitable for girls. Youngsters also helped raise money to buy flags and to assist hospitals, soldiers’ homes, and other war-related causes. Northern boys, for example, took part in activities that generated thousands of dollars for the U.S. Sanitary Commission. Another popular pursuit among youth in the North was the production of amateur newspapers for friends, relatives, and neighbors. These publications provided a mix of news and commentary that reflected the involvement of children in Civil War politics. “Don’t give up the ship, boys!” exclaimed the young editors of the Concord, Massachusetts, Observer in 1862. “Stand by her to the last hour. . . . War must become the daily vocation of us all,” lest “we . . . be conquered and forever kept beneath the foot of Slavery and Oppression” (Marten 1998, 157). Not all children in the North escaped the terror and excitement of seeing the war close up. When General Robert E. Lee and his Confederate forces entered Pennsylvania in the summer of 1863, the residents of Gettysburg, young and old, got swept up in the fighting. Charles McCurdy, who was ten years old when the battle erupted that July, vividly recalled later in his life the sounds and sights of
battle, particularly the heap of amputated arms and legs that lay in the churchyard, where a field hospital had been set up. During the battle, parents and children retreated to their cellars, subsisting largely on biscuits and berries. All the public buildings were overflowing with the wounded, and many families took care of soldiers in their homes. Although adults tried to spare youngsters the grimmest sights, it was impossible to shield them from the carnage. Perhaps the boys most traumatized by the war were those who lost one or both parents. More American children became orphans during the conflict than at any other time during the nineteenth century. Private philanthropy as well as state and local funding led to the establishment of orphanages in the North and the South to care for the thousands of boys and girls left without homes. Limited resources in the former Confederate states made it almost impossible to meet the demand for adequate care of orphans, and segregated facilities left black orphans especially vulnerable. African American families took in homeless children as a matter of course, but state legislatures hurried in the aftermath of Confederate surrender to pass new apprenticeship laws that allowed planters to exploit this valuable source of labor. Courts considered children without fathers to be orphans, and the new apprenticeship procedures gave freedwomen little say in determining the fate of their children. Even young blacks whose parents were both alive could become indentured if the parent could not support them adequately or the courts considered them unfit. As a result, hundreds of African American boys in the South found themselves trapped in apprenticeships that closely resembled slavery. Denied the
promise of emancipation, these children experienced one of the most tragic and ironic consequences of the Civil War. Peter W. Bardaglio References and further reading Bardaglio, Peter W. 1992. “The Children of Jubilee: African American Childhood in Wartime.” Pp. 213–229 in Divided Houses: Gender and the Civil War. Edited by Catherine Clinton and Nina Silber. New York: Oxford University Press. Clinton, Catherine. 1998. Civil War Stories. Athens: University of Georgia Press. Daniels, Elizabeth. 1989. “The Children of Gettysburg.” American Heritage 40 (May–June): 97–107. King, Wilma. 1995. Stolen Childhood: Slave Youth in Nineteenth-Century America. Bloomington: Indiana University Press. Marten, James. 1998. The Children’s Civil War. Chapel Hill: University of North Carolina Press. ———. 1999. Lessons of War: The Civil War in Children’s Magazines. Wilmington, DE: SR Books. Murphy, Jim. 1990. The Boys’ War: Confederate and Union Soldiers Talk about the Civil War. New York: Clarion Press. Werner, Emmy E. 1998. Reluctant Witnesses: Children’s Voices from the Civil War. Boulder, CO: Westview Press.
Clothing
The history of American boys’ clothing reflects all the subtle complexities that characterize the definition of masculinity and the role of men in American society. In less than 400 years, boys’ clothing has included sexless infant gowns, miniature versions of women’s clothing, and fanciful costume styles, as well as the unmistakably masculine styling of the late twentieth century. Boys of the colonial era wore three styles of dress as they grew from babies
to men: the genderless clothing of infancy, the simplified women’s styles worn by young children, and the fashionable dress of manhood. Babies up to four months of age were wrapped in layers of cloth that immobilized their arms and legs. Once out of these swaddling clothes, babies wore long wool or silk dresses that were based on women’s fashions over long linen or cotton “shifts.” The outer dresses were very colorful—dark red, yellow, blue, and other fashionable colors—while the underclothing was generally white. As the baby boy became a toddler, he exchanged his plain dresses for clothing that was even more similar to adult women’s clothing. A “pudding” (a padded hat that tied under the child’s chin) protected him from bumps on the head as he learned to walk and run. Toddlers’ clothing also featured leading strings, long strips of fabric that hung from the shoulders of their dresses. Leading strings were used to restrain the child’s movements, for example, to tie him to a table leg to keep him away from the fire in the hearth. Other than the pudding and leading strings, clothing for boys from one to six years of age was quite similar to women’s fashionable styles. Some boys even began to wear stays (corsets) when they were about four years old. Stays helped mold the growing boy’s body into the ideal posture and were part of a larger process of training him in adult ways. By the time a boy was about seven, he was well on the way to becoming an adult and began to dress like one. At a time when boys left home for work, boarding school, or college in their early teens, no one looked askance at an eight-year-old boy who was a precise replica of his father, down to his satin breeches and buckled shoes.
Young boys in the eighteenth century dressed in breeches and frock coats. Charles Willson Peale, The Staircase Group, 1795. (Francis G. Mayer/Corbis)
By the end of the eighteenth century, the educational philosophies of John Locke and Jean-Jacques Rousseau gained acceptance to the point of transforming children’s clothing. For the first time,
dress for infants and children was designed for comfort and freedom. Swaddling fell from favor, and the use of stays for children drew a great deal of criticism. In general, boys’ clothing became simpler and less restrictive. Simple white cotton muslin or gauze frocks that hung from the shoulders, bound only by a simple ribbon at the waist, were the ideal garments for babies, toddlers, and small children. Little boys continued to wear dresses until they were four or five years old but no longer went directly from dresses to men’s clothing. New styles of dress emerged that were specifically intended for boys. The most prominent feature of these new styles was the adoption of full-length trousers. Once the mark of the working class, pantaloons rapidly replaced the more “gentlemanly” knee breeches beginning in the 1760s. There is still no satisfactory explanation for this transition in which the sons of the wealthy and fashionable began to wear clothing once scornfully associated with the lower class. Perhaps it was the male counterpart of the fashion for rustic dress for females, which included aprons and straw bonnets. Whatever the motivation, it is certain that this particular innovation was adopted first for children’s clothing, while adults followed the older styles. The boys’ garment that most clearly evokes the turn of the nineteenth century is the skeleton suit, a garment with long trousers that resembled a jumpsuit. Probably introduced between 1760 and 1770, the skeleton suit was usually made in two pieces, a shirt and trousers of the same fabric, which were buttoned together at the waistline. For most of the nineteenth century, boys continued to wear skirts until the age of five or six. To nineteenth-century
parents, a newborn boy was male but not masculine; it was best to let a boy’s masculine nature emerge gradually and let his clothing reflect those changes, not foreshadow them. At the same time, clothing for school-age boys (seven to about fourteen years) reflected the increasing simplicity of men’s business dress. From 1820 through the 1840s, dresses for little boys were very much miniature versions of women’s clothing, even to the point of featuring light boning or being worn with corsets. In the 1820s, children’s dresses became much shorter, revealing the white cotton drawers, or pantalettes, worn underneath. It is very hard to distinguish between little boys and girls in illustrations or portraits dated before 1850. Most little children in portraits or fashion plates had hair that was about chin-length and very simply parted and combed. The cut of their clothing was virtually identical, as were the fabric and trimmings used. Late in the 1840s a few styles specifically for boys made their appearance. The primary influence behind this trend was Queen Victoria of England, much admired in the United States as a model wife and mother. In 1846, she commissioned a portrait of the Prince of Wales dressed in a child-sized replica of the uniform of a sailor in the Royal Navy. The sailor suit became enormously popular for big and little boys alike. Descriptions of the Highland dress worn by the royal family inspired plaid kilt suits for little boys as well as tartan dresses for toddler boys. In the 1840s, dresses gave way slightly in popularity to tunics and coatdresses worn over skirts, particularly for little boys of about four to six years. Other military styles were favored in addition to sailor suits; the most common during the
The sailor suit became a popular dress for boys as early as the 1840s. (Library of Congress)
Civil War was the Zouave suit, which featured a short, unfitted, collarless jacket usually trimmed with braid. The introduction of the sailor suit and Highland costume accompanied a trend that divided clothing for little boys into styles for two age groups, today called toddlers and preschoolers. Boys from one to three years of age wore dresses and skirts very similar to those worn by girls. From three to about six years, they wore kilts or wide short trousers either gathered at the knee or left ungathered. The gathered style, called knickerbockers, first appeared in the 1860s, paired with a variety of jackets and tunics. The 1870s and 1880s marked the high point of elaborate dress for boys under
the age of eight. Closely reflecting trends in women’s fashions, a single outfit might be made up in two or three fabrics and trimmed in as many different trims. Little boys appeared as sailors, cavaliers, and highlanders and in numerous other guises. Gender distinctions became more blurred in the late 1870s and 1880s because of a vogue for short “garçon” haircuts for girls and very long, curly locks for boys. The peak of the fancy dress trend for boys was the late 1880s, with the Lord Fauntleroy craze. Little Lord Fauntleroy by Frances Hodgson Burnett was one of the most successful books of the century, and the play was a hit on both sides of the Atlantic. Little Lord Fauntleroy’s velvet
cavalier suit inspired millions of imitations, available ready-made in stores or by mail. When a boy reached school age—anywhere between five and seven—he was old enough to wear styles that were closer adaptations of men’s clothing, without the ruffles, laces, and ribbons that typified little boys’ fashions. During the latter half of the nineteenth century, when long hair for little boys was in vogue, this transition in clothing accompanied a boy’s first short haircut, so the transformation was particularly dramatic. Big boys’ fashions were supposed to be simple, sturdy, practical, and handsome, not pretty. Sturdiness required heavier fabrics, leaving most light colors and delicate materials for girls and younger boys. Between approximately 1820 and 1870, a variety of styles were acceptable for older boys, including short Eton-style jackets, flared Zouave jackets (introduced in the 1850s), and even frock coats with flared skirts, just like the ones worn by grown men. Shirts with flat or slightly raised collars were considered more appropriate for boys than a man’s stock and cravat, though some portraits do show boys as young as nine or ten in adult-style neckwear. Long trousers, knickerbockers, and full-cut, knee-length garibaldis were all worn, though by around 1850, long trousers were being reserved more and more for boys aged eight and over. By 1870, full-length trousers were no longer common for boys not yet in their teens. The transition from short to long pants was becoming institutionalized as a rite of passage into manhood. Jackets and coats for boys became plainer and more tailored, echoing similar changes in men’s clothing. Sailor suits were popular for boys as old as ten or twelve and were
worn all year-round. Winter sailor outfits were made of dark-colored wool, and summer ones featured light wools, linen, or, more rarely, cotton. They were considered too dressy for school, where corduroy or wool knickers were more appropriate, and too playful for church and other more formal occasions, which required a plain serge or worsted knickers suit with a man’s tailored jacket. But for shopping, visiting, family outings, and children’s parties, the sailor suit was extremely popular. Other naval-inspired styles made their appearance in the 1870s, such as the reefer jacket and its shorter cousin, the pea coat. At the end of the nineteenth century, dresses for little boys lost most of the upholstered look found in the clothing of the late 1870s and 1880s. Styles with deep yokes were favored, the dress falling in gathers or pleats from the yoke to the hem. The yokes could be decorated with lace, embroidery, tucks, or smocking, with the same trim sometimes being used at the wrist as well. Dresses for toddler boys (one or two years) usually had no discernible waistline; slightly older boys wore dresses with belts or sashes at about hip level. For very little children, there was hardly any distinction made between boys’ and girls’ dresses. Pink, blue, gray, and tan were given as suitable colors for either sex, and the main differences seem to have been that extremely elaborately trimmed dresses were just for girls, though plain styles could be worn by either. Boys and girls had similar hairstyles, a simple “Buster Brown” bob with bangs, inspired by the popular cartoon character. Rompers and overalls were introduced around 1890 and became overwhelmingly popular for preschoolers’ play clothes.
Clothing for boys from four to fourteen years old continued its rapid evolution away from earlier patterns. Creepers and rompers offered parents an alternative to dresses for very little boys, and by 1910 boys over the age of three were often put into short trousers rather than kilts. Sailor suits were the dominant dressy style for boys from ages five to six until ages nine to ten. Older boys not yet in their teens could wear Norfolk jackets or sack coats with knee trousers. Certainly the costume styles of the 1880s had fallen from favor. Fauntleroy suits and Highland kilt styles still appeared occasionally as late as the 1920s but were rarely worn, except for formal portraits and very dressy occasions. For school-age boys, sailor suits, Norfolk jackets, and reefer coats were extremely popular, as were many other styles based on men’s dress. Military influence was strong throughout the period, whether in the form of English middy blouses or hiking clothes that resembled army uniforms. For dressy occasions, Eton jackets or plain sack coats were usually recommended. College and professional athletes also influenced boys’ clothing. Golf- and tennis-style sweaters, baseball caps, and canvas shoes were among the more permanent adoptions. The trend toward earlier differentiation between boys and girls continued from 1919 to 1945, with most of the attention focused on new styles for toddler boys. Although in 1919 a two- or three-year-old boy might have a few dresses in his wardrobe, by the end of World War II, only the youngest male infants would wear dresses instead of creepers. Age distinctions in boys’ clothing were still observed during the 1920s, but declined in importance during the 1930s and early 1940s. Throughout the late 1920s
and early 1930s, boys wore long trousers more and more often and at earlier ages. The tradition of keeping a boy in short pants or knickers quickly grew obsolete by 1945, when most suits with short pants were intended for boys no older than six or seven. Men who were born in the 1920s may still have vivid memories of their first knickerbockers and their first real suit with trousers. Men born in the 1940s had no such rite of passage. By the 1930s, there was much less distinction between clothing for boys still in infancy and boys nearly ready to start school. One-piece rompers (or two-piece playsuits that buttoned together at the waist) were worn by boys as young as six months (even younger, in some families) and continued to be worn until the boy was five or six. Toddlers and schoolboys alike wore knitted shirts, either long- or short-sleeved, over short trousers. Sleeveless full-length bib overalls, with or without a shirt, were extremely common for playwear for boys of all ages. Having finally been freed from the influence of women’s fashions, boys’ clothing from this period drew mainly on three sources: men’s everyday clothing, military dress, and sports clothing. The effect of men’s fashions was most strongly seen in clothing for older boys, whose dress suits, shirts, and ties became more like those worn by their fathers. Military clothing had long been an important source for style innovations in boys’ clothing; only the specific heroes and branches of the military changed. Just as the Zouave suit gave way to the sailor suit, after the First World War, boys forsook sailors for aviators. Leather jackets and aviator caps were treasured possessions, especially after Charles Lindbergh’s transatlantic flight in 1927. Sports, whether collegiate, Olympic, or
professional, contributed an enormous range of styles from argyle golf stockings in the 1920s to ski pants and parkas in the 1930s. Sweatshirts and sneakers, already staples of the American high school boy’s wardrobe, became popular for younger boys during this era, as did baseball caps. School-age children’s clothing of the first half of the postwar period had not changed much in some respects since the late 1930s. A typical boy’s school “uniform” was a pair of corduroy pants and a striped knit shirt. Military styling continued long past V-J Day; Eisenhower jackets and Coast Guard slickers were among the favorites. Western styling was a very strong influence, with cowboy shirts, jeans, and denim jackets worn often. Popular culture influences multiplied; cartoon characters adorned slippers, pajamas, T-shirts, and many other garments. For dressy events, boys’ and men’s suit styling became almost identical, right down to the necktie. During the social and cultural upheaval of the 1960s, the differences between younger boys and teenagers became quite dramatic. For boys not yet in their teens, fashions mimicked adult men’s styles, though sometimes in a wider range of colors and patterns. The classic styles worn by the many Kennedy children also won widespread acceptance. Clothing for teenagers became increasingly troublesome for parents and educators. For many adults, the confusing smorgasbord of youth fashion offered few acceptable choices. Popular styles included mod British styles, West Coast surfing styles, and in the late 1960s the exotic hippie style, which mixed ethnic clothing with thrift and surplus-store chic (especially navy-issue bell-bottomed jeans). All these styles featured longer
hairstyles than were acceptable for older boys. (Long bangs, like those worn by John Kennedy Jr., were quite popular for toddlers and preschoolers.) Attempts to dictate short haircuts were failures; shifting standards made enforcement difficult. Boys’ hairstyles that seemed long in 1966 were laughably short by 1970, when even some male principals had long hair and sideburns. By the early 1970s, schools had largely abandoned dress codes as antiquated and unenforceable. A related issue, in the late 1960s and the 1970s, was gender stereotyping. The arguments over long hair for boys had heightened people’s awareness of how costume reflects the roles they played. In the early 1970s, the idea of nonsexist childrearing encouraged many parents to dress their children more androgynously. Boys wore brightly colored, patterned clothing and often had long hair for the first time since Lord Fauntleroy came on the scene. Girls wore overalls and knitted shirts. Blue jeans were worn by both sexes (and all ages). By the mid-1970s, boys’ fashions were settling down again, returning to traditional norms of masculine appearance, with shorter hair and less experimentation with color and design. Since then, the clothing available to boys has included a wide range of styles, from preppy to Goth to hip-hop, with teenage boys usually enjoying a much greater range of choice than their younger brothers. Jo B. Paoletti References and further reading Calvert, Karin. 1992. Children in the House: The Material Culture of Early Childhood, 1600–1900. Boston: Northeastern University Press. Ewing, Elizabeth. 1977. History of Children’s Costume. New York: Charles Scribner’s Sons.
Worrell, Estelle Ansley. 1980. Children’s Costume in America 1607–1910. New York: Charles Scribner’s Sons.
Clubs
In the seventeenth and eighteenth centuries, the work responsibilities of boys left neither the time nor the need for more than a few clubs formed by or for them. However, in the nineteenth century, as apprenticeship of boys declined and new childrearing techniques emerged, boys and young men began to form voluntary organizations. In the late nineteenth century, the modern boys’ club appeared as concerned middle-class adult men and women formed organizations to socialize both working-class and middle-class boys. Informal education in youth groups came into its own in the early twentieth century. Then, as now, the organizations available to boys and the activities that would attract them varied greatly according to the area where they lived, their racial and ethnic backgrounds, their religious affiliation, and their socioeconomic status. As they have in the past, boys’ organizations vary from small autonomous groups to highly structured international associations controlled primarily by adults. Organizations have served the dual role of transmitting traditional values from one generation to the next and of socializing the young to new models of manhood required by a rapidly changing society. Prior to the nineteenth century, children older than six or eight years of age generally participated in production and other forms of association with adults. Craft apprenticeship is one of the best-understood arenas of boyhood in colonial America. Indentures of apprenticeship stipulated mutual obligations,
and these responsibilities extended far beyond compensation for labor, length of service, or even the training of the boy in the “mysteries of the craft.” Apprentices were not merely young “employees” or “trainees” of master craftspeople, but were taken in as members of the master’s household. Masters had the authority and responsibility to compel acceptable communal conduct, upon threat of corporal punishment, and such conduct often included various public displays of deference, church attendance, and the study of catechism. As part of a much broader set of master-servant relationships, apprenticeship subordinated boys to men in ways that linked the moral and economic life of a productive household and defined manhood less in terms of chronological age than in terms of the ownership of tools and land. Nonetheless, some organized activity among and for boys existed during the colonial period, but such groups were local and often included a wider range of ages in their memberships than would be typical today. These early groups form one of the roots from which contemporary boys’ organizations grew. The scant surviving records of very early groups suggest that most were religious in nature. Young men (and also young women, but usually separately) met regularly for such activities as prayer and “mutual improvement.” Some youth societies also provided assistance to the less fortunate (but worthy) members of their communities. No doubt members also managed to have a bit of fun from time to time. Although they may have been associated with Protestant congregations—particularly in New England—these early groups appear to have been organized and controlled by the young people themselves. In New Orleans, however, Jesuit priests
and Ursuline sisters organized young students in sodalities—groups that promoted piety, charity, and devotion to the church, establishing a model of adult guidance that would be followed as Catholic parishes were established elsewhere. During the nineteenth century, the rise of wage labor relations and the consolidation of the ownership of productive property combined to dismantle the law of master-servant and separate men’s work from household production. The effects of the change on childhood and on the American understanding of individual development have been truly revolutionary. For example, in preindustrial America, the term competency referred to an economically sufficient household. Through the economic transformation of the nineteenth century, it came to mean an individual in a state of fully mature and developed mental health. As part of the emerging individualism, childrearing literature during the 1830s and 1840s, such as Lydia Maria Child’s widely read The Mother’s Book (1831) and the pamphlets of the American Sunday School Union (established in 1824), proclaimed that it was better to teach a boy to “master himself” than “to master the boy.” According to these liberal tracts, boys should be separated from involvement in the “real” world of men in order to learn the self-control needed for adult life. This gender upheaval in the household divorced childrearing from the men’s economy and opened a window for “boys’ work” by middle-class professionals and well-educated women. Thus, new academies, literary societies, temperance groups, and religious associations of young people spread throughout the country during the antebellum era. Their members included both males and females between the ages
of ten and thirty. The Second Great Awakening, the period of religious revivals that began in the 1790s and continued through the 1830s, gave rise to a profusion of adult reformist societies, including those dedicated to peace, home and foreign missions, benevolence, the abolition of slavery, and temperance. Children and adolescents were encouraged to participate in all these movements alongside their elders, but they also began to form juvenile branches. By the late 1820s, a growing number of children’s periodicals such as Juvenile Miscellany (1826–1834) and Youth’s Companion (1827–1929) provided additional impetus by encouraging young readers to form their own societies for a broader variety of purposes. Proud of the new nation’s federal constitution, writers for juveniles introduced another enduring theme in youth work, “civic education,” by inspiring children to create constitutions for their organizations as well. Although most of the associations formed during the Second Great Awakening were dedicated to liberal notions of self-improvement, others inverted the customary subordination of boys to men in the tradition of rituals like Halloween festivals, in which children demand treats from adults. Antebellum youth associations were quite diverse, but they shared two qualities that made them unlike later boys’ clubs. They disregarded narrow age differences, and they were not “staffed” by professionals or volunteers who saw themselves as social workers. Certainly, character was being developed through association with others, but it was not a specialized function clearly distinguished from other social and productive relations. Another ideological root of twentieth-century boys’ organizations has been
found in the antebellum Protestant Sabbath schools that began to appear in the 1790s. The developing industrial economy—particularly textile manufacturing—provided employment for children and adolescents as well as for adults. Most youngsters worked six long days a week, with the Sabbath the only day for leisure and rest. The view that “idle hands are the devil’s playthings,” however, was a popular one. Schools were established on Sundays with the mixed motives of teaching literacy and numeracy skills to working children as well as “keeping them off the streets” and out of mischief. These aims—teaching skills and accomplishing social control—can be found in various guises and proportions in most of the clubs and organizations that adults have created for boys ever since. Although originally developed for poor children, Sunday schools quickly spread to congregations of the “middling classes.” Linked with this development was the fight against “King Alcohol.” In 1836, children’s temperance societies united under the banner of the Cold Water Army (CWA) and its sponsor, the American Temperance Union. Although both boys and girls belonged, they tended to meet in groups formed along gender lines. The temperance societies brought adult leaders into prominence in the field of youth work—and they never left. Simultaneous with the growth of the Sunday schools and temperance societies was an expansion of denominational religious education activity that also employed adult-sponsored youth groups as a means for delivering adult agendas. Until the great decline in child labor from 1900 to 1940 created the possibility of a childhood protected from adult production for all classes, clubs designed for middle-class boys were more successful
than clubs meant for working-class boys. For example, the Young Men’s Christian Association (YMCA) was founded in 1844 by Londoner George Williams to provide wholesome association among young men in the commercial trades; the first American YMCA appeared in Boston in 1851. The YMCA shifted slowly toward boys’ work between 1880 and 1900. During these years, its efforts centered on athletic leagues that appealed mostly to middle-class teenagers. James Naismith invented basketball at a YMCA training school in Springfield, Massachusetts, in 1891, and the game spread through local YMCAs and then to the schools of the nation. In fact, the success of the YMCA encouraged high school authorities to begin administering formal extracurricular activities such as clubs, societies, and teams to influence the social development of their overwhelmingly middleclass students. The YMCA formally staked out the basic parameters of character-building youth work that would follow and took the lead in developing the philosophy and strategies of boys’ work. In the meantime, other forces were at work to create a much broader market for youth programs. As the Victorian era drew to a close, multiple upheavals transformed American life. Changes in workplace routines, employer liability law, household economic practices, and the enforcement of compulsory education combined to greatly reduce children’s and youths’ employment. Although children and teens remained an important part of the labor force well into the twentieth century, young people were expected to spend less of their childhood and adolescence in the workplace and more days and years in the classroom preparing for future careers. Middle-class adults saw in the rise
of the new urban schoolboy both the potential problems of gang membership and street temptations and the possibility for properly programming them. Within urban families, the spheres of men’s and women’s work were becoming more sharply divided. Increasingly, men left their homes for work in offices and factories, delegating to their wives the management of households and children. City boys had fewer opportunities to learn traditional “manly” skills firsthand. Unmarried women were replacing men in the teaching profession, and women often predominated in many Sunday schools—all of which led to rising concern about the feminization of boyhood and the potential for making “mollycoddles” instead of men. The new fields of evolutionary biology and the behavioral sciences also played a part. The late nineteenth century saw the popularization of “genetic psychology”— a theory that held that individuals pass through the same stages of development as did the human race. The presumed developmental progress from savagery in boyhood through chivalry in adolescence to the modern man about town helped provide a metaphorical base upon which “boys’ work” would form. A final but often overlooked ingredient is the role played by newspapers and periodicals in the spread of the youth organization idea. During and following the Civil War, the number of daily papers grew rapidly, and competition among them was fierce. The period of “yellow journalism” produced such extremes of sensationalism that the papers were often deemed unsuitable for women and children. The 1890s saw the beginnings of reform. Along with comics, both locally produced and syndicated family and children’s features began to appear—par-
in Sunday editions. In many city papers, the children’s features developed into actual clubs for young readers. Advances in printing technology also made magazines cheaper to produce, and numerous periodicals were aimed at specific audiences of men and women in rural or urban areas, as well as at their children. These ready markets became more accessible as the nation’s mail and rail networks grew. A surprising number of newspapers and magazines featured their own clubs for young people (e.g., the Minneapolis Journal’s “Journal Juniors,” the St. Nicholas League, and the Farm Journal’s Liberty Bell Bird Club, to name but a few). The stage was now set for an explosion of organizations for children and youth. Increasing numbers of young people had more leisure time to devote to their own interests. Adults were increasingly nervous about the forms these interests might take and were also concerned about the development of character and backbone in their sons and preparing them for mobility (preferably upward) in the job market. New ideas of psychological development saw early adolescence as a “gang age” when youth were particularly amenable to joining groups, and information and transportation technologies had helped create a growing and accessible youth market for such groups. All that was left was the formation of organizations that would be able to attract and sustain the attention of youth as well as articulate and transmit the agendas of their adult sponsors—no small order. Earlier organizations had attempted to force the attention of boys on things that adults felt should interest them. They had been fairly direct in the delivery of their messages, and most failed as boys “voted with their feet.” A number of popular
metaphors began to emerge as the vehicles for engaging boys’ interest, among them the “noble savage” as represented by Native Americans; the rugged pioneer, brave soldier, and strong athlete (or the “muscular Christian” or “muscular Jew” depending upon the “Y” to which one belonged); the chivalrous knight; and the dependable fraternal brother. These metaphors were reflected in the names of groups popular at the turn of the century: the Woodcraft Indians (1902), the Sons of Daniel Boone (1900), the United Boys’ Brigades of America (1894), the League of American Wheelmen (1880), the Knights of King Arthur (1895), and the Order of the American Boy (1899). Many of these, as well as other boys’ organizations, were later absorbed by the highly successful Boy Scouts of America (BSA), brought to the United States from England in 1910. There were so many boys’ organizations that the youth work field was called, for a time, “boyology”! Boy Scouting learned much from the successes and failures of earlier groups. Originally, scouts chose their own leaders and governed all aspects of troop operation. Members could participate in popular group activities such as hiking and camping, learning first aid, and performing community service, yet pursue individual interests through merit badge work and rank advancement activity. The BSA was formed during a remarkable period that saw the creation in Boston of the Federated Boys Clubs in 1906 (which by 1931 had grown into the Boys Clubs of America), as well as the establishment of the Big Brothers of America (1903–1910), the Junior Red Cross (1917), 4-H clubs (begun in 1898 and greatly extended with the Smith-Lever Act of 1914), Junior Achievement (1919), the American Standard Program of the
YMCA (1917), and several major girls’ organizations. There is no doubt that the urgency of World War I and the Americanization movement that followed provided impetus for the rapid growth of these youth groups, all of which continue in one form or another today. Some historians have drawn a tight distinction between the character-building efforts of clubs for middle-class boys and the crime prevention efforts of clubs designed for working-class boys. This difference still exists. The Boys and Girls Clubs of America (BGCA), which focuses on “disadvantaged” youths, says on its website that “on average [the BGCA costs] about $200 per youth per year. But consider the alternative: keeping a young adult in jail costs taxpayers anywhere from $25,000 to $75,000 per year” (Boys and Girls Clubs of America 2000). Although they have all made major efforts to serve inner-city youth, none of the traditionally white, suburban, middle-class organizations makes a direct reference to crime prevention in its Internet literature. Yet this difference should not be overemphasized. The boys’ club movement as a whole has a common ideological heritage. Late-nineteenth-century British and Yankee Protestants created almost all the early boys’ clubs and organizations. In industrial Britain and the northeastern United States, there first emerged among white, Protestant, middle-class professionals a belief that the social problems caused by capitalism might be solved by intervening in the lives of children. The social problems that middle-class professionals saw were not only ignorance, poverty, and crime but also a lack of purpose, solidarity, and robust manliness. To the degree that club founders acknowledged that the challenges of modern life varied between the classes, the
solutions they offered were largely the same whether they were dealing with street-trading boys or schoolboys. When John Gunckel founded the Boyville newsboy club in Toledo, Ohio, in 1892 (which grew into the National Newsboys’ Association [NNA]), he was working in concert with newspaper managers and owners who wanted to bust the city’s newsboy and bootblack union. The negative implications of this action for working-class solidarity among street-trading boys are obvious, but they should not obscure the fact that NNA literature and services spoke in the same language as all boys’ clubs. The ideal of manliness that the NNA hoped newsboys would emulate was akin to the ideal exemplified by the image of a Boy Scout helping an old woman to cross the street. Gunckel’s boys were honored with a visit from the icon of self-made manhood, Theodore Roosevelt, and praised in newspapers for “learning to paddle their own canoes,” rather than relying on family and community. The common subtext of the boys’ work movement was that families should free boys from breadwinning responsibilities, but not so their sons could roam the streets either as thugs or dandies. Rather, boyhood should be a time to build physically, morally, and intellectually sturdy individuals who could fend for themselves in a world of competition, excess, and ambiguity. Clubs for working-class boys often established connections between their members and employers looking to train young men as clerks or in the skilled trades. According to these clubs, a successful boy was defined as one who gained a job that granted access to middle-class privileges. Club leaders recognized that personality, appearance, style, and language were extremely important
to a boy who wished to land the best opportunities in an increasingly anonymous city. Thus, it is no surprise that most twentieth-century boys’ clubs have provided ways to perfect masculine traits of dress, decorum, etiquette, and recreation, as well as occasional health care services and useful skill training. Since 1945, boys’ clubs have grown larger. By the mid-1990s, the Boys and Girls Clubs of America numbered over 1,800 affiliates serving about 2.5 million children and youths. In 1999, the Boy Scouts of America and the Girl Scouts of the USA involved approximately 3.5 million scouts organized by an astonishing 1.2 million adult leaders (Boy Scouts of America 1999; Girl Scouts 2000; YMCA of the USA 2001; For Youth by Youth 2001). The YMCA, though providing a spectrum of services beyond children’s clubs, has spread to more than 120 countries with 30 million current members. In the United States, the YMCA has become the largest provider of out-of-home care for school-age children. It has been estimated that 50 million Americans were 4-H members during the twentieth century. Yet, having survived the Depression and another world war more or less unchanged, boys’ organizations faced the women’s movement and were compelled to reconsider their membership rules in the last quarter of the century. Vocational student organizations run through the high schools, for example, are all now coeducational, although some, such as the Future Farmers of America (1928), the Future Business Leaders of America (1942), and the Vocational Industrial Clubs of America (1965), were initially all male or nearly so. Athletic competition remains organized primarily along gender lines, but this, too, is changing. Baseball programs such as Little League (1935) and
the Babe Ruth League (1951), originally for boys only, now include girls on regular teams or offer separate division play for girls. The YMCA, which invented the games of basketball and volleyball, now offers among its activities athletic and physical fitness programs for all members of the family, and the Boy Scouts of America began coeducation by admitting girls to their Exploring (1969) and Learning for Life Divisions (1991). The fact that today girls are less likely to be totally segregated from boys and that club activities are less obviously gendered than they were in the past has not significantly altered clubs’ underlying social purposes and ideology. These have not changed since the reorganization of economic and domestic relations in the late nineteenth and early twentieth centuries. Clubs continue to be organized by adult volunteers and professionals who try to foster sturdy individualists who are able to cooperate with a group to achieve their own ends. The activities they sponsor, such as team sports, camping, craft projects, civic improvements, and participation in public celebrations, have been standard practices for a century. Of course, today’s young people—boys and girls alike—are faced with increasing options for using their out-of-school time. Most choose to spend many hours per week working for pay, a preference that has taken its toll on youth organization membership among older adolescents. Dramatic change will continue to be a regular feature of American life; only those organizations flexible and inclusive enough to meet the current needs and interests of youth will endure in the twenty-first century.
Judith Erickson
Patrick J. Ryan
See also Boy Scouts; 4-H in the Midwest; Newsboys; Young Men’s Christian Association

References and further reading
Ashby, Leroy. 1983. Saving the Waifs: Reformers and Dependent Children, 1890–1917. Philadelphia: Temple University Press.
Boy Scouts of America. 1999. “1999 Annual Report,” http://bsa.scouting.org/nav/pub/news.html (accessed May 14, 2001).
Boys and Girls Clubs of America. 2000. “Who We Are: The Facts,” http://www.bgca.org/whoweare/facts.asp (accessed May 14, 2001).
Carnegie Council on Adolescent Development, Task Force on Youth Development and Community Programs. 1992. A Matter of Time: Risk and Opportunity in the Nonschool Hours. New York: Carnegie Corporation of New York.
Eisenstadt, S. N. 1956. From Generation to Generation: Age Groups and Social Structure. New York: Free Press.
Elfenbein, Jessica Ivy. 1996. “To ‘Fit Them for Their Fight with the World’: The Baltimore YMCA and the Making of a Modern City, 1852–1932.” Ph.D. diss., University of Delaware.
For Youth by Youth. 2001. “About 4-H,” http://www.4-H.org (accessed May 14, 2001).
Girl Scouts. 2000. “About Us,” http://www.girlscouts.org (accessed May 14, 2001).
Graebner, William. 1988. “Outlawing Teenage Populism: The Campaign against Secret Societies in the American High School, 1900–1960.” Journal of American History 74: 411–435.
Kett, Joseph F. 1977. Rites of Passage: Adolescence in America 1790–Present. New York: Basic Books.
Macleod, David I. 1983. Building Character in the American Boy: The Boy Scouts, YMCA, and Their Forerunners, 1870–1920. Madison: University of Wisconsin Press.
Maupin, Melissa. 1996. The Ultimate Kids’ Club Book: How to Organize, Find Members, Run Meetings, Raise Money, Handle Problems, and Much More! Minneapolis: Free Spirit.
McLaughlin, Milbrey W., Merita A. Irby, and Juliet Langman. 1994. Urban Sanctuaries: Neighborhood Organizations in the Lives and Futures of Inner-City Youth. San Francisco: Jossey-Bass.
Nasaw, David. 1985. Children of the City: At Work and at Play. Garden City, NY: Anchor Press/Doubleday.
National Youth Development Information Center. 2001. www.nydic.org (accessed May 14, 2001). Includes a directory of more than 500 national contemporary youth organizations with links to individual organization websites.
Public/Private Ventures. 2000. Youth Development: Issues, Challenges, and Directions. Philadelphia: Public/Private Ventures.
Spring, Joel. 1974. “Mass Culture and School Sports.” History of Education Quarterly 14 (Winter): 483–499.
Wessel, Thomas, and Marilyn Wessel. 1982. 4-H: An American Idea, 1900–1980. Chevy Chase, MD: National 4-H Council.
Whisnant, David E. 1971. “Selling the Gospel News, or the Strange Career of Jimmy Brown the Newsboy.” Journal of Social History 5, no. 3: 269–309.
YMCA of the USA. 2001. “YMCA’s at a Glance,” http://www.ymca.net (accessed May 14, 2001).
Comic Books
Icons of boy culture from the late 1930s to the late 1950s, comic books are a relatively recent phenomenon. Although earlier newspaper comic strips had been published in book form, the first American comic book, Famous Funnies, appeared in 1934, and the first adventure comic book, Action Comics, featuring Superman, debuted in 1938. Moreover, comic books’ period of cultural dominance in boy culture ended with, or was at least weakened by, the coming of television in the 1950s. Indeed, fans and collectors define the golden age of comic books as little more than a decade, 1938–1949. Nevertheless, comic books are iconic in several ways. They are instantly recognizable as belonging to boyhood.
Although there are comic books marketed to girls and girls willingly read comic books intended primarily for boys, the popular image of the comic book reader is that of a boy. Some comic book stories are humorous, but in most comic books the defining story type concerns mythological superheroes with godlike powers. The artwork of comic books has its own aesthetic standards, with all styles tending toward garish colors and blatant messages, not unlike the portrayal of the lives of apostles in stained-glass church windows. Finally, despite the millions printed in the golden age, comic books are ephemeral. Surviving copies, like the relics of saints, are highly valued by those true to the faith of their boyhood. Comic books are, of course, related to newspaper comic strips, which in turn are descended from political cartoons and caricatures published in nineteenth-century humor magazines such as Puck. “The Circus in Hogan’s Alley,” drawn by Richard Felton Outcault, is generally considered the first comic strip, appearing May 5, 1895, in a color section of the Sunday edition of the New York World. The drawings featured a bald-headed, jug-eared boy about seven years old who wore a dirty yellow nightshirt often adorned with derisive and flippant remarks. This character became known as the “Yellow Kid,” and his popularity was so great that the publisher William Randolph Hearst outbid his rival Joseph Pulitzer and brought Outcault’s comic strip to the New York Journal, where it appeared until 1898, when Outcault stopped drawing the strip. Despite his brief three-year existence, the Yellow Kid lived on in the form of dolls, games, and joke books. Moreover, he inspired many imitators, the most successful being “The Katzenjammer Kids” by Rudolf Dirks, which
A young boy reading his Superman comic book. (Library of Congress)
began running in the New York Journal in December 1897. The “Kids” were two boys whose mischief and vandalism grew increasingly horrendous each week. In 1902 Outcault created a new cartoon strip, “Buster Brown,” featuring a well-dressed ten-year-old who disrupts the lives of his family and community with endless pranks. Buster Brown enjoyed even greater success than the Yellow Kid,
and his image appeared in advertisements for watches, bread, and shoes. Although newspaper comic strips were increasingly written and drawn to appeal to adult readers, the fact that many of the most popular strips were about delinquent boys reinforces the connection between boys and comics. Many boys were undoubtedly inspired to imitate the behavior of Hans and Fritz Katzenjammer and
Buster Brown, who constantly resisted the rules and norms of adult society. The strips also created an image of clever boys whose comments on life seem wise beyond their years, an idea continued in recent comic strips such as “Peanuts,” “Dennis the Menace,” “Calvin and Hobbes,” and “The Boondocks.” The evolution of comic strip boy characters into adventure strip heroes was an interesting, significant development that began in 1924 in a strip called “Wash Tubbs.” Wash was a small youngster who looked and acted like a teenager but got into harrowing adventures in the South Seas, Mexico, and other exotic places. With his partner, Captain Easy, Wash mixed comic gags with two-fisted action. In the 1930s adventure strips appealing to boys—“Tarzan,” “Dick Tracy,” “Dickie Dare,” and “Terry and the Pirates”—further evolved from the realm of real life to the imaginary world of superheroes. “Dickie Dare” is a good example of the way in which the process took place. Created by Milton Caniff in 1933, Dickie was a boy who had adventures in an adult world of a fictionalized past. Caniff changed the strip into straight adventure and then abandoned it to focus on “Terry and the Pirates.” Many of the adventure comics became weekly radio shows in the 1930s and 1940s, further stimulating the imaginations of boys and facilitating their identification with adult heroes. When two teenage boys, Jerry Siegel and Joe Shuster, created the character named “Superman” in the early 1930s, they unwittingly began the classic era of comic books. Superman made his first public appearance in a comic book in June 1938, followed by syndication in newspaper comic strips a year later. Toys, costumes, games, a radio program, animated cartoons, and movies followed in the 1940s,
culminating in a television series in the 1950s. It is as a comic book character, however, that Superman was most accessible to boys. The early stories concocted by Siegel and Shuster, more than the rather simple drawings, appealed to boys because they combined a number of elements that were reshaping childhood—rapid developments in science, new attitudes toward work and leisure, and traditional attitudes toward civil society. Superman is first of all a science fiction character. With his extraterrestrial origins and ability to fly, Superman represents the frontiers of twentieth-century science, and his name reflects his origins. In his book The American Language, the journalist and lexicographer H. L. Mencken noted the vogue for the prefix “super” in the 1920s when Siegel and Shuster were growing up. Superhighways were a product of new technology and supermarkets of new marketing theories. The title of a new magazine in 1930 summed up the mood: Astounding Stories of Super Science. The invented biography of Superman’s alter ego and mask, Clark Kent, is likewise revealing. Kent’s role as a reporter serves Superman in two ways. By covering stories he helps to make, Kent acts as his/Superman’s own publicity agent. Just as importantly, Kent earns a salary that supports Superman’s avocation of defending “truth, justice, and the American way.” When Kent strips off his glasses and suit to emerge wearing what looks like swimming trunks over long underwear with a beach towel thrown over his shoulders (a costume a boy could easily assemble at home), he utters the memorable line: “This is a job for Superman.” This cryptic assertion of the work ethic underscores Superman’s capacity to be a superworker. Clark Kent has a full-time job; Superman puts in overtime. Or
does he? Although he is sometimes asked to catch a criminal or prevent a disaster, he often stumbles into a situation, and his methods of dealing with miscreants usually involve elaborate and time-consuming punishments. Superman specializes in poetic justice and likes to have the last laugh. In short, his work is his play. Sociologists from Manhattan to Middletown were writing in the 1930s that a new leisure ethic was emerging among American youth. Childrearing manuals were emphasizing a “fun morality.” It was acceptable to play. Yet Superman’s method of helping society is philanthropic in the traditional meaning of the term. Nonhuman himself, he shows his love by acts of humanitarianism. He avoids playing god by playing by the rules he has learned from his foster parents and society. Thus Superman’s roles as volunteer disaster relief worker, posse member, and neighborhood watch captain form the third and most important dimension of his persona, after his science fiction and playful attributes. Superman donates his superpowers to Metropolis and other communities in exchange for acceptance and citizenship in his adopted land. This is an ideal of citizenship taught by the Boy Scouts and school civics books. “Superman” comics also taught concepts of civic duty through their monthly advice column and the organization of Superman clubs. Action Comics, which claimed a circulation of 1.4 million by 1940, was an incredible success; by the end of that year, more than 60 other comic books were being published each month. Two years later, there were 168 titles appearing monthly with a combined circulation estimated at 12 million. In 1943 a poll showed that 95 percent of all boys aged six to eleven read comic books (Nye 1970, 239).
Besides “Superman,” boys read “Batman,” in which the title character’s secret identity was millionaire Bruce Wayne. Batman was accompanied by Robin, the “Boy Wonder,” a former circus acrobat. Batman began fighting crime in the May 1939 issue of Detective Comics, and Robin joined him the following year. They worked by night on the dark streets of Gotham City, somewhere east of Superman’s Metropolis. Batman’s character is decidedly less wholesome than Superman’s, and his weird costume and use of weapons make him a forerunner of the more sinister characters of 1950s horror comics and the 1990s mutant hero comics. His success indicates a darker side of boyhood present at the beginning of World War II. Another vigilante of the dark was Captain Midnight, an aviator who was fighting the Nazis on a radio serial more than a year before the United States entered the war. He appeared in comic books beginning in July 1941 but remained more popular on radio than in the comics. Closer to boys’ fantasies was Captain Marvel, an orphaned boy who was transformed into a superhero by uttering the magic acronym “Shazam.” The streets and yards of the United States reverberated with “Shazams” as boys sought to acquire Solomon’s wisdom, Hercules’ strength, Atlas’s stamina, Zeus’s power, Achilles’ courage, and Mercury’s speed, after Captain Marvel’s comic book appearance in March 1940. A year later another scrawny boy was transformed into the comic book superhero Captain America, this time by drinking a secret potion. His red, white, and blue costume was clearly symbolic of his mission, the defeat of the German and Japanese military. Finally, just before Pearl Harbor, Captains Marvel and America got help from a human hero, Blackhawk, and his squadron of fighter pilots. The fact that
comic book superheroes were enlisted in a desperate struggle to save American democracy made a deep impression on boys born in the 1930s and early 1940s. Boy culture in the years 1940–1955 changed rapidly from one characterized by regional diversity to one marked by national similarities. Patriotic propaganda, mass media, and the relocation of families due to military service and defense work all helped to break down the barriers of race, class, and place. Because about 18 percent of American families contributed one or more members to the armed forces during the war, the number of single-parent families increased for its duration. More than a million women went to work full-time, almost a quarter of whom had children. The so-called latchkey children had more time alone and more comic books to fill it. When boys tired of imagining themselves in caped pursuit of fascist foes, they turned to the waggish humor of Archie, a teenager who seemed unconcerned about the draft, or to the antics of Donald Duck, who made his comic book appearance in April 1943. An older boy might prefer to look at scantily clad princesses in the jungle comics. A boy who wanted to please his harried parents might spend 10 cents earned by collecting scrap on one of the new “Classics Illustrated” comics—“The Three Musketeers,” “Ivanhoe,” “The Count of Monte Cristo,” or even “Moby Dick.” Comic books of this era provided boys with a common vocabulary, visual and verbal, and a sense of community as they collected and traded titles. The desire for comic books led some boys to petty crime. It was easy in a crowded drugstore to lift a copy of the latest “Airboy” off the revolving rack and zip it under a jacket. The writer James Alan McPherson,
who grew up in Savannah, collected 700 comic books but felt so bad about the ones he stole that he returned them. He played hooky from school to go to the Salvation Army Store, where he could buy two or three for 5 cents. For boys too poor to buy all the toys advertised in the comic books, the purchase of even used comics made them feel like they were participating in consumer society. Norman Rockwell’s December 7, 1946, cover for the Saturday Evening Post depicts a poignant moment in a boy’s life as he looks into his coin purse for money to pay for the meal he has just eaten in a railroad dining car, while the smiling waiter looks on. Does he have enough for the bill, and how much should he tip? All but unnoticed in the pocket of his jacket is the top of a shiny comic book for which he may not have budgeted. By 1954 there were 650 comic book titles with perhaps 100 million copies in print each month, but the era was coming to an end (Nye 1970, 240). The superheroes were now fighting the Cold War, but the goals were not as clear. Fear of nuclear weapons and atomic radiation created anxieties and neuroses. Cities were decaying, and suburbs sprawled across old farmlands. Fathers had returned from war only to grow remote again as commuters in gray flannel suits. A new generation of comic book writers, many of them veterans, created ghastly tales from the crypt and supernatural stories of horror, vividly illustrated by decaying corpses and mutilated bodies. Superheroes brooded over personal problems and agonized over their real and imagined ugliness. In 1952, the company that published many of the new horror and science fiction comics began publishing Mad, an iconoclastic illustrated magazine that made fun of American politics, movies, television, and even comics.
It is hardly surprising that when the U.S. Senate began investigating juvenile delinquency in 1953, the committee soon blamed comic book publishers for criminal behavior by boys. In 1954 the distinguished psychiatrist Fredric Wertham put his case for censorship of comics in a best-selling book, Seduction of the Innocent, arguing that young minds were being warped not only by the violence depicted in the stories but by a commercial culture that disrupted traditional relationships between parents and children. Public pressure led the Comics Magazine Association to write a code of standards, but many publishers evaded the restrictions by publishing in larger magazine formats. Since 1960, the gap between traditional comics for young boys and more lurid “graphic novels” for older boys and adults has widened. Comic books have even more extensive connections with the toy industry, television, and more recently video games and the Internet. Boys continue to create their own culture, slang, and defiant opposition to adult rule, but they do it with a different style. Perhaps comic books are the best place to see the changing aesthetics of boyhood. Compare the soft-shaped and almost pastel colors of the 1940s “Superman” with the hard-edged figures and Day-Glo brilliance of “X-Men” in 2000. Comic book publishers, like the producers of other forms of popular culture, no longer appeal to the broadest possible market but seek clusters of fans with specialized interests. For boys this represents both a loss and a gain: the loss of a common comic book brotherhood on the one hand, but the potential of greater freedom of choice on the other.
Bernard Mergen
See also Superheroes

References and further reading
Comics Scene 2000. 2000. New York: Starlog Group.
Gordon, Ian. 1998. Comic Strips and Consumer Culture 1890–1945. Washington, DC: Smithsonian Institution Press.
Goulart, Ron. 2000. Comic Book Culture: An Illustrated History. Portland, OR: Collectors Press.
Goulart, Ron, ed. 1990. Encyclopedia of American Comics. New York: Facts on File.
Horn, Maurice, ed. 1977. The World Encyclopedia of Comics. New York: Avon.
Inge, M. Thomas. 1990. Comics as Culture. Jackson: University Press of Mississippi.
McCloud, Scott. 1993. Understanding Comics. Princeton, WI: Kitchen Sink Press.
———. 2000. Reinventing Comics. New York: HarperPerennial.
Mencken, H. L. 1982. The American Language. New York: Alfred A. Knopf.
Nye, Russel. 1970. The Unembarrassed Muse. New York: Dial Press.
Pustz, Matthew. 1999. Comic Book Culture: Fanboys and True Believers. Jackson: University Press of Mississippi.
Competition
Social scientists have long debated whether human males are inherently more aggressive and thus more competitive than human females. As in virtually all societies throughout human history, American boys have indisputably displayed a higher degree of competitive behavior than girls. Yet, however much or little this trait is genetically determined, the degree and type of competitiveness manifested by the population in general and among boys in particular have changed significantly over time. Social, economic, and cultural factors have played a significant role in determining the degree of social
sanction accorded competitive behavior. Specifically, during the colonial era, the rigorous Calvinist piety of New England Puritans tended to discourage overt displays of competitiveness, whereas the slave society and ethos of honor prevalent in the South tended to encourage it. The advent of industrial capitalism in the early nineteenth century ushered in a highly competitive economic system, which in turn promoted a value system that tended to encourage competitiveness among boys. The widespread fear that boys were becoming too effeminate accentuated the value placed on competitiveness during the late nineteenth century. This social and cultural norm remained essentially intact until the dramatic social and cultural changes that began in the 1960s, notably reservations about the morality of capitalism and the growth of feminism, influenced parents and educational professionals to be less likely to encourage a high degree of competitiveness among boys. During the colonial era, attitudes toward competitiveness differed sharply between Puritan New England and the slave societies of the South. The Puritans’ orthodox Calvinism, with its strong belief in human depravity and pervasive fear of sin, engendered obsessive efforts to keep the baser elements of human nature under control. Since competitiveness is often linked with aggression, greed, and willfulness, it was not a trait that Puritans generally encouraged in their children. Also, the Puritan belief in a tightly knit community and respect for social hierarchy generally discouraged overt displays of competitiveness. Additionally, the absence of a fully developed ideology of laissez-faire capitalism diminished the societal sanction for social or economic competition. Puritan attitudes
toward boys’ sports reflected their ambivalent attitudes toward competitiveness. Recreational pastimes that were deemed regenerative, healthful, or socially useful were encouraged, but competitive sports such as football and bat-and-ball games were often regarded with suspicion and hostility. In the minds of Puritan elders, such games too often unleashed the youthful passions that threatened their sense of piety and social order. Southern childrearing practices during the colonial and antebellum eras, however, promoted a higher degree of competitiveness, reflecting the values of a social order based upon the violence and exploitation of the slave system. The ethos of honor that permeated southern culture discouraged piety and mandated that any affront be met with an aggressive defense of one’s good name and social status. Whereas Puritans lived under a more communitarian ethos and static social hierarchy, southerners could gain or lose social status depending on the degree of aggression and competitiveness they evinced toward their fellows. Thus, whether born to the slaveholding elite, the white yeomanry, or the backcountry poor, boys were generally encouraged to cultivate self-will and competitiveness as necessary elements of character. Self-restraint was considered tantamount to weakness and was thus discouraged. Although the southern social order and its corresponding cultural values underwent no dramatic transformation prior to the Civil War, the changes in the North between 1800 and 1860 created cultural values whose influence remains central to American life to the present day. The market revolution, the emergence of industrial capitalism, and urbanization
transformed the American family and the values by which children were nurtured and educated. Competitiveness became essential for success in the new system of industrial capitalism, so parents encouraged their sons’ competitiveness to an extent never before seen in the United States. The separation of home and workplace in urban industrial society spawned the ideology of separate spheres, which sanctioned competitiveness among males as an essential element of the male’s social role as breadwinner. Competitiveness was an economically vital characteristic of a bourgeois society in which status was earned rather than ascribed. Accordingly, parents became increasingly tolerant of this trait in their sons. The separation of home and work also removed fathers from the household to a historically unprecedented extent and greatly expanded mothers’ responsibility for the raising of sons. As in preindustrial America, boys remained under the close supervision of their mothers until about the age of six, but they no longer were transferred directly to the equally close supervision of their fathers in order to learn the father’s occupation and contribute to household income. Thus, boys in urban areas had an unprecedented freedom to create a semiautonomous world centered in playgrounds, backyards, streets, and wooded areas that was remarkably free of the direct control of adults. The societal sanction given to competitiveness as a result of the bourgeois value system coupled with the autonomy of the boys’ world created an environment in which competitiveness flourished. Boys formed groups organized along neighborhood, class, or ethnic lines, and they commonly engaged in sometimes violent competition with other such groups over turf or status.
However, the more important competitions occurred within the groups themselves. Loyalty and group cohesion were necessary to the creation and maintenance of the group, but boys competed among themselves to establish status rankings that were of vital social and psychological importance in the minds of boys. Displaying prowess in footraces, swimming, baseball, or football, or performing a daring feat at which others blanched, enabled boys to gain status vis-à-vis their peers. Even as the evangelical piety of the feminine domestic sphere discouraged the sometimes violent and often psychologically demeaning competition for status within boys’ groups, boys’ relative freedom gave them license to build a world in which competitiveness was an essential element of their relationships with one another. Boys growing up in the urban centers in the Northeast and Midwest participated in adult male activities, thus gaining lessons in the mixture of cooperation and competitiveness that characterized the adult world. Political parties encouraged the participation of boys as a means of building party loyalty at a young age, thereby socializing boys into this important element of nineteenth-century masculine identity. Boys were often assigned the task of carrying out pranks against rival parties, introducing them to the rough-and-tumble competitiveness of partisan politics. Volunteer fire companies and militia companies were other adult institutions that encouraged boys’ participation, thus inculcating them with a spirit of competitiveness. These groups were perhaps better known for fighting one another than they were for fighting fires or wars, and their rivalries often resembled those of urban gangs. Boys as young as ten learned lessons in
competitiveness by serving as auxiliaries or aides to the adult organizations. Boys from the rapidly growing middle and upper middle classes were less likely than their working-class counterparts to participate in the roughly competitive world of politics or the fire and militia companies. Their world was circumscribed by the norms of bourgeois respectability and by the predominance of evangelical piety within the feminine domestic sphere. Schools and churches, the other major institutions charged with the socialization of boys, were likewise dominated by an ethos of piety and passivity. After about 1870, however, opinion leaders and social theorists issued increasingly shrill warnings that boys born into more affluent families were too effeminate. Men such as the philosopher William James, the educational theorist G. Stanley Hall, and most notably Theodore Roosevelt advocated more masculine influences in the lives of middle- and upper-class boys. Competitiveness was a trait that they especially wished to inculcate in these boys suffering, in their view, from the stultifying effeminacy of the world of luxury in which they were growing up. Roosevelt stridently advocated the “strenuous life,” a prescription for boys to compete against one another, against nature, and, most importantly, against their own weaknesses in the quest for self-creation. Competitiveness had been accepted as a necessary male trait as industrial capitalism attained preeminence during the first half of the nineteenth century, but evangelical moralism had prevented middle-class America from embracing it as a positive good. The obsessive fears of feminization that pervaded the fin-de-siècle American bourgeois mentality, as well as the rise of social Darwinism and imperialism, transformed this reluctant acceptance
into a full-fledged embrace of unrestrained competitiveness. It rapidly became a virtue to be cultivated in boys. Not coincidentally, competitive sports became a central component of the institutions Progressive-era reformers created to deal with the social needs of children between 1890 and 1920. The Young Men’s Christian Association (YMCA) began to promote competitive sports vigorously among boys during the 1890s. By 1920, more than 200,000 boys participated in sports programs run by the YMCA. Local Protestant and Catholic churches began to follow suit over the next few decades, thus reversing the skeptical view that American churches had traditionally held toward competitive sports. Settlement houses, the philanthropic social service institutions that sprang up in major American cities after the 1890s, and the Federated Boys Clubs, founded in Boston in 1906 (renamed the Boys Clubs of America in 1931), each incorporated sports into the recreational programs they offered the mostly immigrant and working-class boys in their clientele. Public schools, however, were far and away the most important institution that made competitive sports an integral element in the lives of American boys. The influx of resources into the public school system during the Progressive era rendered it a much more all-encompassing influence on American children, and the addition of competitive sports as both curricular and extracurricular activities gave competitiveness an added imprimatur of legitimacy. The Public Schools Athletic League (PSAL) of New York City was founded in 1903, and by 1913 more than 350,000 New York schoolchildren, mostly boys, competed under its aegis. The PSAL system valued team sports, star athletes, and the pursuit
of championships, thereby institutionalizing competition into the fabric of school sports. The PSAL’s mode of organization, which Theodore Roosevelt enthusiastically endorsed, became the model for public school athletic programs across the nation. The period between 1890 and 1920 witnessed the acceptance of competitiveness as a necessary and beneficial trait in boys, which became a hegemonic element of American culture for roughly the next half-century. However, subtle currents of discontent with the dominant culture’s acceptance of competitiveness as a positive good began to be expressed during the 1950s and reached a flood tide during the 1960s and 1970s. The student movement and the counterculture of the 1960s viewed the hard-driving, type-A male personality less as the source of American wealth and freedom and more as a source of racial, class, and gender inequality, imperialism, and environmental destruction. Although many middle-class Americans still believed that their sons should live by the credo “Winning isn’t everything, it’s the only thing,” voiced by Green Bay Packers coach Vince Lombardi, substantial numbers of Americans began to question this attitude. The feminist movement, which became a significant force in American life during the 1970s, asserted that the competitiveness that was instilled in boys fostered aggression and a sense of male privilege. The ideals of the counterculture and feminist movements gradually worked their way into the dominant culture’s conceptions of childrearing. During the 1980s and 1990s, educational curricula began to emphasize cooperation at the expense of competition, and teamwork instead of individualism. Some youth sports
organizations have declared that physical fitness, sportsmanship, equal participation by all team members, and the enhancement of self-esteem should be the primary goals of such sports, and the single-minded pursuit of victory should be deemphasized. Some such organizations have refused to sanction competitive contests that result in winners and losers. However, globalization, the fall of communism, and the information revolution have rendered capitalism more competitive, not less, during the waning decades of the twentieth century. Thus, whatever qualms parents, opinion leaders, and educators may have over the deleterious effects of encouraging competitiveness among boys, the dominant culture has certainly not forsaken it. The lives of middle-class children and those who aspire to attain that status are filled with intense competitive pressures. (The increasingly hegemonic ideology of gender equality is now subjecting girls to the same pressures to succeed that once primarily affected their brothers.) From a young age, they face a level of competition that their parents’ generation did not have to reckon with until adolescence. They are expected to gain admission to exclusive private or magnet schools and, once there, to attain perfect or near-perfect grades, to score well on standardized tests, and to gain admission to elite universities. Competition has been a central element in the lives of American boys for two centuries. A dominant culture that has historically extolled competition as the driving force behind the creation of the wealthiest and most powerful nation in the world has also taken pains to encourage its sons to rise to its challenges. Despite some significant qualms regarding the cost of subjecting boys to the pressures of an intensely competitive environment,
recent history indicates that competition is and will remain an integral aspect of the lives of American boys.
Andrew Doyle

See also Clubs; Fathers; Fire Companies; Mothers; Sports, Colonial Era to 1920; Sports, 1921 to the Present; Young Men’s Christian Association

References and further reading
Hawes, Joseph, and N. Ray Hiner, eds. 1985. American Childhood: A Research Guide and Historical Handbook. Westport, CT: Greenwood Press.
Kett, Joseph. 1977. Rites of Passage: Adolescence in America, 1790 to the Present. New York: Basic Books.
Kimmel, Michael. 1996. Manhood in America: A Cultural History. New York: Free Press.
Rosenthal, Michael. 1984. The Character Factory: Baden-Powell and the Origins of the Boy Scout Movement. New York: Pantheon Books.
Rotundo, E. Anthony. 1993. American Manhood: Transformations in Masculinity from the Revolution to the Modern Era. New York: Basic Books.
Seymour, Harold. 1990. Baseball: The People’s Game. New York: Oxford University Press.
Wilson, Edward O. 1978. On Human Nature. Cambridge, MA: Harvard University Press.
Wyatt-Brown, Bertram. 1982. Southern Honor: Ethics and Behavior in the Old South. New York: Oxford University Press.
Computers
During the last quarter of the twentieth century, microcomputers became commonplace in the United States, and American boys became avid computer users. There were more boys using computers in the United States than in the rest of the world combined. During most of the period between 1975 and 2000, more boys than girls used computers regularly, and boys were thought to be more
competent, confident, and frequent users of computers than their female counterparts (Collis 1987). American children had many role models in the area of computer use, most of whom were male. Computer programmers were generally male; computer teachers in school were male; many fathers (and a few mothers) used computers extensively in their work; computer camps and labs were generally filled with boys and men who considered themselves computer experts (or, more colloquially, computer geeks or nerds); and advanced math, science, and computer classes were populated predominantly by boys. Boys were much more likely than girls to choose careers in computer science and related computer fields. Despite the apparent gender neutrality of any historical period, the main markers of the age of computers and electronic information were commonly considered to be male. American scholars asserted that technology was gendered and its gender was male (Benston 1985). Given broad access to computers and computer software, numerous role models, and the relatively unchallenged perception that the world of computers was a male domain, American boys used computers extensively and in a wide variety of ways. Gender differences in computer use were associated with the differential socialization of boys and girls, made manifest in such places as the home where fathers and brothers used computers the most or on television where males were often portrayed in computer-related roles in programs and commercials (Sanders 1990).

Statistics
In the late twentieth century, American boys had unsurpassed access to computers, computer software, and the Internet.
Many had access at home, camps, and public libraries; many more had access at school. Table 1 shows the number of students who used computers in 2000, and the number of students estimated to use computers in 2005, by continent:

TABLE 1 Contemporary and Projected Future Computer Use

                 2000           2005
Africa           90,000         357,000
Asia             6.2 million    22.2 million
Europe           6.1 million    15.3 million
Middle East      157,000        440,000
North America    13.7 million   36.3 million
South America    447,000        1.8 million

In 1998, when asked how many hours per week they used computers at home, a small sample of students from New York representative of all American boys reported usage as shown in Table 2. These boys also reported the number of times they used the Internet per week, as shown in Table 3.

TABLE 2 Sample Weekly Computer Usage by Boys in the United States

No Use   1–9 Hours   10–19 Hours   20–29 Hours   30–39 Hours   40+ Hours
1%       40%         29%           14%           7.5%          8.6%

TABLE 3 Sample Weekly Frequency of Internet Usage by Boys in the United States

No Use   1–9 Times   10–19 Times   20–29 Times   30+ Times
5%       56%         24%           7%            6%

Access to computers, however, was not distributed evenly across all American households. Many more Caucasian and Asian American boys had home computers than did African American and Latino boys. Access to computers and computer software at school was also limited for boys living in poor neighborhoods.

Training
Boys had numerous opportunities to take formal computer classes. Schools provided training in the use of drawing tools such as KidPix™, word processing tools such as The Children’s Writing and Publishing Center™, and databases and spreadsheets in ClarisWorks™, as well as interactive educational programs such as Oregon Trail™, and simulations such as Sim City™. At all levels in school, teachers were more likely to choose boys rather than girls to assist with technology (Sanders 1990), and boys had more opportunities for learning and using computers than did girls (Shakeshaft 1986). Although most teachers perceived that they gave equal time and attention to girls and boys within their classrooms (Lanzinger 1990), in fact, male students received more time and attention from teachers than did female students (Sadker and Sadker 1994); teachers instructed male students but often did the task for female students; and teachers allowed male students more opportunities to respond, question, engage in activities, and give opinions than they allowed female students.
Boys working on computers. (Photodisc)
This differential treatment often gave boys an advantage in learning about and using computers and computer software. Computer training workshops by the Boy Scouts and Trail Rangers helped boys learn the techniques for advanced computing, and many were rewarded with badges for their efforts. Many boys, however, were self-taught. Given a computer and time to play, boys would experiment with computer hardware and software and learn through trial and error (Christie 2000). Many U.S. computer entrepreneurs such as Steve Jobs and Bill Gates started their careers as young boys intrigued with computer hardware and software. Many boys learned in similar ways through inventive hypothesizing, testing, and experimenting. Boys in a high school networking
class said that learning about computers was the modern version of poking under the hood of a car. However, in its annual report on gender equity in education, the American Association of University Women (Wellesley College Center 1992) warned that computer science had become the new “boys’ club” in schools.

Role Models
Role models for boys interested in computers were plentiful. In schools worldwide, computer use was dominated by men, and therefore boys found many male teachers as role models (Reinen and Plomp 1993). Until the last few years of the twentieth century, computer teachers, counselors at computer camps, and family members who used computers extensively were predominantly male.
Three times as many boys as girls participated in summer computer camps, and parents were more likely to purchase computers, computer software, and peripherals for boys than for girls (Hess and Miura 1985). That the overwhelming majority of role models were male can be seen in the percentage of males who participated in four World Wide Web Conferences in the late 1990s: 90 percent, 85 percent, 86 percent, and 84 percent. In addition, Jo Sanders (1990) maintained that most girls felt there was a “No Girls Allowed” sign on the door of the computer lab. The computer lab was a place for (often rather nerdy) boys and men, and it was not a proper place for girls.
Uses of Computers
Boys identified with computers in three primary ways—as toys, as tools, and as a trademark. In general, grade school boys used computers as gaming devices; high school boys used them as tools and as gaming devices; and after high school, boys saw themselves as part of the male computer culture. Although these ways of identifying with computers varied by degree over time, boys of all ages played computer games, used computers to accomplish tasks, and saw themselves as part of the predominantly male computer culture. Young boys used computer gaming machines such as Nintendo™, Sega™, and Sony PlayStation™, as well as a variety of computer software. These games were often competitive or violent in nature. Major categories of these types of games are listed below along with a popular example of each and a descriptive sentence from its publisher:

• Action and Adventure Games: Aliens vs. Predator Gold Edition™. The three most ferocious species in the universe are pitted against one another in a battle for the ultimate prize: survival. One wrong move turns you from hunter into prey.
• Arcade Games: Lemmings Revolution™. Hot air balloons have been strategically positioned to lure the lemmings into danger. Get the lemmings to safety in this addicting game.
• Board Games and Puzzles: Risk II™. It is the time of empires and armies, of countries and conquests. The world is at war, and you are in command of an army fighting for global domination.
• Role-playing Games: Wizards & Warriors™. In this enchanted medieval realm, embark on more than 100 quests and adventures—battling in real-time and turn-based combat. Can you uncover the mysteries of the Sword before terror reigns?
• Simulations: Al Unser, Jr. Arcade Racing™. Experience high speed thrills as you throw a top performance turbo-charged racing vehicle around treacherous stages.
• Sports Games: NFL Fever 2000™. You are on the field for every bone-jarring tackle, blitz, rush, and sack. This is the NFL with attitude. Dare to go facemask to facemask with the league’s best.
• Strategy Games: Myst™. Few are chosen. Fewer succeed. Combine keen observation and logic to unlock secrets. Myst™ is the surrealistic adventure that will become your world.
Models of male behavior, including the computers and computer games marketed
for boys, stressed decisiveness, competition, and the imposition of will or power (Turkle 1984). A male high school student reported: “Guys like to have all the raw power. It’s a competition thing.” Computer hardware and software companies marketed specifically to this model of male behavior. The HotWheels™ computer was marketed as a “computer designed with boys in mind.” It came with many of the types of games listed above. The female version of this computer was the Barbie™ computer; it included none of the competitive games in the HotWheels™ version. Even games designed for young boys stressed competition, power, and violence. Games remained popular with boys (and men) of all ages. However, when boys reached junior and senior high school, computers took on another function: that of a tool to help boys complete homework and do research projects.

Tools
Many software applications were available for American youth to assist them in the learning process, both at school and in many homes. The most popular tools included word processing, database, and spreadsheet programs; drawing programs; and Internet browsers and searching tools (Christie 1997). The Children’s Writing and Publishing Center™ was a bilingual (English and Spanish) word processing program that allowed children to create professional-looking reports, newsletters, and other written documents. This software included a program to check the spelling and grammar for students, as well as a graphics library so children could use graphics to illustrate their written work. Boys as young as six years old could successfully create written documents using this program. ClarisWorks™ fit into the category called
“integrated software” because it included a word processing program as well as programs to generate databases and spreadsheets. Boys could use all three components to create research reports on topics they studied. One popular use of databases was to keep names and addresses of fellow classmates. Spreadsheets were used to store and graph numeric information gathered through research, and boys often would graph sports statistics using the graphing function of spreadsheets. If students wanted to use commercially prepared clip art to create greeting cards, signs, or banners, they could choose PrintShop™. After completing a study of the solar system, for example, boys could create greeting cards inviting parents to a “Solar System Celebration,” signs giving directions to the room where the celebration would be held, and 20-foot banners that welcomed visitors to the celebration. Since inexpensive color printers accompanied most computers, boys could make colorful, artistic products using PrintShop™. Another graphics generation program, KidPix™, contained little clip art. Instead, boys using this program created the pictures themselves. A young boy could write his name in KidPix™. He could then add a background color, draw images, and create a picture to print and share. Boys wishing to create an illustrated story could create a series of pictures—with or without text—and then place them into a slide show complete with sounds and transitions. Boys as young as four or five years old were very proficient at using this program. With the advent of the Internet, browsers and searching tools (search engines and subject directories) became very popular tools for American boys. Browsers such as Netscape Navigator™
and Internet Explorer™ allowed boys to move easily through the many sites that were part of the Internet. Boys could search the Internet using search engines such as Ask Jeeves for Kids™ or subject directories such as Yahooligans™ and find sites of personal interest or sites that would help them complete an assignment for school. These popular tools were available to boys at school, at public libraries, and often at home.

Career Choices
In 1990, 87 percent of the doctoral degrees awarded in computer science went to males, and 92 percent of computer science professors were male. Ellen Spertus (1991) reported that causes for this trend included the different ways in which boys and girls were raised, the predominance of male role models, stereotypes that engineers and computer scientists were male, and resistance from women who did not wish to work in predominantly male environments. In addition, differing expectations of girls and boys often were perpetuated in school. A math teacher at an American high school invited students to use the Internet. Nine students rushed to her desk—eight boys and one girl; she found this ratio typical of the gender distribution in computer use in American schools at that time. She also noted that teachers often allowed boys to dominate computer classes. The less assertive girls, left by the wayside, often did not increase their technology skills and thereby limited their career choices. Researchers from Wellesley College (1992) conducted a study for the American Association of University Women on this issue. They concluded that boys received extensive encouragement to go into the sciences, computing, and technology-related areas.
Girls, however, felt out of place in these areas. With this encouragement, many American boys chose careers in computer-related industries. American boys used computers extensively as toys, tools, and trademarks of their maleness and their place in the male-dominated computer culture. With the advent of the twenty-first century, however, the distinction between male and female computer users blurred. Both boys and girls were using computers in numerous ways to accomplish personal and school-related tasks. At that time, many Americans suggested that the country should embrace a broader, more inclusive view of computer culture than the existing male-dominated culture. In this view, all uses and understandings of computers were equally regarded.
Alice A. Christie
See also Competition; Toys; Video Games
References and further reading
Benston, M. L. 1985. “The Myth of Computer Literacy.” Canadian Woman Studies 5: 20–22.
Berch, B. 1984. “For Women the Chips Are Down.” Processed World 11, no. 2: 42–46.
Christie, A. A. 1997. “Using Email within a Classroom Based on Feminist Pedagogy.” Journal of Research on Computing in Education 30, no. 2 (December).
———. 2000. “Gender Differences in Computer Use in Adolescent Boys and Girls.” Unpublished raw data.
Collis, B. 1987. “Psycho-social Implications of Sex Differences in Attitudes towards Computers: Results of a Survey.” International Journal of Women’s Studies 8, no. 3: 207–213.
Hess, R. D., and I. T. Miura. 1985. “Gender Differences in Enrollment in Computer Camps and Classes.” Sex Roles 13: 193–203.
Lanzinger, I. 1990. “Toward Feminist Science Teaching.” Canadian Woman Studies 13, no. 2.
Reinen, I. J., and T. Plomp. 1993. “Some Gender Issues in Educational Computer Use: Results of an International Comparative Survey.” Computers and Education: An International Journal 20, no. 4: 353–365.
Sadker, M., and D. Sadker. 1994. Failing at Fairness: How Our Schools Cheat Girls. New York: Simon and Schuster.
Sanders, Jo. 1990. “Computer Equity for Girls: What Keeps It from Happening.” Pp. 181–185 in Fifth World Conference on Computers in Education in Sydney, Australia. Amsterdam: Elsevier Science Publishing.
Shakeshaft, C. 1986. “A Gender at Risk.” Phi Delta Kappan 67, no. 7: 499–503.
Spertus, Ellen. 1991. “Why Are There So Few Female Computer Scientists?” AI Lab Technical Report 1315. Cambridge, MA: MIT Artificial Intelligence Laboratory (August).
Turkle, S. 1984. The Second Self: Computers and the Human Spirit. New York: Simon and Schuster.
Wellesley College Center for Research on Women. 1992. The AAUW Report: How Schools Shortchange Girls—A Study of Major Findings on Girls and Education. Washington, DC: AAUW Educational Foundation.
Cowboys
Boys have played an important part in cattle raising throughout its history on the American frontier and in the West. The “boy” in “cowboy” was not meant to be literal, yet much of the work of ranching was done by younger males who later put those early skills to work as mature cattlemen. Frontier families faced an enormous amount of work that required the help of all but the very youngest members of the household. Much of this labor involved the care of animals that were crucial sources of food, power, and income. Fortunately, many basic tasks were within the abilities of youngsters. Cattle were among the most important domestic animals. They provided milk when alive and beef when slaughtered.
Both products were consumed by the families themselves and sold for precious cash to buy what the family could not produce for itself. Oxen were also preferred for pulling wagons and plows. On the eastern frontier, from the Appalachians to the Mississippi River valley, cattle were herded mostly by persons on foot, often with the help of dogs. Young boys took cattle to pasture in the morning and brought them in at the end of the day. Nonetheless, cattle were less important in the family economy than pigs, which were better suited to the thick forests. A vigorous cattle industry developed on larger landholdings in the lower Mississippi River valley. Often raised in sizable herds, these animals were shipped down the river systems, sold in larger towns like New Orleans, and exported to the West Indies and other markets. Here young boys also took part in the work. Some were sons of white landowners, and others were young slaves, for even in this land of cotton plantations African American slaves spent an estimated one-fourth of their time caring for cattle. Native Americans of the Deep South also raised cattle, both for their own consumption and to send to market, and young boys in these tribes also helped in this labor vital to their economy. Meanwhile, cattle were also being raised in significant numbers in the Southwest, especially southern Texas, an extension of the great ranching empires of northern Mexico. The breeds of cattle and the ranching techniques, in particular the methods of herding by horseback, had their roots in Spain. Once again, at an early age boys began to take part in the tasks of ranching. The skills they learned as youths contributed to the cultivation of what was one of the great equestrian cultures of North America. A similar equestrian culture also developed in California, where Mexican boys and young Native Americans working for the missions and then on ranches participated in the cattle industry that was the basis of the province’s economy.

A young cowboy in California adjusts his spurs as he sits on a ranch fence, ca. 1950s. (Hulton-Deutsch Collection/Corbis)
With the help of boys, cattle raising continued in all these areas, but after the Civil War its focus shifted to a new region—the Great Plains. Now there was a large and growing demand for beef, especially in the Northeast and Midwest; new
industrial slaughterhouses in Chicago, Cincinnati, and other cities; and a rail network to distribute the product to the hungry market. The plains were a vast pastureland as large as Europe, and as Native American peoples there were confined to reservations and the huge herds of bison killed, white ranchers moved into this region. Farming families also emigrated to the plains by the thousands in the last third of the nineteenth century. Their main purpose was to convert the grasslands into farms for commercial agriculture, but they often raised cattle as well as corn, wheat, and other crops. Like families of the eastern frontier, they put their cattle to their own uses while also raising them for sale, but unlike the forested East, the open plains were far better suited for cattle than for pigs and other domestics. This geographic advantage, as well as the growing market for beef, led to a boom in cattle raising on the plains. Luckily for ranching and farming families, boys (and girls) were perfectly capable of performing many of the basic chores of caring for cattle.

A young boy plays at being a cowboy. (Shirley Zeiberg)
Much of this work on the plains was done on horseback, and although many years were needed to become a skilled rider, the basics could be learned young. Men later recalled working in the saddle as early as their fifth year. One remembered that he and his siblings were put on a horse when they were tall enough for their heads to touch the bottom of a stirrup. Sons on family farms and ranches typically were expected to see the family’s cattle to pasture in the morning and return them in the evening. During the day they might return home or stay with the animals. Watching over the cattle all day could be a boring and often uncomfortable job for boys of eight or ten years. Summers were typically hot, even blistering, and being caught in the open during a western thunderstorm left them drenched and cold. Many boys, however, found this solitude a welcome time of play and exploration. They raced on horseback, devised all sorts of games, and tested their abilities with a range of contests. There were some dangers, although only rarely from the wild animals, snakes, bandits, and Indians feared by those unacquainted with life in the rural West. A fall from a horse or some other injury while herding in solitude could lead to serious difficulties. For the most part, however, these young cowboys had little to fear during their hundreds of hours on the range. Instead, many looked back and said they had developed a remarkable independence and self-confidence. Required to rely on their abilities and frequently faced with unexpected situations, they believed they came away with a strengthened sense of themselves and their abilities. One boy of nine, told by his father to find and bring back some horses the boy had allowed to wander away, spent a
week on his own locating and returning them. Many accounts tell of sons, their self-reliance fed by time at herding, leaving home when barely in their teens. Young cowboys worked at a variety of other tasks. They helped in roundups, the gathering of herds in the spring and sometimes in the fall in order to count, brand, and doctor cattle and to select those to send to market. They took a hand in making barbed-wire fences to control the movement of cattle and to keep them out of fields and gardens, and they worked on windmills that pumped water into tanks for the thirsty animals. Except for the heaviest labors, in fact, no job was thought beyond the abilities of boys not yet in their teens. From 1867 until 1878, at least 2 million head of cattle were driven northward out of Texas to towns in Kansas, Nebraska, Colorado, Wyoming, and the Dakotas, where they were sold and sent by rail to slaughterhouses in the East. Boys as young as fourteen and sometimes younger took part in these famous cattle drives. The season of the drives—summer—was also the busiest time on a ranch, so the animals were usually contracted to businessmen who took them on the trail. These drovers in turn hired men and boys for the trip north. On cattle drives, boys were usually given the most menial tasks, although some of their jobs were vital. Five or six horses were needed for each cowboy, and responsibility for the horse herd, called the “remuda,” often was given to the youngest members of the crew. Boys also helped the cook in preparing the monotonous meals fed to workers. Gradually, they would be introduced to driving the cattle herd, usually about 1,500 head, along the trail to market. As with herding in the countryside, this labor was much less dangerous than it seemed to outsiders, and serious injuries or deaths to young cowboys on the trail were extremely unusual. As commercial ranching spread throughout most of the plains and Southwest and as railroads reached into the western hinterlands, long cattle drives were no longer necessary. Ranchers devoted more effort to breeding the most marketable types of animals. Children of both sexes continued to contribute their labor under these new circumstances. Cultivating and cutting hay became a crucial part of many operations, for instance, and young boys often made up much of this labor force. Commercial ranching also expanded outside the West, especially into parts of the South. Here, too, youthful cowboys took a hand in work not much different from others a century or two in the past. New technologies have brought some changes to cattle raising, but the work of ranching has been affected remarkably little over the decades. On family ranches throughout the rural United States, young boys continue to play an important part in the day-to-day labor in this distinctive way of life.
Elliott West
See also Farm Boys; Frontier Boyhood; Jobs in the Nineteenth Century
References and further reading
Cleaveland, Agnes Morley. 1977. No Life for a Lady. Lincoln: University of Nebraska Press.
Dale, Edward Everett. 1959. Frontier Ways: Sketches of Life in the Old West. Austin: University of Texas Press.
Hampsten, Elizabeth. 1991. Settlers’ Children: Growing Up on the Great Plains. Norman: University of Oklahoma Press.
Jordan, Terry. 1993. North American Cattle-Ranching Frontiers: Origins, Diffusion and Differentiation.
Albuquerque: University of New Mexico Press.
Osgood, Ernest Staples. 1929. The Day of the Cattleman. Minneapolis: University of Minnesota Press.
Peavy, Linda, and Ursula Smith. 1999. Frontier Children. Norman: University of Oklahoma Press.
West, Elliott. 1989. Growing Up with the Country: Childhood on the Far Western Frontier. Albuquerque: University of New Mexico Press.
D

Discipline
Discipline is best defined as a system of training that is expected to produce orderly behavior. The disciplinary efforts of parents, teachers, and others in positions of authority have greatly affected the experience of generations of American boys. Parents in particular strive to train boys in ways that will result in desirable behavior and characteristics in the future man as well as the boy. Discipline of boys is thus an important part of gender socialization, or the ways in which boys learn the ideal of masculinity prevalent in their sociohistorical context. Trends in preferred disciplinary techniques are related to changes in the circumstances and goals of childrearing. Corporal punishment was the preferred disciplinary technique for most American parents in the colonial period. Parents in the early republic sought to give their children at an early age the self-control necessary for the construction of a virtuous, self-ruled citizenry. Mid-nineteenth-century disciplinary techniques focused on the development of internalized self-restraint. Antebellum reforms against corporal punishment together with changes in the economy and domestic life led to the development of nonpunitive disciplinary techniques known as moral suasion. Concerns about effeminacy in the late nineteenth century prompted an emphasis on discipline for young boys and the development of new social organizations, such as the Boy Scouts, aimed at instilling manly self-control among youths. The first few decades of the twentieth century were marked by a renewed emphasis on discipline in early childhood, this time based on scientific principles of behaviorism and enacted through strict scheduling. Beginning in the 1930s, a climate of increasing permissiveness prevailed until the late 1950s, when concerns about overindulgent mothers who spoiled and emasculated their boys ushered in the current mix of disciplinary techniques, which includes, for many parents, corporal punishment as well as nonpunitive techniques based on reasoning and discussion. During the colonial period, corporal punishment was a legitimate way to punish deviant behavior in children and adults and was the primary means of disciplining boys. Laws and statutes upheld the right of teachers and masters to physically punish the boys who were their wards. Fathers were the ultimate authority and the disciplinarians of the home. Corporal punishment was often combined with practices, such as whippings performed in front of other people, meant to induce feelings of shame in wrongdoers. The incidence of whippings varied by religion and region. Religious leaders who called for fathers to break the emerging wills of their children
used biblical rationales for discipline that reflected the belief that self-will was sinful and that children were innately depraved. In some Protestant families, the goal of childrearing was to break the child’s will so that it might submit completely to an external authority: severe corporal punishment was sometimes used to obtain the child’s submission in infancy and early childhood. The childrearing practices of genteel southern families of the eighteenth century were shaped by a masculine code of honor with an underlying fear of effeminacy. Parents encouraged their young men to behave in an aggressive, even belligerent manner and were likely to use corporal punishment. Corporal punishment was also used throughout the South in the discipline of slave boys. After the Revolution, American parents sought to meet the challenge of self-rule and stabilize the new republic by blending enlightened European childrearing techniques, which recast the child as malleable and free from innate depravity, with Protestant religious beliefs. The goal of childrearing in the early republic was the creation of virtuous citizens, capable of self-rule through moral restraint and the submission of individual desires to the public good. Because men were to be leaders of the new republic, the discipline of boys was of particular concern. New childrearing techniques that emphasized training the child to control his or her own will were popularized in the United States through the writings of the English philosopher John Locke (1632–1704), among others. Locke argued that the power of self-control and the ability to deny oneself the satisfaction of unreasonable desires could be attained by habit formation at an early age. This disciplinary technique often took the form of
small lessons in self-denial and obedience to parental authority that parents would set for infants and small children. Locke did not favor the use of corporal punishment to attain submission of the will. The early republican focus on submission of individual interests to the public good gave way by the mid-nineteenth century to a concern with fostering self-directed, industrious activity in combination with disciplined self-restraint. Self-control rather than submission to authority was the goal of nineteenth-century American discipline. This shift was related to economic and social changes brought about by the emergence of industrial capitalism in the United States. Industrialization eroded the material basis of parental authority: with the separation of most economic production from the home, parents no longer controlled the life chances of their children. Disciplinary techniques that aimed at achieving the child’s submission to external authority threatened the initiative and self-direction that a boy would need to make his way in an industrial, capitalist society. Nineteenth-century parents and teachers were thus primarily concerned with the development of character, which referred to the potential within each individual to internalize moral restraints. Such an internalized moral code offered a secure and flexible basis for order in a changing society. In the antebellum period, the corporal punishment of children generated widespread public criticism. Popular antebellum writers on childrearing criticized parents and teachers who whipped and caned children and recommended the limited use of corporal punishment. This shift in favored disciplinary techniques was related to a middle-class romanticization of domesticity, idealizing the home as a place of harmony and cooperation ruled over by a loving, gentle mother. The feminization of childrearing and of the teaching profession meant that women were now the primary disciplinarians at home and in the school. Whipping or spanking children ran counter to popular perceptions of the role and nature of women. Although nineteenth-century discipline varied with individual children and parents, it appears that boys were likely to be disciplined earlier and more severely than girls because it was thought that boys were naturally less tractable. The nineteenth-century perception of women as the embodiment of gentleness and virtue made the physical chastisement of girls repugnant, and boys thus received more whippings than girls at home and at school. Although these changes in childrearing in the nineteenth century affected the discipline of white, middle-class children, enslaved boys and girls in the South continued to be whipped and beaten, often by their mistresses as well as their masters. In the North, parents in patriarchal immigrant families or those living on the edge of poverty also continued to rely on corporal punishment. Yet teachers and parents increasingly sought to discipline middle-class Anglo-American children through moral suasion, not corporal punishment. Parents and teachers experimented with a range of disciplinary techniques that were psychologically, not physically, punitive. Deprivation of minor privileges became the favored form of punishment, and explanation, reasoning, and avoidance of infractions by distraction and humoring were to be substituted for punishment as much as possible. Parents sought to develop guilt in wrongdoers through emotional manipulation, so that children would learn to chastise and regulate themselves. Parents sometimes used isolation to induce feelings of guilt in wrongdoers, often followed by a scene of repentance and forgiveness. As corporal punishment and use of communal shame declined, parents could resort to indirect means of discipline such as restriction of the child’s diet; regulation of children’s bodily functions, including the use of purgative medicines; and surveillance and control of children’s daily activities. In the late nineteenth and early twentieth centuries, psychologists such as G. Stanley Hall and other social critics expressed fears about the increasing effeminacy of American boys raised in the female worlds of home and school and called for less sentimental treatment of boys by parents and teachers. Different forms of punishment were recommended for boys and girls, with whipping or corporal punishment of boys expressly recommended by some advisers. For men worried that middle-class boys were becoming too soft, organized sports, such as the supervised athletic programs begun by many social agencies between the 1890s and 1910s, promised to rebuild boys’ strength, courage, and persistence. Control of boys’ activities was the main objective of such programs, especially those for lower-class boys. Recreation specialists emphasized that supervised team sports would teach obedience to rules and to coaches. Young Men’s Christian Associations (YMCAs) and Sunday school athletic leagues sponsored basketball and baseball for middle-class boys, boys’ clubs and settlement houses did the same for working-class boys, and municipal playgrounds served varied classes. The Boy Scouts of America, founded in 1910, exemplified the combination of
submission to discipline and enhanced masculinity that marked these new activities for boys: Boy Scouts accepted close adult control of their activities in return for outdoor recreation and assurances that they were manly fellows. Parents, teachers, and childrearing advisers returned to slightly harsher disciplinary techniques for both boys and girls in the early twentieth century. John Watson and other behaviorists advised parents to rear their children according to a strict disciplinary regime, which accorded little room for variable temperaments among individual children and prohibited much parental warmth and interaction with infants and small children. Behaviorists promised parents nearly complete control of their children’s bodies and minds through careful adherence to scientific principles of childrearing, which included precise schedules for food, sleep, and other bodily functions. Early twentieth-century writers on childrearing argued that even young infants were capable of achieving self-control if properly disciplined. Early twentieth-century memoirs and other historical evidence suggest that parental discipline may have been stricter than recommended by childrearing advisers. For example, mothers in rural New York commonly used punishments that had been disparaged by child care experts since the mid-nineteenth century, some of which, such as tying them up in the front yard, entailed shaming children. Although almost all childrearing experts in the post–World War II era advocated mild disciplinary techniques and proscribed corporal punishment, there was always an unresolved tension between too much discipline and too little. By 1957 even Benjamin Spock, one of the most vocal and popular advocates of mild
discipline, admitted that American parents of the 1950s were too permissive and rewrote his baby book to reinforce the importance of firm discipline in rearing children. He conceded that his previous advice had been a reaction to the rigidity of behaviorist childrearing prescriptions. Conservative critics blamed the countercultural behavior of young men and women in the youth movements of the 1960s and 1970s on the failure of midcentury parents to adequately discipline their children. Contemporary child psychologists and other childrearing experts favor methods that seek to discipline children in a nonpunitive manner and try to keep a child oriented positively toward the parent. Nevertheless, national surveys indicate that most parents continue to use some form of physical discipline or corporal punishment in addition to nonpunitive disciplinary techniques. Unlike earlier periods of American history, there is little evidence that today’s boys are more likely to be physically disciplined than girls. Among the most popular alternatives to physical punishment used in today’s homes and schools are reasoning, discussion, and demonstration of logical consequences; setting of boundaries, rules, and limits; and time out and isolation. These methods of childrearing and discipline are believed to foster a sense of responsibility, conscience, and self-discipline in children. At the same time, some of these methods also recognize the value of moderate levels of shame and guilt in training children to achieve self-control.
Roblyn Rawlins
See also Boy Scouts; Muscular Christianity; Sports, Colonial Era to 1921; Sports, 1921 to the Present; Young Men’s Christian Association
References and further reading
Clement, Priscilla Ferguson. 1997. Growing Pains: Children in the Industrial Age, 1850–1890. New York: Twayne Publishers.
Glenn, Myra C. 1984. Campaigns against Corporal Punishment: Prisoners, Sailors, Women, and Children in Antebellum America. Albany: State University of New York Press.
Grant, Julia. 1998. Raising Baby by the Book: The Education of American Mothers. New Haven: Yale University Press.
Hiner, N. Ray, and Joseph M. Hawes, eds. 1985. Growing Up in America: Children in Historical Perspective. Urbana: University of Illinois Press.
Macleod, David I. 1998. The Age of the Child: Children in America, 1890–1920. New York: Twayne Publishers.
Reinier, Jacqueline S. 1996. From Virtue to Character: American Childhood, 1775–1850. New York: Twayne Publishers.
Rotundo, E. Anthony. 1993. American Manhood: Transformations in Masculinity from the Revolution to the Modern Era. New York: Basic Books.
Disease and Death
Of all the revolutions that have transformed American boyhood since the founding of the United States, few have been so profound as the dramatic reduction that has taken place in the likelihood of death, disease, or physical incapacitation during infancy, childhood, and adolescence. Today, the death rate of the nation’s boys is only a tiny fraction of what it was through most of the eighteenth and nineteenth centuries, when somewhere around 20 percent of all boys born in the United States failed to survive the first five years of life and close to one-quarter failed to make it to age twenty. Similarly, in real terms, the incidence of serious disease and physical handicap has declined immensely. The so-called childhood diseases are infrequently contracted and almost never fatal, and unlike in past centuries, it is no longer common to meet children who are blind or deaf from infection or accident; whose frames are stunted or twisted from rickets, scoliosis, or tuberculosis; who are disfigured or partially paralyzed by accidents; or who are condemned to a life as an invalid by hearts damaged by a bout with diphtheria or scarlet fever. Most of this great improvement in boys’ health and survival took place in the twentieth century and was part of a larger mortality and epidemiological transition that saw life expectancy improve at all ages and infectious disease retreat from being the great killer of Americans. Driven initially by rising nutrition, living, and educational standards and by state-sponsored environmental sanitation, improvements in boys’ health and survival have increasingly been the consequence of developments in scientific medicine. Perhaps nothing so characterized boyhood at the beginning of the twentieth century as its precariousness. Approximately 13 percent of all boys born survived less than twelve months, prey to premature birth, birth trauma, congenital deformities, respiratory diseases, or, most likely, any of the variety of gastrointestinal disorders that carried away tens of thousands of American babies each year, especially during the hot summer months. Early childhood was less risky for boys but hardly safe. Almost 7 percent of boys aged one to five who were alive in the United States during the first year of the century were dead by the second. Causes of death among these boys were many, but particularly prominent were respiratory diseases, especially pneumonia, which killed over 21 percent of the young children who died each year (Preston and Haines 1991; Meckel 1996).
Also accounting for over one-fifth of the annual deaths of boys each year were so-called childhood diseases such as the measles, scarlet fever, smallpox, whooping cough, and the most deadly and frightening, diphtheria. These diseases, along with accidents, were also the chief causes of death of the thousands of boys aged five to fifteen who died each year. Of course, most boys at the turn of the century survived. But for many of those who did, childhood was marred by physical handicap or chronic discomfort and pain. Every year, uncounted numbers of infants were condemned to lives as invalids by congenital deformity or birth trauma. Boys who were blind or deaf—from infection or accident—were common. So too were boys whose physical development was marred by rickets, scoliosis, or tuberculosis or whose hearts had been weakened by diphtheria or scarlet fever. Also common were boys who hobbled about on crutches because of a crippling accident or because they suffered partial paralysis as a result of poliomyelitis, a disease that was beginning to make its presence felt and would soon replace diphtheria as the most terrifying childhood scourge. Less terrifying but almost universal were the pain and discomfort from a variety of chronic health problems. Among school-age boys, bacterial and parasitic skin diseases were endemic; discharges from eye, ear, and sinus infections were so common that they often did not occasion comment; and ongoing mouth and tooth pain from rotting teeth was considered a normal part of childhood. Turn-of-the-century painters and writers might have celebrated the carefree robustness of boyhood, but the reality for many of the nation’s boys was often quite different.
One hundred years later, at the beginning of the twenty-first century, less than eight-tenths of 1 percent of all boys born in the United States fail to reach age one and less than 2 percent fail to make it to twenty. This dramatic reduction in mortality among boys was closely mirrored by one in girls’ mortality, although for biological, genetic, and behavioral reasons, boys in developed countries have long faced a higher risk of death than did girls. Indeed, today, American boys aged one to four and five to fourteen have, respectively, a 23 percent and 43 percent greater chance of dying than girls those ages (Zopf 1992; NCHS 2000). The reduction in boys’ mortality also occurred within the context of an overall decline during the twentieth century in the death rates of Americans of all ages and as part of an epidemiological transition in which chronic and degenerative diseases supplanted infectious disease as the major killers of Americans. Although the late-twentieth-century resurgence of tuberculosis and the appearance of acquired immunodeficiency syndrome (AIDS), Lyme disease, and some deadly disorders caused by new strains of streptococcal and E. coli bacteria offer vivid reminders that the microbial threat to health in the United States has not been banished, infectious disease in general is no longer the handmaiden of death. This is particularly true for the young, for whom infectious disease is not the mass killer it was earlier in the century. Although the mortality of boys dropped relentlessly during the twentieth century, it did not do so uniformly, either over time or among all boys. Indeed, the overall decline of mortality among boys was marked by some short-term fluctuations and reversals and by more rapid declines
in some periods than others. Unlike the middle-aged and elderly, for whom mortality declined most sharply in the middle and last thirds of the century, children experienced their greatest improvement in life expectancy before 1930, with gains ranging from just under 60 percent for those aged five through fourteen to more than 70 percent for those aged one to four (Grove and Hetzel 1968; Meckel 1996). The one exception to this pattern has been the mortality of neonates, or those infants under one month. Although the mortality of older infants fell rapidly during the first third of the century and has been relatively stable since midcentury, that among neonates has dropped just as fast at the end of the century as at the beginning. All groups of American boys enjoyed the twentieth-century decline in mortality, but they did not do so equally. As with so many other social benefits, improving health among boys during the twentieth century has correlated closely with socioeconomic status. Moreover, the strength of that correlation may have increased over the course of the century, as the quality of medical care improved and the ability to purchase it had an ever-greater impact on health. Race has also been a significant determinant of mortality and incidence of disease among boys in the twentieth century. In 1900, male infants and children classified as nonwhite were between 50 and 90 percent more likely to die that year than were their white counterparts. At the end of the century, although the death rates of both groups have dropped dramatically, the disadvantage persists and seems to have grown. In 1996 infant mortality among blacks was 2.3 times higher than among whites. Similarly, the mortality of black boys aged one to four and four to
fifteen was, respectively, 1.9 and 1.6 times higher than that of their white counterparts (Preston and Haines 1991; NCHS 2000). Closely connected to the variations that characterized declining mortality among boys in the twentieth century have been three major changes in the leading causes of death among male infants, children, and adolescents. The first is a dramatic decline, especially in the first third of the century, in the death rates from infectious disease. The second is a gradual and uneven increase in death rates from external causes, that is, accidents and injuries, homicide, and suicide. The third is an increase in the proportion of infant mortality attributable to congenital malformations and anomalies, premature birth, low birth weight, and other so-called diseases of early infancy and a corresponding decrease in the proportion of infant mortality caused by gastrointestinal and respiratory diseases. In 1900, infectious gastrointestinal and respiratory diseases killed close to half of all infants who died that year. Pneumonia was the leading killer of children aged one to four and diphtheria the leading killer of children five through fourteen. Other infectious diseases also caused large numbers of deaths among boys. Measles, whooping cough, scarlet fever, meningitis, and the various respiratory diseases cut down thousands of children aged one to fourteen. Indeed, infectious disease accounted for more than 60 percent of all deaths of boys less than fifteen years of age. Today deaths of American infant boys from gastrointestinal disease are exceedingly rare, and the proportion of infant deaths caused by infectious respiratory diseases has shrunk from more than 20
percent to less than 2 percent. The biggest killers of infant boys today are congenital anomalies, gestational complications, and disorders related to prematurity and low birth weight. Similarly, infectious disease accounts for only a tiny fraction of the deaths suffered by boys aged one through fourteen. Of all the many infectious diseases that threatened and took the lives of young boys at the beginning of the century, only pneumonia and influenza remain significant, though much less so than they were. If a boy dies today, it is likely that the cause will be cancer or a congenital anomaly or, even more likely, external. Indeed, at the end of the twentieth century, unintentional injuries, homicide, and suicide rank as the first, third, and fifth causes of death among boys five through fourteen. Behind the dramatic twentieth-century reduction of mortality and morbidity among American boys were a wide array of socioeconomic, behavioral, political, and technological and scientific developments. In the first third of the century, declining fertility and better nutrition and housing, accompanying a rising standard of living, contributed significantly to the reduction of disease and death among boys. So too did environmental improvements, especially in cities, brought about by the construction of water supply and sewer systems and the implementation of effective refuse removal. Also crucial was the work of public health officials and their allies in social work and medicine in combating the transmission of diseases among schoolchildren, in controlling milk-borne diseases, and in educating the public in the basics of preventive infant and child hygiene. For many of the same reasons they declined in the first third of the twentieth century, mortality and disease among
male infants and children continued to drop in the second and third. But beginning in the 1930s, the development and application of specific medical interventions and technologies played an increasingly important role. The perfection of effective fluid and electrolyte therapy to combat acidosis and dehydration in infants; the development and dissemination of vaccines providing immunity against traditional killers like pertussis and polio; the discovery and employment of sulfonamides and antibiotics to reduce fatalities from bacterial infections; the increasingly sophisticated use of vitamins and minerals to aid metabolism and combat such childhood scourges as rickets, pellagra, and pernicious anemia; and the design and implementation of surgical, therapeutic, and intensive care techniques to correct congenital deformities and to counter the risks faced by premature and low-birth-weight infants have worked together to reduce the likelihood that an American boyhood will be scarred by disease or ended prematurely by death. American boys still contract serious disease and die. Death has not been banished from boyhood entirely but has been made an increasingly infrequent visitor. Long associated with life and robustness, boyhood in the twentieth century has significantly closed the gap between symbolism and reality.
Richard A. Meckel
See also Accidents; Orthodontics; Poliomyelitis; Poverty; Sexually Transmitted Diseases; Suicide
References and further reading
Crimmins, Eileen. 1981. “The Changing Pattern of American Mortality Decline, 1940–1977.” Population and Development Review 7: 229–254.
Ewbank, Douglas. 1987. “History of Black Mortality and Health before 1940.” Milbank Quarterly 65, supp. 1: 100–128.
Fingerhut, Lois, and Joel Kleinman. 1989. Trends and Current Status in Childhood Mortality. Washington, DC: National Center for Health Statistics.
Grove, Robert D., and Alice M. Hetzel. 1968. Vital Statistics Rates in the United States, 1940–1960. Washington, DC: National Center for Health Statistics.
Hammonds, Evelynn Maxine. 1999. Childhood’s Deadly Scourge: The Campaign to Control Diphtheria in New York City, 1880–1930. Baltimore: Johns Hopkins University Press.
Meckel, Richard A. 1990. Save the Babies: American Public Health Reform and the Prevention of Infant Mortality 1850–1929. Baltimore: Johns Hopkins University Press; 1998, Ann Arbor: University of Michigan Press.
———. 1996. “Health and Disease.” Pp. 757–786 in Encyclopedia of the United States in the Twentieth Century, vol. 2. Edited by Stanley I. Kutler et al. New York: Scribner’s.
NCHS (National Center for Health Statistics). 2000. Health, United States, 2000. Hyattsville, MD: NCHS.
Preston, Samuel H., and Michael R. Haines. 1991. Fatal Years: Child Mortality in Late Nineteenth-Century America. Princeton, NJ: Princeton University Press.
Retherford, Robert D. 1975. The Changing Sex Differential in Mortality. Westport, CT: Greenwood.
Rogers, Richard G. 1992. “Living and Dying in the U.S.A.: Sociodemographic Determinants among Blacks and Whites.” Demography 29: 287–303.
Zopf, Paul E., Jr. 1992. Mortality Patterns and Trends in the United States. Westport, CT: Greenwood.
Divorce
In colonial America, divorce was rare and usually had to be enacted by legislation. Families could also be disrupted by the desertion of a father or, more likely, by the death of a parent since mortality rates
in certain areas were very high. Not until the late nineteenth and early twentieth centuries did mortality rates significantly decline. Yet the same time period also brought a rise in divorce. Between 1867 and 1929, the divorce rate in the United States increased 2,000 percent; by 1929 more than one marriage in six ended in divorce each year (May 1980, 2). By the twenty-first century, divorce had become an American institution almost as common as marriage. In 1998 the U.S. Census Bureau reported that approximately 19.4 million adults were divorced, with the likelihood that 43 percent of new marriages that year would end in divorce (Current Population Survey, National Center for Health Statistics and U.S. Census Bureau, 1998). Divorce is readily available because of no-fault legislation and is considered socially permissible. Yet the prevailing theme of studies that assess the effect of parental divorce on children is that its deleterious and long-lasting effects are suffered particularly by boys. Early work treated divorce as a unidimensional traumatic event in which the absence of the noncustodial parent, usually the father, was the cause of its negative effect on boys. As divorce became more widespread, the myth prevailed that boys without male role models would be either effeminate or juvenile delinquents (Lowery and Settle 1985, 455). However, the focus has gradually shifted from family structure to family processes. Effects of divorce on children have come to be seen as indicative of dysfunctional family processes, marital conflict, or children’s problems prior to dissolution of the marriage in addition to the stress of the breakup and its aftermath (Morrison and Cherlin 1995, 800). A complete picture of the effect of the divorce process on boys, then, requires an
examination of the period before the breakup, the event itself, and the period after the breakup of a marriage. Most of the research on children’s adjustment during the process of the divorce itself has focused on the short-term effects of the custody of the mother, who may be in a difficult relationship with the noncustodial father (Freeman 1985, 130). This stage has been referred to as the “crisis period,” when adults may have a “diminished capacity to parent” (Morrison and Cherlin 1995, 802). During this time of initial transition, children and adults may experience hostility between the spouses as well as loss and uncertainty. Custodial mothers may become depressed and overburdened because of a substantial loss of income, strained resources, and limited social support. While they are less capable of providing a nurturing, supportive, and consistent household, custodial mothers may parent ineffectively and fail to supervise children. Poor parenting techniques and drastic household changes may cause initial adjustment problems for children, including periods of depression, anger, anxiety, behavior pathology, and low self-esteem. The relationship between boys’ behavioral responses to divorce and their custodial mothers’ reactions may flow from both directions. Having a mother with predominantly negative moods may exacerbate a boy’s difficulty with adjustment to a new life situation, and having a boy with behavioral problems may contribute to a mother’s emotionality in trying to cope with the divorce. Boys in mother-custody arrangements may be more likely than girls to experience depression and withdrawal (Howell, Portes, and Brown 1997, 143). They may be more likely than boys in the custody of their fathers to be aggressive and to have problems in behavior and self-esteem (Camara and Resnick 1989). They are also more likely to have behavioral problems than boys in intact families (Hetherington 1991). Boys may adjust poorly following divorce because they are usually in the custody of their mothers and may have limited involvement with their fathers (Lowery and Settle 1985, 457–458). Boys need a regular, ongoing, positive relationship with their fathers to develop a valued sense of masculinity, internalize controls over behavior, achieve appropriate development of conscience, and perform up to their abilities academically (Kalter et al. 1989, 606). Without the presence of a father, boys may fall into poor social and academic skills and demonstrate aggressive behavior. These negative outcomes for boys immediately following divorce are not necessarily a product of the divorce itself, however. The quality of the parent-child relationship and the personality characteristics of each individual before the breakup of the marriage also contribute to the adjustment of boys after a divorce. The few studies that address the relationship of the parent and child prior to the divorce present inconsistent findings. Some researchers (Block, Block, and Gjerde 1986, 829) asked whether the breakdown of the nuclear family that results from divorce or the conflict and instability found in troubled but still intact families have implications for children’s personality development. Analysis of this issue has revealed that as early as eleven years before divorce, children from to-be-divorced families exhibited more behavioral problems than children from intact families. Boys from to-be-divorced families also showed problems with impulse control, stubbornness, and
restlessness as early as ages three, four, and seven (Shaw, Emery, and Tuer 1993). Boys’ behavior problems prior to the divorce, rather than the divorce alone, then, may contribute to their difficulties with adjustment after the divorce. Yet other researchers have failed to find any differences in the behavioral problems of boys prior to divorce in to-be-divorced families and those in intact families. Instead, they focused on the relationship between parenting skills and boys’ adjustment after a divorce. Parents of boys in to-be-divorced families displayed significantly less concern and higher levels of rejection, economic stress, and parental conflict before the divorce than parents in intact families (Shaw, Emery, and Tuer 1993, 130). Poor parenting skills, such as parental detachment, maternal resentment, and conflict between parents and their sons, may be linked to boys’ struggles with adjustment following the divorce (Block, Block, and Gjerde 1988). These studies highlight the importance of parenting skills and the need to pinpoint child and parental factors present before the divorce that may adversely affect boys’ capacity to cope afterwards. Child and family assessments conducted within the first two or three years after a divorce indicate that boys experience more negative effects than girls do, but this hypothesis should be qualified by specifying the circumstances under which the pattern holds (Zaslow 1988, 375). For example, factors that mediate the long-term effects of divorce on boys’ adjustment include the child’s age at the time of divorce, the length of time since the divorce, and changes in the family structure. Boys’ perceptions of parental divorce are affected by their cognitive capacity to
make sense of the event according to their age when dissolution of the marriage occurs. Research presents inconsistent findings, however, on the effect of divorce on different age groups: some studies indicate that adjustment problems increase with age, some claim that they decrease with age, and some argue that each age group experiences its own unique problems (Howell, Portes, and Brown 1997, 145). Older children may have a more difficult time coping with divorce because they hold painful recollections of parental hostility and conflict compared to children who are younger at the time of divorce (Wallerstein, Corbin, and Lewis 1988). Yet older children also have a greater capacity to understand the divorce in a realistic manner, using their internal locus of control and more interpersonal knowledge (Kurdek and Siesky 1980). Indeed, older children may be better adjusted after divorce than younger children are. However, children of all ages respond to divorce with problematic behavior; younger children may act out more, whereas older children tend to be more depressed (Hodges and Bloom 1984). Age differences in adjustment after divorce further vary by gender. Simply put, boys who experience divorce in their preadolescent years seem to fare worse than preadolescent girls. However, boys who cope with divorce during adolescence are better adjusted than adolescent girls. Regardless of age at time of divorce, boys seem to adjust better as more time elapses (Wallerstein and Kelly 1980). Whether this occurs may depend on the restructuring of the custodial family through remarriage. The presence of a stepfather in the custodial home generally buffers boys from negative developmental outcomes. For
boys, a stepfather may serve as a male substitute to fill the gap left by their absent biological father. This relationship is more beneficial for younger sons, who may be able to attach more readily to the presence of a new parental figure than older sons can (Wallerstein and Kelly 1980). Boys who experienced their mothers’ remarriage between ages twelve and fifteen faced more delinquency problems than their peers who were from families that remained intact (Pagani et al. 1998, 495). By age fifteen, however, this kind of problematic behavior decreased. In general, preadolescence is a vulnerable time for boys in which they may be prone to behavior problems and have a difficult time adjusting to divorce. Boys’ negative adjustment, then, is due to factors other than whether their mothers remarry. Most researchers agree that boys are adversely affected by divorce, but more work needs to be done to determine mediating factors that may insulate boys from poor adjustment outcomes. As divorce rates continue to rise, more refined research is essential to direct clinical practice and to guide court decisions in divorce and custody disputes. Divorce must be evaluated as a process that affects boys before, through, and after the marital breakup, indeed even into adulthood when males make their own marital choices. More research needs to explore the extent to which father- or mother-custody households affect boys’ adjustment. The effect of relationships with noncustodial fathers and their sons’ perceptions of their visitation experiences should be studied. Comparisons should be made on how parental use of mediation as an alternative to settling divorce disputes in court affects boys’ adjustment. Mediation may be a viable mechanism to reduce levels of hostility
between spouses and to improve boys’ adjustment. Future research should explore how boys who experience divorce at different ages are affected by changes in the structure of the family over time. Ideally, a comprehensive evaluation of divorce should be undertaken that encapsulates the entire process instead of utilizing the piecemeal approach of much of the current work. A complete picture of the divorce process will help boys stand a fighting chance against the perils of divorce.
Debra Lynn Kaplan
References and further reading
Ahrons, Constance, and Richard B. Miller. 1993. “The Effect of the Postdivorce Relationship on Parental Involvement: A Longitudinal Analysis.” American Journal of Orthopsychiatry 63, no. 3: 441–450.
Biller, H. B. 1981. “Father Absence, Divorce, and Personality Development.” In The Role of the Father in Child Development. 2d ed. Edited by M. E. Lamb. New York: John Wiley.
Block, Jeanne H., Jack Block, and Per F. Gjerde. 1986. “The Personality of Children Prior to Divorce: A Prospective Study.” Child Development 57, no. 4: 827–840.
———. 1988. “Parental Functioning and the Home Environment in Families of Divorce: Prospective and Concurrent Analyses.” Journal of the American Academy of Child and Adolescent Psychiatry 27: 207–213.
Camara, K., and G. Resnick. 1989. “Styles of Conflict Resolution and Cooperation between Divorced Parents: Effects on Child Behavior and Adjustment.” American Journal of Orthopsychiatry 59, no. 4: 560–575.
Emery, R. E. 1988. Marriage, Divorce, and Children’s Adjustment. Newbury Park, CA: Sage.
Emery, R. E., E. M. Hetherington, and L. F. Dilalla. 1984. “Divorce, Children, and Social Policy.” Pp. 189–266 in Child Development Research and Social Policy. Edited by H. W. Stevenson and
A. E. Siegel. Chicago: University of Chicago Press.
Freeman, Evelyn B. 1985. “When Children Face Divorce: Issues and Implications of Research.” Childhood Education 62, no. 2: 130–136.
Herzog, E., and C. Sudia. 1973. “Children in Fatherless Families.” In Review of Child Development Research: Vol. 3, Child Development and Child Policy. Edited by B. M. Caldwell and H. N. Riccuiti. Chicago: University of Chicago Press.
Hetherington, E. M. 1979. “Divorce: A Child’s Perspective.” American Psychologist 34: 851–858.
———. 1991. “Presidential Address: Families, Lies, and Videotapes.” Journal of Research on Adolescence 1, no. 4: 323–348.
Hodges, W. F., and B. L. Bloom. 1984. “Parents’ Reports of Children’s Adjustment to Marital Separation: A Longitudinal Study.” Journal of Divorce 8, no. 1: 33–50.
Howell, Susan H., Pedro R. Portes, and Joseph H. Brown. 1997. “Gender and Age Differences in Child Adjustment to Parental Separation.” Journal of Divorce and Remarriage 27, nos. 3–4: 141–158.
Kalter, Neil, Amy Kloner, Shelly Schreier, and Katherine Okla. 1989. “Predictors of Children’s Postdivorce Adjustment.” American Journal of Orthopsychiatry 59, no. 4: 605–618.
Kurdek, L., and A. E. Siesky. 1980. “Children’s Perceptions of Their Parents’ Divorce.” Journal of Divorce 3, no. 4: 339–378.
Lowery, Carol R., and Shirley A. Settle. 1985. “Effects of Divorce on Children: Differential Impact of Custody and Visitation Patterns.” Family Relations: Journal of Applied Family and Child Studies 34, no. 4: 455–463.
May, Elaine Tyler. 1980. Great Expectations: Marriage and Divorce in Post-Victorian America. Chicago: University of Chicago Press.
Morrison, Donna R., and Andrew J. Cherlin. 1995. “The Divorce Process and Young Children’s Well-Being: A Prospective Analysis.” Journal of Marriage and the Family 57, no. 3: 800–812.
Pagani, Linda, Richard E. Tremblay, Frank Vitaro, Margaret Kerr, and Pierre McDuff. 1998. “The Impact of Family
213
Transition on the Development of Delinquency in Adolescent Boys: A 9Year Longitudinal Study.” Journal of Child Psychology and Psychiatry and Allied Disciplines 39, no. 4: 489–499. Shaw, Daniel S., Robert E. Emery, and Michele D. Tuer. 1993. “Parental Functioning and Children’s Adjustment in Families of Divorce: A Prospective Study.” Journal of Abnormal Clinical Psychology 21, no. 1 (February): 119–134. U.S. Census Bureau. Marital Status and Living Arrangements. Available: http://www.census.gov/population/ www/socdemo/ms-la.html. Wallerstein, J. S., S. B. Corbin, and J. M. Lewis. 1988. “Children of Divorce: A 10-Year Study.” In Impact of Divorce, Single Parenting, and Stepparenting on Children. Edited by E. M. Hetherington and Josephine D. Arasteh. Hillsdale, NJ: Erlbaum. Wallerstein, J. S., and J. B. Kelly. 1980. Surviving the Break-up: How Children and Parents Cope with Divorce. New York: Basic Books. Zaslow, Martha J. 1988. “Sex Differences in Children’s Response to Parental Divorce: 1. Research Methodology and Postdivorce Family Forms.” American Journal of Orthopsychiatry 58, no. 3: 355–378.
Douglass, Frederick
One of the foremost antislavery advocates of the nineteenth century, Frederick Augustus Washington Bailey was born a slave in Talbot County on the Eastern Shore of Maryland in February 1818. As a boy he rarely saw and did not clearly remember his mother, Harriet Bailey, who was hired out to work in the fields of a nearby farm. But from her occasional visits, early in his life he formed a picture of her in his mind and later described her as “tall and finely proportioned” and “remarkably sedate in her manners” (Douglass 1855). Douglass could only speculate about the identity of his father, who was almost certainly a
white man. Perhaps he was Aaron Anthony, the master of his mother and manager of the farms of Colonel Edward Lloyd, the largest slaveholder in the area, or perhaps he was one of Lloyd’s adult sons. Fred was raised for the first six years of his life by his grandparents, Betty and Isaac Bailey, who lived in a log cabin in the woods, tucked between two farms. Betty and her children were slaves, but Isaac was free and worked as a sawyer. Fred remembered his grandmother as proficient in making nets and catching fish and in growing sweet potatoes from seedlings. She raised a large family of the children of her five daughters, and Fred spent his early years climbing the ladder to the loft to sleep on pallets with the other children and with them eating meals of cornmeal mush served from a wooden trough and scooped up with oyster shells. Wearing only shirts of rough tow-linen, the children rolled around together in the dusty yard, roamed the woods freely, and cooled off with a swim when they felt like it. In 1824, however, when Fred was six, his grandmother took him to the home plantation of Edward Lloyd, sent him to play with older siblings who were already there, and, when he was distracted, slipped quietly away. Bursting into tears when he discovered her departure, he could not be consoled even by the peach and pear offered by his sympathetic siblings. Young Fred had entered a self-sufficient community of thirteen farms and more than 500 slaves. Most of the slaves lived in the Long Quarter, a row of low buildings teeming with adults and children, but the Bailey children were looked after by Aunt Katy, who kept house for their master, the farm manager, Aaron Anthony. Aunt Katy disciplined the chil-
dren by denying them food, and Fred was hungry much of the time. One of his rare memories of his mother was when she visited and made a sugar cake for him. A captivating child with skill in mimicry, he attracted the attention of Anthony’s young married daughter, Lucretia Auld, who slipped him extra food when he sang beneath her window. Fred’s tasks at Lloyd’s plantation were light. Too young to work in the fields, he drove cows in the evening, cleaned up around the yard, and ran errands for Lucretia. In his leisure time he gravitated toward what he later described as the grandest building he had ever seen, the Great House occupied by Colonel Lloyd and his family. Associating the Great House with plenty of food, Fred formed a friendship with Lloyd’s son, Daniel, five years his senior. Daniel took the slave boy on shooting excursions, where he retrieved the birds. He also allowed Fred to sit in on lessons with Joel Page, the New England tutor who struggled to cure the white boy of speaking like a slave. Forming words along with Daniel, Fred learned the power of literacy and cultured English speech. Imbued with the ambition of learning to read, he also began to wonder why God, as he was told, “made white people to be masters and mistresses, and black people to be slaves.” Later he wrote, “I distinctly remember being, even then, most strongly impressed with the idea of being a freeman some day” (Douglass 1855). In 1826, when Fred was eight, Lucretia and her husband Thomas Auld arranged that he be sent to Baltimore to live with Thomas’s brother, Hugh Auld, and his wife Sophia. Ostensibly, Fred was to care for their two-year-old son, but perhaps the Aulds hoped to secure for him a more promising future. The institution of slav-
ery was more flexible in urban Baltimore than on the plantation: slaves could learn skills, be hired out to earn wages, and even buy their freedom. And Sophia Auld, a devout Methodist, was a kind mistress who supplied Fred with adequate clothes, a good straw bed, and plenty of bread, treating him almost as a brother to her young son. In 1827, however, Aaron Anthony died, and Betty Bailey, her children, and her grandchildren became part of the estate that would be divided among Anthony's children. Fred was whisked back to the plantation for a year, while his fate was determined. Fortunately, he was inherited by Lucretia and Thomas Auld, who may have asked for him, and he was allowed to return to Baltimore. Fred spent five more years there in the secure environment created by Sophia Auld. As she read stories to the two boys, he memorized passages and, matching sounds with words on the page, began to learn to read. When Hugh Auld observed the lessons, he strongly objected that learning would make the boy unfit for a life of slavery. Fred was excluded from the reading sessions and began to be watched more closely. Nevertheless, when he was sent on errands or allowed to play in the street, he continued his spelling lessons by bribing white boys with bread to teach him words in the Webster's speller he carried around in his pocket. At age eleven he could read well enough to match letters on boards in the shipyard with those in the speller, and, in stolen moments when Sophia was at Bible class, he taught himself to write. A year later, he bought a copy of The Columbian Orator with 50 cents he had earned, steeping himself in speeches by Cicero or Cato, English Whigs or George Washington, and dialogues on slavery.
About this time he also began to attend the Sabbath school for black children at the Dallas Street Methodist Church, encountering for the first time literate black adults. Later in his life he would renounce religion, but as a young teenager he was profoundly moved by a white itinerant Methodist preacher and converted by a black minister. After that, he spent long hours discussing religion with “Uncle Lawson,” a black workingman. In 1833, when Fred was fifteen, the Aulds may have realized how difficult it would be to contain this bright, active boy in slavery in Baltimore, for they sent him back to the Eastern Shore to live with Thomas Auld in the rural town of St. Michael’s. Unfortunately, Lucretia Auld had died, and Thomas’s new wife, Rowena, seemed by comparison a cruel mistress. Once again, Fred was allotted only meager rations of food and learned from his sister Eliza to “appropriate” what he needed, arguing that what his labor helped produce was his. When his master attended a camp meeting and became a Methodist, Fred’s hopes for better treatment and eventual freedom rose but were dashed when he discovered that piety in the household did not mean generosity toward slaves. Fred joined another young man to form a Sabbath school, hoping to teach black boys to read as he had done in Baltimore, but alarmed white members of the Methodist church, comparing him to the insurrectionist Nat Turner, quickly broke up the school. Even more chilling to Fred was Auld’s treatment of the crippled slave, Henny Bailey, whom the master tied up by her lame wrists and whipped with a cowhide, all the while citing Scripture. Finally, Auld resolved that Fred also should be broken and sent him to a nearby tenant farmer, Edward Covey, who, although
also a Methodist, was reputed to be an expert at training slaves. Under the tutelage of Covey, the seventeen-year-old Fred learned for the first time what it was like to be a field hand. The farmer sought to break the boy’s spirit with brutal beatings, which left sores on his back, kept open by the rubbing of his rough shirt. One day when Fred, who had no experience driving oxen, lost control of a team, Covey cut young shoots from a tree, trimmed them with his knife, and flogged Fred until the switches wore out. After six months of such beatings and hard labor under constant supervision, Fred despaired that the aspirations he had formed in Baltimore never would be realized. On a hot August day, when the boy collapsed with sunstroke while fanning wheat, Covey responded with kicks and a heavy blow to his head with a hickory board. In a desperate effort to find redress, Fred took off across the fields to St. Michael’s. But Thomas Auld found excuses for Covey and condemned Fred, sending him back in order that his wages for the half-year served not be lost. In the woods on his weary way back to Covey’s farm, the disconsolate boy encountered Sandy Jenkins, a sympathetic free black, who listened to his story and offered him a root to wear to protect him from the farmer’s blows. The next morning, perhaps enabled by the root, the boy waited for the farmer in the barn, and, now resolved to fight, with a sudden spring wrestled him to the ground. Man and boy were almost equal in weight, and Fred held his own, seizing Covey by the throat until the farmer cried for help. Yet other slaves owned by Covey feigned ignorance of their master’s plight, allowing Fred to wrestle until the hated slave trainer gave up the fight. For the remaining six
months Auld had hired his slave out to Covey, the farmer did not beat the boy again, and Fred’s sense of pride and Baltimore dreams revived. After Fred had served his time with Covey, he was hired out to a kinder master, although he had gained the reputation of a bad sheep that could spoil the flock. Working and testing his strength and skills against those of other young slaves, he began to teach his new friends and anyone else who would join them to read. As they repeated the stirring speeches from The Columbian Orator, Fred and his trusted circle began to consider achieving liberty for themselves. They decided on a plan to steal a canoe, travel along the coves and inlets of the Chesapeake Bay’s Eastern Shore, and slip into Pennsylvania and freedom. On the April morning that Fred and five others planned to leave, however, he discovered to his distress that their plan had been betrayed. A posse on horseback seized the youths and dragged them stumbling behind the horses to Thomas Auld’s store in St. Michael’s and then to jail. Although the young slaves feared that they would be sold to traders and taken south, their masters, who were eager to retain their property, intervened, and Fred returned once more to Thomas Auld. Perhaps Auld realized that his talented eighteen-year-old slave could not be broken, for he promised to send him back to Hugh Auld in Baltimore to learn a trade. If he worked diligently for the wages his master would collect, he could obtain his freedom at the age of twenty-five. Fred became an apprentice in a Baltimore shipyard and then was allowed to hire himself out and collect his own wages. According to the arrangement with Hugh Auld, he would buy his own tools and find his own room and board
but still pay his master $3 a week. When Fred left work for a few days to attend a Methodist camp meeting, however, Auld threatened to end his slave's precarious independence. With Anna Murray, a free black woman he had met in Baltimore, Fred determined on a plan to run away. Obtaining the papers of a free seaman and wearing a sailor's hat and kerchief, twenty-year-old Fred Bailey boarded a train for New York. Shortly after, under the name of Frederick Johnson, he married Anna, who had followed on another train, and the couple continued on to Massachusetts. But only as a husband, father, and free wage earner in the New Bedford shipyards did he become Frederick Douglass, catch the eye of abolitionists, and embark on his career as an orator and writer in the antislavery cause.
Jacqueline S. Reinier
See also African American Boys; Slave Trade; Slavery
References and further reading
Douglass, Frederick. 1855. My Bondage and My Freedom. Reprint, New York: Dover Publications, 1969.
McFeely, William S. 1991. Frederick Douglass. New York: Simon and Schuster.
Drag Racing
Drag racing is a form of automotive competition in which two drivers sprint from a standstill to a fixed point some distance away, usually a quarter mile (440 yards); the first to arrive is the winner. Participants view this activity as recreation, a sport, a business, a means of identity, and a confirmation of mastery, even a matter of “satisfaction, meaning, and self” (Viken 1978). Historically, those seeking identity and mastery by means of drag racing were teenage males enthralled
with speed and power—“hot rodders” driving “souped up” Detroit autos on public streets. But more recently, drag racing has become a popular spectacle with a complex web of commercial relationships. Millions of people attend drag races every year. Billions of dollars change hands because of drag racing. This is no longer a pursuit associated primarily with youthful males, except in its persisting alternative form—impromptu challenges on public streets. There, a drag race is still quite likely to be a matter of two boys playing with high-powered toys. Opinion varies about the origin of the term. In one view, it derives from a county fair event called “dragging,” involving farm boys, teams of horses, and heavy weights. Another view, that it stemmed from racing autos on “the main drag,” is the more likely explanation. Whatever its origins, drag racing on public streets is an activity that has long stirred boyish derring-do. In the classic scenario, youthful males gather with their hot rods at a local hangout (often a drive-in restaurant), choose adversaries and make bets, and then head out for “Railroad Avenue” or “Arrow Highway,” wherever there is a straight stretch of two-lane blacktop with a minimum of ordinary traffic. Or, in even less structured and more dangerous circumstances, a challenge would issue somewhere right in the center of town, and two cars would “drag it out” from a stoplight. Such unlawful competition still thrives in and around big cities from New York to San Diego and is to a great extent dominated by young Hispanics, Asian Americans, and African Americans. Street racers are not all boys, far from it, but they still are overwhelmingly male. What keeps that so? Why has fascination with technology, power, and speed
The drag-racing scene from American Graffiti, 1973 (Photofest)
consistently been associated with masculinity in American history? Historians of technology are beginning to emphasize the social construction of gender ideologies in American culture, which have attributed such characteristics as courage and activity to men, and others such as nurturing and passivity to women. Judith A. McGaw (1987) argues that these gender definitions came to be viewed as natural during the historical change that accompanied the Industrial Revolution in the nineteenth century. These gender stereotypes shaped attitudes toward technology as the new machines of the in-
dustrial age were developed. Because American society identified the control and power they endowed with the kind of courage and activity associated with males, operation of dangerous and powerful machines was assumed to be the province of men. Pursuing this idea further, Cathy Curtis emphasizes the innate conservatism of the car culture. “Its belief in exacting, precision work, a pantheon of heroes, macho contests of speed and strength, and the primacy of representational art,” she writes, fits a traditional gender stereotype in which young women are viewed “as decorative accou-
terments” (quoted in Post 1998, 121). In her study of another group of technological enthusiasts, ham radio operators, Susan Douglas has defined what she considers “key elements of twentieth-century masculinity—the insistence on mastering technology, the refusal to defer to the expertise of others, the invention of oneself by designing machines” (Douglas 1999, 328). Another side of this discussion was initiated and summarized by Ruth Schwartz Cowan in the 1970s when she challenged other historians of technology with the question, “Was the female experience of technological change significantly different from the male experience?” (quoted in Post 2001, 259). When the first commercial drag-racing operation was established in 1950 at an airfield in Santa Ana, California, the racers who gathered there—and at similar places all across the country—were nearly all youthful males, the median age being around twenty. Very few thought that the girls who came along, when they came at all, would play any active role. Their existence was entirely circumscribed by the boys with whom they associated, and if they participated in any way, it was usually restricted to domestic chores like polishing chrome and fetching refreshments. The classic Hollywood epic of hot rodding, American Graffiti (1973), perpetuates this traditional view. When the film climaxes with a drag race between two disaffected boys, a girl passenger is depicted as merely hanging on for dear life. Females who sought to obtain competitive licenses to drive fuelers (supercharged, needle-nosed dragsters burning nitromethane) were denied permission on the grounds that they lacked “the stamina and strength.” Wally Parks, head of the National Hot Rod Associa-
tion, the major sanctioning body for organized drag racing, argued that this was a “protective” rather than a “discriminatory” measure. “It isn't that we don't like the ladies,” he said, “we simply want to see them around for a long time to come” (Post 2001, 264). The prejudice that females would never show an aptitude for anything “mechanical” died slowly, much more slowly than the notion that they were incapable of handling high-powered machinery. In terms of driving, there was a sliver of fluidity in gender roles even at the outset. For example, posters for B-movie chestnuts in the 1950s, such as Hot Rod Girl (1956), show a pair of cut-down roadsters racing toward big trouble. Sometimes the occupants are all boys, and sometimes one car has a boy driving and a girl at his side. But sometimes a girl is depicted in the active and dangerous role of driving the car. Shirley Muldowney became a champion driver in the 1970s, and a film based on her career a decade later, Heart Like a Wheel (1983), portrayed her as a woman with all the right stuff. “I think the difference between me and the other guys,” she once remarked, “is that a lot of them really don't have that kick-ass attitude” (Post 2001, 268). In 1998, a nineteen-year-old college student, Cristin Powell, piloted a fueler with more than 6,000 horsepower from 0 to 309 miles an hour in 4.59 seconds. Although Powell was exceptional, it is clear that Shirley Muldowney had opened a door through which other females could pass by the 1990s. As drag racing has become commercialized, technical “experts” have become essential to success in such realms as aerodynamic finesse. In the twenty-first century, not only are women suc-
cessful as drivers, but they also excel in technical roles involving mechanical and electromechanical functions, as well as in the design of drag-racing machinery. Eloiza Garza, a Latina, is notable for designing and fabricating streamlined bodywork out of composite materials such as carbon fiber. And those involved in design innovation are no longer boys. When Don Garlits revolutionized the configuration of dragsters by establishing a new paradigm that put the engine behind the driver instead of in front, he was nearly forty years old. When he opened new vistas in the realm of aerodynamics, he was well past fifty. By the 1990s, in fact, the average age of the men and women driving fuelers in professional competition was almost fifty. Despite the progress women have made in drag racing, in the twenty-first century Americans still associate technology with masculinity. The newest technology, that of computers, is still largely a male domain. But drag racing persists as a boy’s activity only in the subculture in which racecourses are city streets and racing is still viewed in terms of “macho contests of speed and strength.” Although initially such contests were central to all forms of drag racing, by the 1960s and 1970s the activity was changing because of increasing commercialization and shifts in gender ideology in the culture at large. Some critics bemoan the loss of an exciting challenge they view with nostalgia. In his review of “Kustom Kulture,” an exhibit of hot rods and dragsters, Henry Allen writes, “As we were losing in Vietnam, feminism saw its opportunity to make guy things the living proof of stupidity, ugliness, and oppression” (quoted in Post 1998, 119). Yet in recent decades American gender ideologies have changed to the
point that many activities previously considered “guy things” have become the province of girls as well—even drag racing.
Robert C. Post
See also Cars
References and further reading
Batchelor, Dean. 1995. The American Hot Rod. Osceola, WI: Motorbooks International.
Cowan, Ruth Schwartz. 1979. “From Virginia Dare to Virginia Slims: Women and Technology in American Life.” Technology and Culture 20: 51–63.
Dobrin, Michael, and Philip E. Linhares. 1996. Hot Rods and Customs: The Men and Machines of California's Car Culture. Oakland, CA: Oakland Museum of California.
Douglas, Susan. 1999. Listening In: Radio and the American Imagination. New York: Times Books.
Garlits, Don. 1990. The Autobiography of “Big Daddy” Don Garlits. Ocala, FL: Museum of Drag Racing.
Hawley, Frank, with Mark Smith. 1989. Drag Racing: Drive to Win. Osceola, WI: Motorbooks International.
Kerr, Leah M. 2000. Driving Me Wild: Nitro-Powered Outlaw Culture. New York: Juno Books.
Martin, Chris. 1996. The Top Fuel Handbook. Wichita, KS: Beacon Publishing.
McGaw, Judith A. 1982. “Women and the History of American Technology.” Signs: Journal of Women in Culture and Society 7: 798–828.
———. 1987. Most Wonderful Machine: Mechanization and Social Change in Berkshire Paper Making, 1801–1885. Princeton, NJ: Princeton University Press.
Moorhouse, H. F. 1991. Driving Ambitions: A Social Analysis of the American Hot Rod Enthusiasm. Manchester: Manchester University Press.
Parks, Wally. 1966. Drag Racing, Yesterday and Today. New York: Trident Press.
Post, Robert C. 1998. “Hot Rods and Customs: The Men and Machines of California's Car Culture, at the Oakland Museum of California.” Technology and Culture 39: 116–121.
———. 2001. High Performance: The Culture and Technology of Drag Racing, 1950–2000. Baltimore, MD: Johns Hopkins University Press.
Scharff, Virginia. 1991. Taking the Wheel: Women and the Coming of the Motor Age. New York: Free Press.
Viken, James P. 1978. “The Sport of Drag Racing and the Search for Satisfaction, Meaning, and Self.” Ph.D. diss., University of Minnesota.
Drinking
See Smoking and Drinking
Drugs
See Illegal Substances
Du Bois, W. E. B.
See Washington, Booker T., and W. E. B. Du Bois
E

Early Republic
The revolutions in America and Europe that closed the eighteenth century prompted many people to think about change in a fundamentally different way. Novelty and innovation acquired a good reputation. Some began to speak pejoratively of “the dead hand of the past” and optimistically of a “dawning age.” Young people acquired a new importance as being those especially well positioned to cast off the habits that had blocked their elders' path to improvement. For the generation of Americans who were born after 1776, boyhood was caught up in the tempo of nation building. The impulse to seize new personal opportunities became inextricably tied up with demonstrating the superiority of America's republican institutions. In addition to these challenges, independence opened up a whole new array of economic opportunities ranging from new callings to forays into trade and manufacturing to moving onto new lands. Before boys were able to engage in these activities, the harshness of life in the early nineteenth century took its toll. Large families were the norm, leaving little family time to indulge the infants in their midst. Since adolescence had yet to be conceived of as a special stage of life, girls and boys typically went from being a child to an adult with little fanfare. Physical maturity signaled adult status.
With eight children the statistical norm, families often grew to a dozen, though their sizes varied greatly (Thompson 1949, 62–69). New infectious diseases could strike three or four family members in one fatal swoop. Not only did death figure in the emotional economy of children, but its frequency prompted mothers and fathers to push their children toward maturity. Religious parents pressed for a recognition of their state of sin. At least half of all children had lost a parent by the time they reached age twenty-one. Such experiences fueled the religious revivals that swept through the country in the early decades of the century, the stern Calvinist message about damnation giving way to that of the Methodists and some Baptists, who counseled tenderness toward children. With 85 percent of the population still involved in farming, boyhoods were spent mastering the skills of plowing, hoeing, cradling, reaping, mowing, and fence mending that made up the rounds of rural labor. What was new in the early republic were the alternatives for teenage boys. They could become schoolteachers, lawyers, preachers, clerks, retailers, writers, newspaper reporters, or artisans, occupations that multiplied as the country moved from colonial dependency to national independence. Those who seized these opportunities rarely expressed regret for having left behind the unremitting toil
Early nineteenth-century boys and men rest after field work. William Sidney Mount, Farmer’s Nooning, 1836 (Museum at Stony Brook)
that began when one grew “large enough to handle a hoe or a bundle of rye” (Appleby 2000, 126). The economic and political opportunities opening up to ordinary boys acted as a magnet with a sieve in front of it, drawing in those who had the literary and arithmetic skills that were still relatively rare. Schooling remained primitive in the early republican era. Most boys attended school long enough to learn to read, write, and do simple computations, abilities they could acquire in four or five years of three-month stints at country school. Those who prospered in the new
professions demonstrated an early love of reading, writing, calculating, and speaking in public. College attendance demanded a good knowledge of Latin, a requirement that people were beginning to criticize. In fact, literacy played a dynamic role in moving boys from the rural to the urban United States, not just because city jobs required schooling but also because teaching in a country school offered employment off the farm. A full third of the boys growing to maturity in the first decades of the nineteenth century moved with their families to the areas opening up in western parts
of the thirteen original states and beyond. Traveling in wagons, often festooned with ribbons and kitchen utensils, boys might be called upon to attend younger children, take care of the livestock, or play scout during the long treks. Foreign visitors marveled at the endless miles of wagons snaking their way through the roads leading away from the Atlantic coast. They were also struck by the independent spirit parents cultivated in their children, encouraging them from infancy to do things for themselves. Childhood for boys on the frontier contained the excitement and dangers that later generations could only capture in play at cowboys and Indians. They watched the pine brakes and forests teeming with game yield to the plough, ax, and managed fires of frontier farmers. Carving out a farm on newly cultivated land required all the hands in the family, but the fertility of the soil yielded ample subsistence for them. A surprising number of northern boys roamed the country as peddlers, earning money by carrying the products of northern industry into the southern states. The spectacle of teenagers making their way on their own surprised their elders. With savings from peddling, some boys financed further education, either in academies or by studying under an established doctor or lawyer. Others stuck to a trade or went from peddling to retailing. Clerks were in strong demand in business, the professions, and the stores that appeared in cities, towns, villages, and hamlets across the land. Boys with a knack for writing could find work in the hundreds of newspapers that sprang up throughout the North and West like daisies in a June meadow. The young men who joined the mobile bands of restless youth ferreting out op-
portunities placed great importance upon their early love of books. Levi Beardsley, who later became president of the New York State Senate, remembered gratefully that four or five families in his frontier home had established a small-town library, augmenting his father’s two volumes of John Dryden’s poems and James Bruce’s Travels to Discover the Source of the Nile (1790). John Belton O’Neall carried into old age the recollection of the avidity with which he read his first book— Pilgrim’s Progress (1678)—taken from the library society organized in his hometown of Newberry, South Carolina. Having received one of his father’s two shares in a nearby township library in rural Massachusetts, Amos Kendall read every book in the collection within a few years. George Gilmer bought “Hume’s History and Ossian’s poems” with the wages that his father paid him for picking cotton. And in William Rudolph Smith’s school, each week the children had to read aloud from the newspaper’s “domestic occurrences and foreign news” (Appleby 2000, 125). Intelligence, an aptitude for learning, an early gift for reading, a yearning for more schooling, an eagerness to be on one’s own—these were the notes that orchestrated boys’ moves from home. “Probably in no enlightened country on the globe,” Harriet Cooke observed, “are children more anxious to be esteemed, or earlier permitted to become men and women than in our own; it has been with much truth remarked, that in the United States there is no such period as youth; we jumped at once from childhood to fancied maturity.” James Durand, who joined the navy at age eighteen, claimed that he had been punished by ensigns no older than twelve. Charles Trowbridge, one of Michigan’s pioneers, looked back with wonder at the responsibility laid on
young boys in his youth. At fourteen years of age, he was sent 140 miles “to procure the discount of a note for $4,000,” and when the note matured, he traveled over the same road again with funds to meet it. James Riley, whose narrated adventure became a best-seller, described his transition to manhood at age fourteen. Being “tall, stout, and athletic for my age; and having become tired of hard work on the land,” he wrote, “I concluded that the best way to get rid of it, was to go to sea and visit foreign countries.” His parents’ opposition gave way before his determination, and within five years he had passed through the grades of cabin boy, cook, ordinary seaman, seaman, second mate, and chief mate (Appleby 2000, 121–122). Although no such opportunities opened up to enslaved boys, their lives were affected by the new national celebration of freedom and independence. Many were caught up in the religious revivals and often learned to read as a consequence. When articulate black men and women began to agitate for the end of slavery, southern legislatures clamped down on those who started schools for slaves or taught them individually, but with mixed success. Hundreds of slaves—the preponderance of them young—liberated themselves, taking advantage of an expanding urban free black population to hide from their pursuers. For the first time, African Americans could establish churches, schools, voluntary clubs, and self-help organizations, but these improvements were often marred by the white hostility toward blacks that prevailed everywhere, except among the few ardent antislavery advocates. The range and sweep of youthful enterprise in the United States suggest the widespread willingness of American boys
to be uprooted, to embark on uncharted courses of action, and to take risks with their resources—above all the resource of their youth. Leaving home at the onset of manhood was more than an economic decision; it pulled on youthful longings for independence and autonomy while creating anxiety and guilt about separation. Those who did leave turned themselves into agents of change in a society that characterized itself as uniquely open to change. Over time, innovations and novelties in the United States dealt a mighty blow to traditional assumptions about social order. The Revolutionary War bequeathed to American youth permission to be bold, energetic, and ambitious.
Joyce Appleby
See also Books and Reading, 1600s and 1700s; Frontier Boyhood; Jobs in the Nineteenth Century; Preachers in the Early Republic; Schools, Public; Slavery
References and further reading
Appleby, Joyce. 2000. Inheriting the Revolution. Cambridge: Harvard University Press.
Cashin, Joan. 1991. A Family Venture: Men and Women on the Southern Frontier. New York: Oxford University Press.
Cayton, Andrew R. L. 1993. “The Early National Period.” Vol. 1, p. 100 in Encyclopedia of American Social History. Edited by Mary Kupiec Cayton, Elliott J. Gorn, and Peter W. Williams. New York: Scribner's.
Gilmore, William J. 1989. Reading Becomes a Necessity of Life: Material and Cultural Life in Rural New England, 1780–1835. Knoxville: University of Tennessee Press.
Heyrman, Christine. 1997. Southern Cross: The Beginnings of the Bible Belt. Chapel Hill: University of North Carolina Press.
Kett, Joseph F. 1977. Rites of Passage: Adolescence in America, 1790 to the Present. New York: Basic Books.
Mattingly, Paul H., and Edward W. Stevens, Jr. 1987. “Schools and the Means of Education Shall Forever Be Encouraged”: A History of Education in the Old Northwest, 1787–1880. Athens: University of Georgia Press.
Stevenson, Brenda E. 1996. Life in Black and White: Family and Community in the Slave South. New York: Oxford University Press.
Thompson, Warren S. 1949. “The Demographic Revolution in the United States.” Annals of the American Academy of Political and Social Science, no. 262.
Education
See Military Schools; Schoolbooks; Schools for Boys; Schools, Public
Emerson, Ralph Waldo
Ralph Waldo Emerson (1803–1882) influenced American cultural attitudes toward boyhood by idealizing self-reliance, power, and closeness to nature as attributes of masculine youth. In the name of such values, the Transcendentalist movement that Emerson led has often been perceived as America's first youth rebellion. “The ancient manners were giving way,” he himself wrote of the years after 1820. Instead of being repressed, “the young men were born with knives in their brain” (R. W. Emerson 1903–1904, 10: 326, 329). Emerson himself experienced no easy path to individualistic power in childhood or adolescence. Indeed, his celebration of youth compensated for personal experience begun in the old repression, then darkened by the loss of his father and the financial distress of his widowed mother. But the Emersons' situation as a New England clerical family included positive resources as well as constraint and tragedy. His elders valued education and managed to obtain privileged schooling for the boy, and in particular his aunt Mary Moody Emerson
formed his character by powerful example and counsel from his childhood through his adult years. The individualism that Emerson would eventually claim for boys had crucial origins in his female-led family. He was born to William and Ruth Haskins Emerson, minister and wife of the socially prominent First Church Boston. The early deaths of siblings reduced their eight children to five boys, of whom Ralph was the second. William pursued an ambitious agenda for these sons in keeping with his religious rationalism and Federalist political vision. Where his own Calvinist ancestors had believed in breaking the will of children to prepare their souls for salvation, William emphasized intellectual prowess and moral self-control in preparation for leadership in the world. The resulting discipline still felt harsh to the boys whose manners, literacy, and health he took in hand. As an adult Ralph recollected the “mortal terror” his father had inspired by calling him to cold salt baths in Boston harbor, from which he hid like Adam from God's judgment in the garden (R. W. Emerson 1939, 4: 179). But with the decline and death of his father when Ralph was eight, severity receded into mere blankness of memory, as mother Ruth and Aunt Mary supervised the children without recalling his father's words or ideas. Emerson's later antipatriarchal stance as a writer would assume the fathers' irrelevance more than the need for confrontation between generations. Following the death of William Emerson, Ralph's youth unfolded on the financially precarious edges of Boston gentility. His mother kept a boardinghouse at shifting addresses, relied on her sister-in-law Mary's intermittent help as representative of the Emerson family, and turned
to church and school authorities for support in educating her sons. Ralph experienced the worst and best of Boston Latin School’s traditional education, when the alcoholic and cane-wielding master was replaced by one who valued literary excellence. The eleven-year-old boy’s first essays were composed under the latter’s supervision. Ralph also stole hours from school for private adventures amid Boston’s wharves and commons. At least one early acquaintance, however, recalled that he rarely played with other children, let alone encountered the rough boys who dominated these public spaces. He felt little social confidence. Embarrassment accompanied the family’s poverty: Ralph and his brother shared one coat and were taunted by schoolmates, “Whose turn is it to wear the coat today?” (Cabot 1887, 1: 29). In addition, the family’s stoic response to loss allowed for little demonstration of feeling. Once when the brothers came home late, Ruth exclaimed that she had been “in agony” for them, and Ralph “went to bed in bliss at the interest she showed” (Cabot 1887, 1: 35). He seems to have felt significantly unnurtured otherwise and in need of a strong male model. A great deal, however, was going on beneath the boy’s quiet surface. His solitude among peers never became absolute because of closeness within the band of Emerson brothers. They shared books and intrafamily jokes, with Ralph at an early age composing comic verse and word games to their delight. They supported each other’s accomplishments; but rather than grimly urging competition, as his elder and younger brothers sometimes did, Ralph excelled in humor and optimism. In this his chief mentor was Mary Moody Emerson, who had overcome her own early childhood adver-
sity with a witty tongue and effrontery of convention. This unmarried woman attempted to balance the demands of her extended family with a preference for solitude on a farm in the mountains of Maine, and her willful bolting from Ruth’s Boston house offered Ruth’s sons an unsought example of self-reliance. Even more, however, Mary continued to guide her nephews from afar by writing letters and requiring responses. Long before Ralph studied writing at Boston Latin, he was composing letters to his aunt. He had his most demanding but also most affectionate parent in this female “stepfather.” Mary did not hesitate to declare her love for the boys, in one letter greeting Ralph as a “dear play Mate.” She even succeeded in renaming him, preferring his ancestral middle name “Waldo” in letters as early as 1812 (M. M. Emerson 1993, 65, 136). As the boy reached adolescence he adopted the signature, permanently naming his mature self as she had suggested. Most of all, Mary raised Waldo (as this entry will proceed to call him) in a religious and intellectual intensity that would shape his adult vocation. She led the family’s devotional life with prayers and stories of their ancestors’ heroic faith that would echo in his later memory. Though standing in place of her deceased ministerial brother, Mary was strongly modifying his message. Like him valuing public leadership as a goal for her nephews, she urged it only through prophetic knowledge of God. Brother and sister had both been bookish, but in her reading Mary found religious enthusiasm and early romantic vision, instead of William’s rationalism. She expressed a philosophy of childrearing antithetical to her brother’s, hoping to unfold the child’s powers by allowing maximum freedom,
exposing him to nature as well as books, and asking for his own account of learning. This confidence in childish intuition showed the influence of European romanticism on her thinking and anticipated the ideas of American educators like Bronson Alcott, Waldo's friend in the circle of Transcendentalists, a generation later. To Waldo himself, looking back from adulthood, the remembered picture of Aunt Mary and her boys seemed key to an “interior and spiritual history of New England,” a passage from old religion to new ideas (R. W. Emerson 1960–1978, 7: 446). Education at Harvard College began for Waldo at the age of fourteen. The youngest member of his class, he remained somewhat distant from fellow students by circumstance and temperament alike. President John Thornton Kirkland, an old friend of his father's, offered financial assistance by making the boy his freshman orderly, expecting him to run errands from a single room in the presidential house. During vacations Waldo earned family income by tutoring boys nearly his own age in an uncle's school. By sophomore year he had joined his classmates, both in the dormitory and at the “Rebellion Tree,” where they protested four classmates' suspension by Kirkland for leading a food riot. He subsequently joined a literary club and took part in several essay and poetry contests. Still he did not excel, either in an educational system valuing rote memory or in a social scene favoring the manners of privilege. Just a few academic courses won his praise, most notably when Edward Everett began a new era by introducing Harvard to the riches of German university scholarship. Waldo's most intense expression of friendship in college was a journal entry, privately recorded
but not pursued, on his homoerotic fascination as a senior with a new freshman. Keeping the journal that told of both intellectual life and fantasies was his truer means to growth in these years. Beginning it at seventeen was his most important act of 1820, the year from which he later dated a “Movement” of youths against their elders’ “Establishment.” More important than any Harvard class rebellion was the impulse of many individual, introspective young men with “knives in their brain” to express their thoughts in solitude. For Waldo this private reading and writing began a lifelong practice. Only later would he publish with confidence the thoughts initiated in his journal, declaring himself then to have been the “true philosopher” in college rather than his professors. “Yet what seemed then to me less probable?” (R. W. Emerson 1960–1978, 4: 292). In his independence from the educational establishment, more than ever he had a partner and guide in his self-educated aunt. Through the college years she wrote alternately joking and intellectually impassioned letters to him, stammering in mock humility at addressing a “son of Harvard,” urging him to avoid the beaten track of conformity and become a poet of divinity (M. M. Emerson 1993, 104). Her models of sublime solitude were inward-turning John Milton and William Wordsworth. She often recommended books that were not part of the Harvard curriculum, and he followed her leads as well as his own. His journal was begun in the Emersons’ parlor, with Mary present in the room and her own diary its most important model. Thereafter he often transcribed her letters in his journal as the words of “Tnamurya,” his mysterious-sounding anagram for “Aunt Mary.” Likewise, some of his
most important statements of principle and intention made during and after college came in letters to this same mentor. Indeed, her words readily seemed to be his own. Sentences from one of her letters went directly into a prize essay his senior year in college. Nor did the habit of appropriation end soon. As he began studying for the ministry, he begged her for more such letters of inspiration, and even as she resisted the supposed flattery of his imitation, she also obliged. He even echoed without attribution her early sentence on Miltonic solitude in his book The Conduct of Life, published twenty-five years later. Her influence on his youth was profound, never fully left behind. Another fifteen years of adulthood would intervene before Ralph Waldo Emerson emerged in his manifesto Nature (1836) with a call to transcendent vision that was also personal empowerment and healing of his uncertain boyhood. In it he mythologized infancy as a “perpetual Messiah” and urged adult readers to recover the eyes and hearts of childhood (R. W. Emerson 1903–1904, 1: 71). “Self-Reliance,” probably his most influential essay, presented the masculine entitlement of such innocence more directly. “The nonchalance of boys who are sure of a dinner . . . ” he wrote, “is the healthy attitude of human nature” (R. W. Emerson 1903–1904, 2: 48). Such rhetoric, as recent feminist critics have insisted, reduces women to mere nurturance and apparently excludes them from power. He was also suppressing a personally vital female source, his own aunt, in making these pronouncements. Yet he also struggled throughout life to credit the genius of Mary Moody Emerson, finally presenting a lecture of excerpts from her private writing. Whether girls
can also become self-reliant has been an ongoing argument in post-Emersonian American culture, one whose contentious course he charted but could not control thereafter. Many generations of interpreters have recognized Ralph Waldo Emerson as a representative and inspirer of the younger generation. For those recoiling from perceived excesses of the 1960s, he became an easy target in his offering of power to youth. “Emerson's views are those of a brazen adolescent,” President A. Bartlett Giamatti proclaimed to Yale seniors in 1981, “and we ought to be rid of them” (Giamatti 1981, 177). Emerson would have been pleased at this association with youthful arrogance. At the height of his influence in 1839, he recalled the question his uncle had once asked: “How is it, Ralph, that all the boys dislike & quarrel with you, whilst the grown people are fond of you?” From his adult role of unsettling the establishment he could reflect on the change since those days with ironic satisfaction: “Now . . . the old people suspect and dislike me, & the young love me” (R. W. Emerson 1960–1978, 7: 253).
Phyllis Cole
References and further reading
Allen, Gay Wilson. 1981. Waldo Emerson: A Biography. New York: Viking.
Barish, Evelyn. 1989. Emerson: The Roots of Prophecy. Princeton: Princeton University Press.
Cabot, James Elliot. 1887. A Memoir of Ralph Waldo Emerson. 2 vols. Boston: Houghton Mifflin.
Cayton, Mary Kupiec. 1989. Emerson's Emergence. Chapel Hill: University of North Carolina Press.
Cole, Phyllis. 1998. Mary Moody Emerson and the Origins of Transcendentalism: A Family History. New York: Oxford University Press.
Emerson, Mary Moody. 1993. The Selected Letters of Mary Moody Emerson. Edited by Nancy Craig Simmons. Athens: University of Georgia Press.
Emerson, Ralph Waldo. 1903–1904. The Complete Works. 12 vols. Edited by Edward W. Emerson. Boston: Houghton Mifflin.
———. 1939. Letters of Ralph Waldo Emerson. 6 vols. Edited by Ralph L. Rusk. New York: Columbia University Press.
———. 1960–1978. Journals and Miscellaneous Notebooks of Ralph Waldo Emerson. Edited by William H. Gilman et al. Cambridge: Harvard University Press.
Giamatti, A. Bartlett. 1981. “Power, Politics, and a Sense of History.” Pp. 166–179 in The University and the Public Interest. New York: Atheneum.
Leverenz, David. 1989. Manhood and the American Renaissance. Ithaca: Cornell University Press.
McAleer, John. 1984. Ralph Waldo Emerson: Days of Encounter. Boston: Little, Brown.
Emotions
Boyhood serves as a training ground for many of the rules that govern emotional life. Some of these rules are common to childhood generally, but others are specific to boys and serve to differentiate them from girls. Thus in many societies, boys (depending somewhat on social class) are encouraged to develop emotional reactions that might be appropriate for military life, for example, in cultivating the basis for courage. They may also, however, be urged to emphasize emotions supportive of humility and obedience that are similar to those recommended for girls, when these emotions are viewed as vital to family life and to religion. Historians in recent decades have been emphasizing both the variability and the changeability of emotional rules and in some cases have applied their interests to studies of boyhood. In this view, boyhood
may change significantly when emotional guidelines shift. Crying is a case in point. When Romantic literature was in full flower in the late eighteenth century, many adolescent boys and young men found it extremely fashionable to cry. Johann Wolfgang von Goethe’s Sorrows of Young Werther (1774) both produced and reflected an orgy of male tears. But by the 1820s, crying and manhood began to be seen as antithetical, and part of the process of being a boy and of raising boys involved as early and full a dissociation from tears as possible. Emotional rules that boys learn not only change but also vary from one group to the next. Slave boys, even ones who played with the offspring of plantation masters, had to learn habits of emotional deference in the antebellum decades so that they would be prepared for the discipline of adulthood. Their white counterparts could indulge more freely in emotions like anger. But variation does not depend on sharp racial or social class divides alone. Philip Greven (1977) has studied three categories of emotional culture in colonial America, again with significant implications for the emotional experience of boyhood and its preparation for manhood later on. Evangelical Christians, disproportionately located on the frontier, emphasized strict restraint of emotion punctuated with rigorous discipline often applied particularly to boys. The result was a fascinating and in many ways dangerous combination of docility and passion. Moderates also disciplined boyish emotion and actually imposed even stricter rules about bodily control (for example, with regard to posture), but they also emphasized affection. This emotional style gained ground in the later eighteenth century, particularly in the urban middle class. Finally, southern
gentry were more indulgent, particularly toward boys and young men, producing a more expressive and certainly more self-forgiving emotionality. Research on emotional changes and variations is still in its early stages. We know too little about Catholic emotional rules as they applied to the immigrant experience of nineteenth- and early-twentieth-century American boys, for example. Some Catholic parents used fear to impose religious discipline on children in general until about the 1950s, but how this played out in boyhood requires further inquiry. Two periods have been studied, however. Focusing on the urban middle class, historians have identified a vivid set of emotional rules for boys in the nineteenth century, rules that served as part of the strong emphasis on gender identities at every stage of life. These rules began to change in the early twentieth century, setting up a different, in some ways more complex, framework for emotional boyhood. Along with general injunctions about obedience, nineteenth-century childrearing literature emphasized two distinctive emotional regimes for boys. The first involved courage. Literature directed at boys and advice to parents in dealing with boys highlighted the experience of courage. Boys were expected to think about dealing with fear and overcoming it. Most of the popular literature aimed at boys by tireless authors such as “Oliver Optic” (William Taylor Adams) presented imaginative scenes in which courage would be called forth. In these stories, boys mastered their fear to save their sisters or other loved ones. The courage of soldiers was also a very popular theme during and after the Civil War. A 1904 childrearing manual made the point for boys directly: “When you feel
fear, master it; for courage is a virtue” (Birney 1904, 96). The second distinctive emotional focus involved anger. Boys were taught, at least according to widely purchased parent manuals, that anger was inappropriate in the context of the family. Boys must not be angry at parents or siblings. But they must not lose the capacity for anger, either; stories, again, presented scenes in which boys used their anger to attack bullies or to rail against social injustice. G. Stanley Hall, the expert on adolescence, was one of many adult authorities who argued that, properly directed, anger for boys and men was “a splendid force.” Boys’ emotional rules for anger and fear were also quite different from those urged on girls. Girls could show fear; they were held to be weaker than boys and their fear was expected, even up to a point charming (for it gave boys a spur to gallantry). But ladylike behavior did not involve anger, and here the rules were much stricter, though less complex, than those for boys. Emotional rules preached to parents and by adults to children through approved reading do not, of course, necessarily create an accepted emotional culture. Boys (far more than girls) often escaped adult control in the nineteenth century, particularly in the smaller towns and countryside where most people still lived. But although what one historian has called “boy culture” departed from many adult rules, it did replicate the recommendations on anger and courage. Boys constantly “dared” each other to take risks, and he who yielded to fear was derided. And they expressed anger as well by meeting challenges to fight. The distinctive rules for boys’ emotions, both as recommended by adults and as embedded in boy culture, showed
up in the emergence of the term sissy to describe boys who did not know how to get angry and who showed fear. Originally an endearing term for sister, by the 1880s sissy had become an epithet (and a source of parental concern about manliness) in the United States. Anger and fear were not the only components of emotional recommendations for boys, of course. Boys should also know love, an emotion initially directed at their mothers, quintessentially loving creatures in the nineteenth-century view of things. Later, in young adulthood, a man should be able to manifest a pure and intense love for a young woman. In between, there is substantial evidence that many young men also loved each other. Love, however, was a bit tricky for boys, partly because it was an emotion shared with girls, who by nature (according to the beliefs of the time) were better at it. Escape from mother love was part of boy culture, though few boys wanted to escape entirely. Where father fit in the family love equation was not clear, though in real life there were some obviously loving fathers. The nineteenth-century emotional culture developed for boys sought to prepare them emotionally for male adulthood. Courage and the competitive spur provided by appropriate anger could be vital in business or political life. Recognition of masculine emotional distinctiveness was part of the proper order of things throughout life, though somehow (and here was where real complexity entered in) it was to be combined with a strong family life based on love. Elements of this culture survived through the twentieth century. Although American boyhood began to be associated with athletics in the nineteenth century, the association steadily intensified
in the twentieth century. In athletics, boys were expected to master their fear and learn how to channel their anger into competitive success. By the twentieth century, the growing popularity of football emphasized the capacity for teamwork rather than the more individual achievements of sports like baseball and boxing. But in sports, key aspects of nineteenth-century emotional formulas persisted, including the differentiation from girls and from feminine emotions. On the whole, however, at least at the level of adult recommendations, nineteenth-century emotional culture began to change considerably by the 1920s, and boys were at the center of this change. A new surge of prescriptive literature took shape from the 1920s onward, indicating that parents were seeking different kinds of advice. Other adults who worked with boys, including teachers and scoutmasters, participated in the change as well. Boyhood was certainly newly constrained by the growing anxiety about homosexuality, an anxiety that had not provoked much comment in the nineteenth century. Twentieth-century beliefs that homosexuality was an all-or-nothing category and a vile one at that placed new limits on how boys could express themselves to each other and created new targets for parental anxiety about boys. Ironically, one result was to highlight sports as one of the only legitimate arenas for homosocial contact. Generally, however, boys’ feelings for each other were newly circumscribed. Additional changes in emotional rules tended to equate boys more with girls—so long as they did not manifest homosexual symptoms. Here, the clearest and earliest transition involved fear. Once a desirable emotional experience, by the 1920s fear became a dangerous emotion,
A young boy fears he has lost his parents in a mall. (Skjold Photographs)
and parents were urged not to test their sons by exposing them unnecessarily or by insisting on unrealistic amounts of courage. New psychological research indicated how many irrational fears both boys and girls could have and how damaging they could be. Articles proliferated in family magazines about what to do when children were afraid of the dark or of animals, and boy and girl examples mixed. Fathers who put their sons to harsh courage tests rather than sympathizing with their fears and carefully coaxing them were now subject to criticism. Jealousy was another emotion boys, along with girls, should now be protected from. Jealousy of other siblings had not been a concern in the nineteenth century,
but by the 1920s it received anguished comment. As with fear, there were three key changes involved: First, boys were now seen as vulnerable and flawed; there was no natural manly reservoir to be encouraged. Second, parental supervision of their emotions needed to increase, for their fears or jealousies might hurt themselves or others. And third, where these now-dangerous emotions were concerned, boys should not be treated differently from girls. There was no clear and distinct boyhood emotional model, just standards for children in general. The same approach applied to anger. It was now bad, period, and to the extent that boys were likelier to get angry than girls, they needed more controls. Boys had to learn to identify anger and talk it out, replacing it with feelings, as one popularizer put it, “more socially useful and personally comfortable” (Black 1946, 140). Here, then, was a new emotional regime for boys. It was not necessarily harder than the regime of the nineteenth century—the rules in some senses were less complicated, as in anger, or less rigorous, as with fear. Changes in the fear rules clearly increased emotional leeway; after the Vietnam War in particular, young men in the armed forces who expressed fear were treated with much more understanding than had been the case previously in the American military. But the emotional regime was new; it gave boys fewer clues about how to be different from girls; and it contradicted certain ongoing boyhood staples, like sports. The new regime resulted from several factors. First, psychology gained ground and with it some troubling discoveries about children and sometimes about boys in particular. It was psychologists who took the lead in relabeling child-
hood anger and calling it aggression. Many psychologists in the family and parenting field were women, and although they were not hostile to boys, they did not necessarily delight in highlighting masculinity. Second, adult roles were changing. Middle-class boys could now expect to grow up to be men who worked in corporate management or the service sector. It was vital to train them in what would soon be called “people skills”—getting along with and pleasing others. Learning to control or mask disruptive emotions like anger and jealousy was crucial here, and boys might have more to learn in this area than girls did. Assumptions about family were changing a bit, too. Fathers were being urged, by the 1920s, to take a somewhat more active role with children, which could encourage some new emotional experience as well. Obviously, this set of factors would intensify later in the twentieth century under the spur of feminist reexamination of gender roles and the fact that most women, as well as men, participated in the labor force. Here were additional reasons to encourage boys to think of girls not as emotional contrasts but as emotional equivalents. Finally, schooling was becoming more important, with middle-class boys expected to stay in the classroom through high school and, increasingly, into college as well. Disruptive emotions that might have passed in the nineteenth century, with less ubiquitous and less controlled schooling experiences, were no longer acceptable. Here, boys presented more obvious targets than girls, for they were more often classroom troublemakers. By the 1920s, indeed, a concern about unruly boys was developing into a medical category. This would undergo several name changes before it emerged, by the
1970s, as attention deficit disorder, a systematic emotional incapacity to accept school discipline, found disproportionately among boys. How well did this new emotional regime work for boys and the men they became? The question is easier to pose than to answer. Many boys accepted the new emotional discipline while also seeking recreational outlets that would provide some release—and some reassurance that the male was still quite different from the female. Boys’ interests in vicarious violence increased steadily, if only because there were more unsupervised outlets available, from the comic books that elicited great concern in the 1950s to the movies and video games of the century’s end. Although sissy dropped back as a term, other words still designated boys who did not seem emotionally manly. Correspondingly, worrying about boys’ emotional expressions almost certainly increased. Exploring historical perspectives on boys and emotions will surely yield greater insights in the future, for the field is very new. The relationship between adult standards for boys and boys’ own negotiations of those standards will always be difficult to pinpoint, but it deserves further attention. Comparison will be fruitful as well: there is little understanding of how American emotional history compares with that of other societies; here, again, boyhood would be a vital and rewarding focus. It is clear that changes in emotional standards and the motives behind them can play a significant role in the experience of boyhood and the relations between boys and adults.

Peter N. Stearns

See also Competition; Slavery; Teams
References and further reading
Birney, Alice. 1904. Childhood. New York.
Black, Irma S. 1946. Off to a Good Start. New York.
Chauncey, George, Martin Bauml Duberman, and Martha Vicinus, eds. 1989. Hidden from History: Reclaiming the Gay and Lesbian Past. New York: NAL Books.
Greven, Philip J., Jr. 1977. The Protestant Temperament: Patterns of Child-Rearing, Religious Experience and the Self in Early America. New York: Alfred A. Knopf.
Rotundo, E. Anthony. 1993. American Manhood: Transformations in Masculinity from the Revolution to the Modern Era. New York: Basic Books.
Stearns, Peter N. 1994. American Cool: Constructing a Twentieth-Century Emotional Style. New York: New York University Press.
Stearns, Peter N., and Jan Lewis, eds. 1998. An Emotional History of the United States. New York: New York University Press.
Employment
See Jobs in the Seventeenth and Eighteenth Centuries; Jobs in the Nineteenth Century; Jobs in the Twentieth Century
F

Farm Boys
Between the colonial period and 1925, the average American boy was a farm boy. During that period, the majority of Americans made their living from the soil and raised their children on farms. Being raised on a farm provided these boys with a far different experience than they would have had in the nation’s growing cities. Although nineteenth-century, urban, middle-class children increasingly turned their attention to school and to play, farm boys remained integral to their families’ economic survival well into the twentieth century. To be a farm boy was to be a worker, first and foremost. Beyond work, farm boys often achieved only a limited schooling and had to entertain themselves with the resources of the land surrounding them. In the twentieth century, the lives of farm boys have become much more like those of their urban counterparts, but some distinctions remain. An important issue to consider is the definition of farm boy. Clearly, residence upon a farm is required. However, the age parameters of boyhood are harder to define. Even as boys gained physical maturity and began to do a man’s work in the fields, their parents expected them to continue to labor on the family’s land and to be subject to the family’s needs, rules, and discipline. Their wages for off-farm employment continued to be their family’s due. Historically, farm boys remained so, in terms of their relationship to their parents, until they were married or living independently. Generally, this happened after their twenty-first birthday. All farm children began work at an early age, and in their first six or seven years, boys and girls often did the same tasks. They aided their mothers in the kitchen and helped with whatever small tasks were suitable to their age and abilities. These jobs often included gardening, gathering wood and carrying water, tidying the farmyard, gathering eggs, and harvesting wild foods such as greens, nuts, and berries. Parents were often hard-pressed to do all of the work necessary to manage their farms, and children provided a flexible labor force. Work also trained young children to be industrious, a trait that farm parents often valued as much as any other sort of learning. As boys reached their seventh or eighth year, it increasingly became their responsibility to aid their fathers in the fields. It was not uncommon for boys to be plowing, harrowing, and completing other significant farm tasks by the time they were ten or twelve years old. Boys planted and cultivated crops and aided in the harvest and threshing. They also cared for animals; herding in particular occupied many a boy’s time. Willing and able sons were one of a farmer’s most important as-
Farm boys in Georgia, 1938 (Library of Congress)
sets, since the amount of land he could farm and therefore the amount of money the family could earn were directly proportional to the amount of family labor that could be utilized in the fields. Throughout most of American history, farm families have earned incomes that were too meager to allow them to hire labor. Instead, families made use of the labor of their children in order to make a living sufficient to support their families and their farms. The work that boys did often resulted in the assumption of a great deal of responsibility at an early age. In the 1870s and 1880s, the Norton family lived in frontier Kansas. One of the main components of their livelihood was a herd of cattle. The family could not afford fencing, so the Norton boys herded the animals. Herding sometimes required the
boys to spend long days and even nights alone on the prairie with the animals. Their parents trusted twelve- and thirteen-year-old boys to care for themselves and the family’s highly valued cattle without adult supervision. Although this would be somewhat unusual by the standards of the current day, the Nortons’ work arrangements and expectations of their sons were not at all unusual in the nineteenth and early twentieth centuries (Norton Family 1876–1895). The degree to which a farm boy received an adequate education was dependent upon many factors. Poor families often could not spare their sons from work. Other families in frontier communities often desperately needed labor. Some parents simply believed that all their sons needed in order to succeed as farmers was the most basic of educa-
tions: a little reading, writing, and simple arithmetic. In the late nineteenth century, as high school education became more and more common for middle-class urban youth, farm boys continued to be lucky to have an eighth grade education. Boys often began school at six or seven but fell behind in their studies because their parents needed them at home in the spring and fall to help with planting and harvesting. Although boys aged twelve and over often attended the winter term, they generally had poor attendance at all other times of the year. During those winter terms, boys could become a disruptive element in country schools, challenging the authority of teachers and intimidating younger students. By the turn of the century, educational reformers were discussing the “big boy problem” in rural schools and wondering what to do with youngsters who came to school only to cause trouble. As a result, many school districts that normally employed women employed men as teachers during the winter term, hoping that they would be able to keep the big boys in order. Many boys became discouraged and left school, or their parents decided that their efforts were best applied in the fields. By the late nineteenth century, it was far more common for a farm girl to have a high school education than a farm boy. Farm boys might, however, consider the training that they received on their parents’ farms the most important part of their educations. Henry Taylor, who grew up in Iowa following the Civil War, believed in later life that the time he spent working with his father was the best and most important education of his childhood. Beginning when Henry was two, his father took him to the fields and the barn to teach him the fine art of farming. He learned by observation and by trial
and error, imitating his father’s actions. Farm boys generally had regular and close interaction with their fathers, a situation that in the best of circumstances allowed youngsters opportunities for closeness and intergenerational understanding that other children, not working with their parents on a regular basis, may not have had. Life for farm boys was not all work and no play. While they were pupils and even sometimes afterward, they enjoyed the entertainment provided by rural schools. Schools sponsored evening programs such as spelling bees, literary programs, box dinners, and other events. Boys regularly participated in these activities. Although home was a site of work, it was also a place for play. Few farm children had many toys, and most of them would have been homemade: sleds, tops, kites, and other items that could be fashioned by parents and children from wood, cloth, and used materials. Lacking toys and games, boys often made use of the possibilities in their rural surroundings. The farm environment offered boys numerous opportunities to hunt, fish, and take part in the adventures presented by open spaces. In the winter, they sledded, skated, and trapped. Some of these leisure activities, such as hunting, fishing, and trapping, served to feed their families and provide them with pocket money. Animals on farms also offered youngsters the chance to play and forge bonds with other living beings. Many boys played cowboy while riding horses, cows, and even pigs. For many boys, a horse was their best friend and companion in their adventures, and work done while on horseback was often considered play. Dogs were also valued friends. In memoirs recounting life on the nation’s farms, it is sometimes difficult to know
where a boy’s play left off and his work began. Most families expected that their sons would remain at home until they were twenty-one or married. During those years, sons continued to contribute their labor and often their outside earnings to the family coffers. Many farm boys worked at least occasionally as hired laborers for other farmers, or worked seasonally as threshers or harvest hands. Some fathers allowed their sons to keep their outside wages and even paid their sons for working on the family farm. This, however, was unusual. Most boys would have been expected to contribute their entire income to the family enterprise until it was time to establish their own households. Farm boys were generally the inheritors of their parents’ farms. Although girls sometimes inherited land, parents overwhelmingly awarded daughters household goods as their inheritance. Families willed their land to sons, often to a single son. In established communities where land was both scarce and expensive, families often tilled acreages that were too small to be divided. In much of New England, family custom was for the father to deed his farm to his youngest son when the time came for him to retire. That son would then be responsible for providing money and household goods, in lieu of land, to his siblings and for caring for his parents as long as they lived. Farm sons who would not inherit had several options: they could attempt to marry into or buy land in the same community, go to town, or migrate westward in search of less expensive lands. Many farm boys moved westward in the eighteenth, nineteenth, and early twentieth centuries looking for opportunity and new lands to farm. Others moved into the rapidly growing towns and cities. The world wars
of the twentieth century took even more boys away from the farms and encouraged them to think of pursuing lives in urban areas. Following World War I, a common question in regard to farm boys was, “How do you keep them down on the farm, after they’ve seen ‘Paree’?” Following World War II, the lives of farm boys in the United States became increasingly like those of their urban counterparts. Changes in agricultural technology allowed farmers to complete more of their tasks on their own, without the extensive aid from children required in earlier eras. The spring plowing could be done using tractor power, and tasks such as corn planting and picking, which had once been tedious chores requiring large amounts of hand labor, could now be accomplished by one or two people operating a machine. Additionally, compulsory education had become more firmly established and enforced, and boys more regularly graduated from high school and even college. With the spread of radio and then television advertising as well as greater prosperity following the war, farm boys began to enjoy the same toys and leisure activities as those in urban areas. This does not mean, however, that farm boys’ lives had become identical to those of urban youngsters. Among small ethnic and religious groups, such as the Amish and Hutterites, farm boys continue to live their lives with essentially the same responsibilities that their fathers and grandfathers had as children. Although only a few farm boys live lives so untouched by modernity, they continue to play an important part in their families’ economic enterprises, working on weekends and holidays in the fields and barns of the nation’s farms. Their play continues to take advantage of the countryside and all its possibilities. Per-
haps the most important way in which they are different is that today, instead of being the majority, they are part of a tiny minority. Although nearly 100 percent of the nation’s boys were farm boys at the time of the American Revolution, today only about 2 percent have that distinction (Danbom 1995, 266).

Pamela Riney-Kehrberg

See also Cowboys; Frontier Boyhood; Jobs in the Nineteenth Century

References and further reading
Bliven, Bruce. 1968. “A Prairie Boyhood.” The Palimpsest 49, no. 8: 308–352.
Danbom, David B. 1995. Born in the Country: A History of Rural America. Baltimore: Johns Hopkins University Press.
Drury, Clifford Merrill. 1974. “Growing Up on an Iowa Farm, 1897–1915.” Annals of Iowa 42, no. 3: 161–197.
Florey, Francesca A., and Avery M. Guest. 1988. “Coming of Age among U.S. Farm Boys in the Late 1800s: Occupational and Residential Choices.” Journal of Family History 13, no. 2: 233–249.
Garland, Hamlin. 1899. Boy Life on the Prairie. New York: Macmillan.
Haywood, C. Robert, and Sandra Jarvis. 1992. A Funnie Place, No Fences: Teenagers’ Views of Kansas, 1867–1900. Lawrence: Division of Continuing Education, University of Kansas.
Landale, Nancy S. 1989. “Opportunity, Movement, and Marriage: U.S. Farm Sons at the Turn of the Century.” Journal of Family History 14, no. 4: 365–386.
Norton Family. Norton Diaries, 1876–1895. Copied and annotated by Helen Norton Starr. Manuscripts Division, Kansas State Historical Society, Topeka, Kansas.
Taylor, Henry C. 1970. Tarpleywick: A Century of Iowa Farming. Ames: Iowa State University Press.
West, Elliott. 1989. Growing Up with the Country: Childhood on the Far Western Frontier. Albuquerque: University of New Mexico Press.
Wilder, Laura Ingalls. 1961. Farmer Boy. New York: HarperCollins.
Fathers

Fatherhood is a “hot topic” these days. Among the most compelling reasons for this interest is the scientific evidence that fathers have a substantial impact on their children’s social and intellectual well-being (Lamb 1997). The father-son relationship is sometimes regarded as something special among family relationships. There is now a substantial body of scientific evidence that shows this to be true; the effects of father involvement are often found to be stronger for boys than for girls (see Parke 1996). Also important is the evidence that being a father has considerable influence on men’s well-being (Pruett 2000; Snarey 1993). However, people’s ideas about the traits of an “ideal father” are probably more diverse now than ever, and as a result, the social scripts for fathers are less clear now than in the past. Changes in fatherhood have occurred along with widespread changes in the structure of American families. An American boy is now just as likely to come from a single-parent family or stepfamily as he is to grow up in a family with two parents who have always been married to each other. Divorce and remarriage are more common now than in the past, and most mothers are in the paid labor force. Just as these changes have had large impacts on fathers and their relationships with their sons, so have many social and cultural trends in the past. Although expectations for fathers have changed a great deal during 400 years of American history, common elements still emerge over the years. The rich diversity of fathering experiences throughout the history of the United States is too complex to depict; we are limited to describing general trends. The first European settlers
A young father cradles his newborn son. (Jennie Woodcock; Reflections Photo Library/Corbis)
brought to the new world their own form of family, which was predominantly a nuclear family with two parents and their children (Doherty 1998; LaRossa 1997). Elizabeth Pleck and Joseph Pleck (1997) describe how this was not the typical family form among most of the indigenous peoples in North America. For example, Indian fathers often did not live with their children. An Indian boy usually lived with relatives of his mother in a longhouse or hogan, and the mother’s brother, rather than her husband, was likely to have a more prominent role in raising the boy. In addition, the activities of hunting, trapping, and engaging in war or diplomacy often kept men away from
the village, and so the community played an active part in bringing up children. Conversely, white colonial fathers usually played a central role in the lives of their children, especially their sons. Throughout the colonies, fathers commonly headed their families as patriarchs, providers, and teachers. Accordingly, a boy growing up in the American colonies typically looked to his father for much of his academic and religious education and later his vocational direction and training (Griswold 1997). In the rare case that his parents divorced, a son almost always lived with his father rather than his mother. Once the boy had grown up, it was the father’s prerogative to pass on the farm to him, thus giving him a degree of self-reliance. But many young men left the settled lands of their families and moved to their own land or to the cities. The colonial father in prerevolutionary America is often characterized as the stern Puritan father, but Philip Greven (1977) describes three styles common to colonial parenting: the stern style of the evangelicals (who liberally used corporal punishment); the moderate style characterized by love and duty, not fear (Thomas Jefferson and Benjamin Franklin were counted in this group); and the permissive style. There is evidence that stern patriarchy also held sway in the homes of fathers in the agricultural South and Latino fathers in the American Southwest, but southern white fathers of the planter class were more likely to have been permissive (Pleck and Pleck 1997). Forced to live lives very different from those of their white masters, African American slave fathers were obviously restricted in their desires to be providers and protectors of their families. As Pleck and Pleck (1997) describe, both fathers
and mothers taught important lessons in surviving life as a slave, such as when to acquiesce and when to fight. The stakes were high: the master had power to sell one or more members of a family. Some fathers who lived on neighboring plantations received permission from the master to visit their families once a week or on holidays. A father who was refused permission to see his family might try to run away with them. However, an African American boy was more likely than not to live with his father, with whom he spent evenings and what little free time there was learning to hunt and fish. During the 1800s, the cultural focus shifted to mothers and away from fathers (Griswold 1997). The young democracy now relied heavily on mothers to teach their children (especially sons) the proper character traits needed in the new republic. In addition, the philosophical ideas of the Enlightenment, which helped drive the American and the French Revolutions, had an important impact on American fatherhood (LaRossa 1997). With a new emphasis on rational thinking, democracy, and individual freedom, the culture of fatherhood became more egalitarian. The roots of today’s “new fatherhood” are apparent in postrevolutionary ideas of fathers as less authoritarian, more sensitive, and democratic. The rise of an industrial society in the North had the biggest cultural impact on nineteenth-century American fatherhood, however. Fathers began leaving the home in increasing numbers to work in factories and offices in urban settings, which put greater emphasis on the notion of father as a “good provider” (LaRossa 1997). Long hours away from home became the mark of nineteenth-century fatherhood. It should be noted that rural fathers, too, were sometimes
required to leave the home for long periods of time, also for economic reasons. And industrialization did not inevitably distance fathers. For example, Amos Lawrence, a Boston merchant, was often visited by his children in his office, and the family usually took lunch together at home. Still, compared to boys living before the Revolutionary War, a boy growing up in the 1800s tended to spend much less time with his father and more time with his mother. The father’s absence from the home made parenthood rather one-sided by giving mothers the majority of the parenting responsibilities. In this way industrialization marginalized fathers. Although these trends were not universal in damaging fatherhood, it is undeniable that their effect on the father’s involvement with his children was detrimental. Perhaps as a result, moral messages for good fathering were sounded even in the beginning of the nineteenth century. Robert Griswold (1997) writes of the great Christian revivals of the 1820s and into the 1840s that included calls for fathers to adhere to the ideals of “Christian fatherhood.” In addition, some nineteenth-century moral advisers and medical practitioners considered responsible fatherhood essential to true and honorable manhood. The rise of the suburban middle class during the 1800s further built on the ideas of the Enlightenment, calling for a new definition of fatherhood as kinder, more deliberate, and separated from work, as industry removed many day-to-day father-child interactions integral to rural life. In the early 1900s, two fathering roles became more dominant: that of pal and that of male role model. Magazines and social scientists advanced the notion of father as tutor, chum, and outdoorsman,
which was made possible by suburban backyards, middle-class office hours, and enough income to allow leisure time (Pleck and Pleck 1997). Additionally, part of a father’s duty during this era was to ensure that his son was not overly feminized by the large amount of time spent with his mother. Although mothers were still the primary parent to both boys and girls, mothers’ feminine influence was to be properly offset by the presence of a strong masculine role model. American ideals of fatherhood also changed significantly between the two world wars. Ralph LaRossa (1997) describes how the role of father was institutionalized as economic provider, pal, and male role model all rolled into one, even though each of these roles already existed at various times. In addition, magazines and social scientists framed parenthood as learned rather than innate. The idea of parenthood as a learned activity invited more participation from fathers—they could learn just as mothers could learn—and allowed discussion regarding which parenting strategies might work best. The postwar affluence of the 1950s and 1960s helped fathers find their way back into the home, even if only in the evenings and on weekends. The latter part of the twentieth century witnessed the progress of feminism, an influx of women into the workforce, and continued influence from the social sciences. Although some mothers were still at home when boys came home from school at the end of the day, many more mothers were likely to have a paying job than not. Accompanying the changes in women’s lives was the notion that an ideal father was no longer merely involved: he was to be an equal in caring for children. Fathers were now to be co-
parents with mothers; less emphasis was placed on strict gender roles. By the beginning of the twenty-first century, social science research had empirically documented what many suspected for centuries: that fathers’ influence and involvement are important to children’s development and well-being. Children tend to do better cognitively, socially, and academically with their father in the picture (Lamb 1997; Pruett 2000). As early as World War II, research showed that boys whose fathers had not gone to war developed better relations with their peers; conversely, father absence was associated with school and peer difficulties (see Parke 1996). Current research confirms these and other findings. For example, children whose fathers engage them in high levels of positive physical play tend to have better intellectual development, less aggression, and better social skills. Research has also confirmed that many men want to be involved with their children and that such involvement is good for them (Pruett 2000; Snarey 1993). But factors such as American individualism and materialism and high rates of divorce and unwed births that helped create the current crisis in fatherhood (Blankenhorn 1995) still exist and continue to undermine fatherhood. The current picture of fatherhood is more complex than ever. A great number of fathers want to be more involved in the lives of their children. But even after new initiatives such as the Family and Medical Leave Act (1993), which guarantees fathers and mothers unpaid time away from work after the birth of a baby, economic constraints and unwritten corporate rules still discourage increased involvement (Hochschild 1997). For many men, the questions of how and where to
A father reads to his two sons on the couch, ca. 1940 (Library of Congress)
get involved are difficult to answer. In some cases, increased involvement on the part of fathers still feels invasive to mothers and their roles as nurturers (Allen and Hawkins 1999). Of particular concern is the separation of marriage and parenthood. Father involvement typically drops dramatically after a divorce. Similarly, fathers who are not married to the mothers of their children struggle to be involved in their lives over time. Moreover, the negative effects of father absence are usually longer-lasting for boys than for girls (Parke 1996). There is strong scientific evidence that a good relationship between parents has a positive impact on children’s social and emotional adjustment and that conflict between parents is linked to children’s
aggression, depression, and anxiety and to poor parent-child relationships (Cummings and Davies 1994). Clearly, a healthy, stable marriage will help fathers to establish and maintain involvement with both sons and daughters. Also, economic circumstances still have an impact on fatherhood today. A man’s ability to support his children remains at the core of the meaning of a good father. Men who struggle with getting and maintaining employment will also struggle with assuming their paternal responsibilities. Accordingly, healthy economies and programs that increase men’s labor force skills will strengthen fatherhood. Many recognize that a great majority of men value fatherhood and want to be good, involved fathers
(Doherty 1998). Men’s desires to be good fathers, coupled with the current directions in research, may help provide men with clearer pathways toward overcoming and even transcending obstacles to satisfying, involved fathering—to their own benefit as well as that of their sons and daughters.
Kay P. Bradford
Alan J. Hawkins

See also Discipline; Divorce; Fathers, Adolescent; Mothers

References and further reading
Allen, Sarah M., and Alan J. Hawkins. 1999. “Maternal Gatekeeping: Mothers’ Beliefs and Behaviors That Inhibit Greater Father Involvement in Family Work.” Journal of Marriage and the Family 61: 199–212.
Blankenhorn, David. 1995. Fatherless America: Confronting Our Most Urgent Social Problem. New York: Basic Books.
Cummings, E. Mark, and Patrick Davies. 1994. Children and Marital Conflict: The Impact of Family Dispute and Resolution. New York: Guilford Press.
Doherty, William J. 1998. The Intentional Family. Reading, MA: Addison-Wesley.
Greven, Philip J., Jr. 1977. The Protestant Temperament: Patterns of Child-Rearing, Religious Experience, and Self in Early America. New York: Alfred A. Knopf.
Griswold, Robert L. 1997. “Generative Fathering: A Historical Perspective.” Pp. 71–86 in Generative Fathering: Beyond Deficit Perspectives. Edited by Alan J. Hawkins and David C. Dollahite. Thousand Oaks, CA: Sage.
Hochschild, Arlie R. 1997. The Time Bind. New York: Metropolitan Books.
Lamb, Michael E. 1997. The Role of the Father in Child Development. 3d ed. New York: John Wiley and Sons.
LaRossa, Ralph. 1997. The Modernization of Fatherhood: A Social and Political History. Chicago: University of Chicago Press.
Parke, Ross D. 1996. Fatherhood. Cambridge, MA: Harvard University Press.
Pleck, Elizabeth H., and Joseph H. Pleck. 1997. “Fatherhood Ideals in the United States: Historical Dimensions.” Pp. 33–48 in The Role of the Father in Child Development. 3d ed. Edited by Michael E. Lamb. New York: John Wiley and Sons.
Pruett, Kyle D. 2000. Fatherneed. New York: Free Press.
Snarey, John. 1993. How Fathers Care for the Next Generation: A Four-Decade Study. Cambridge, MA: Harvard University Press.
Fathers, Adolescent

A teen father is a male who becomes a parent between the ages of twelve and twenty years. The subject of teenage fatherhood received widespread interest in the United States after the rates of premarital adolescent pregnancy and parenthood steadily increased during the 1980s and early 1990s. Concern for the well-being of unwed teen mothers and their children prompted research on the male partners of adolescent mothers. The results of this research present a complex picture of boys who become fathers during their teenage years. Only about one-third of the males involved in an adolescent pregnancy are teenagers; the rest are adult males who are at least twenty years old. Although teen fathers come from all socioeconomic levels, they are overrepresented among the poor. A disproportionate number of African American and Hispanic boys become teen fathers, but the majority of the young men who become fathers during their teenage years are non-Hispanic whites. Teen fathers appear to be slightly overrepresented among adolescents who engage in delinquent acts; however, the majority of teen fathers are not juvenile delinquents. Compared to men who delay becoming fathers until their adult years,
teen fathers are more likely to drop out of school, complete fewer years of school, and earn higher average salaries during their teen years but lower average salaries during their adult years. In all other respects, teen fathers appear to be more alike than unlike their peers who are not fathers. Boys who become fathers during their teenage years—especially unwed young men—violate a prevailing U.S. cultural norm that fatherhood should be delayed until males are married adults. In an effort to explain the dramatic rise in the rate of premarital adolescent parenthood that occurred in the United States during the latter decades of the twentieth century, several theories have been proposed. Some experts believe that early sexual activity reflects a decline in family and moral values in our culture. Others suggest that a lack of knowledge about reproductive processes, pregnancy prevention measures, and the consequences of early childbearing plays a major causative role in teen pregnancy. Some teenagers may engage in sexual relations before they are emotionally prepared to do so because they have not developed assertiveness and good decision-making skills. A lack of access to contraceptive and abortion services has been suggested as a reason that some teenagers fail to prevent and terminate unplanned pregnancies. Some pregnancies might represent reactions to a sense of a lack of opportunity and power; that is, poor teenagers may drift into parenthood as a means to enhance an existence devoid of promising life opportunities. Welfare has often been cited as an economic system that rewards young women for becoming mothers. The media has been charged with contributing to increased unprotected sex among teenagers by depicting
sexually explicit behavior in the absence of birth control. Several other explanations for premarital adolescent pregnancy have been suggested. Some researchers believe that the tendency to become teenage parents runs in particular families and is passed on through observational learning and socioenvironmental constraints. Or an adolescent boy may unconsciously see a child as an object that can help him meet needs of symbiosis and individuation. Some teenagers may become parents as a mechanism for escaping from an unstable and hostile family. Finally, teenage parents may lack internal controls to inhibit the expression of sexual impulses, or adolescents with poor self-esteem may engage in sexual activity in an attempt to bolster their personal worth. Each of the postulated causes of premarital teenage parenthood has some empirical support, suggesting that teenage pregnancy probably results from some combination of factors. Most authorities on the subject of adolescent parenthood have argued persuasively that simple cause-and-effect relationships are inadequate explanations for adolescent pregnancy. Rather, examination of the complex interplay of multiple socioeconomic and psychological factors must be the direction of future attempts to understand and help teen fathers. Teen fathers are the victims of massive misunderstanding. A common societal stereotype about adolescent fathers is that they are sociopaths who sexually exploit and impregnate adolescent girls and then abandon them and their babies. Although this image accurately depicts some teen fathers, the majority of young fathers do not fit this stereotype. On the contrary, most teen fathers know little more than their partners or nonfather
peers about sexuality and reproduction. Furthermore, teen fathers tend to experience intimate feelings toward teen mothers and their babies and remain involved, physically or psychologically or both, throughout the pregnancy and childbirth experience. Most teen fathers also provide emotional and financial support to their partner and child at least through the first year of the child’s life, although the quality of the relationship between teen fathers and their partners and children appears to deteriorate over time. Since most adolescent fathers do make a genuine attempt to support their partners and children, why are they viewed so pejoratively? The answer to this question resides in the flaws of research practices by social scientists studying the subjects of adolescent pregnancy and parenthood and the attention that the media has given their findings. Researchers have often made sweeping generalizations about teen fathers based on the results of case studies involving highly maladjusted teen fathers or larger studies in which the bulk of the subjects were actually adult men. The flawed conclusions of these studies were filtered through the media, thereby creating harmful public misconceptions about boys who become fathers while teenagers. Adolescent fathers tend to experience a host of adjustment difficulties associated with early paternity. These include the following: a wide range of troubling emotional reactions to the pregnancy, including depression, anger, and denial of responsibility for the pregnancy; conflicts regarding decisions pertaining to abortion and adoption; conflicts with the teen mother and her family and a related denial of access to the child; concerns about their competence as a parent; declining contact with the child over time; failure at school,
which may lead them to drop out; legal concerns; relationship changes with peers; and long-term career dissatisfaction, employment worries, and financial hardships. Recognizing that teen fathers need professional assistance in order to address these problems, advocates for teen fathers have urged counselors to develop service programs that are tailored to the needs of this population. Recommended programs include crisis pregnancy counseling, including abortion and adoption counseling; parenting skills training with an emphasis on the father’s role in child development; legal advice; family and couples counseling; recreational services; educational and career counseling; and job training and placement services. Biases against teenage fathers are manifested in programs for teenage parents that tend to include medical, educational, and psychological services for teenage mothers but not for teenage fathers. Data from numerous studies examining the status of services for teenage parents indicate that most federal, state, and local social service programs for adolescent parents have focused on the needs of teenage mothers, but only a few programs with limited services have been provided to teenage fathers. Thus, in spite of research indicating that teen fathers want assistance with the transition to parenthood and in spite of repeated assertions by numerous child advocacy groups, such as the Children’s Defense Fund, that teenage fathers are an at-risk group who require many of the same services deemed to be essential for teenage mothers, it appears that the needs of teenage fathers go largely unnoticed by service providers. These findings suggest that society is giving teenage fathers a mixed message: we expect you to become a responsible parent, but we will
not provide you with the guidance for how to become one. Although services for teen fathers are rare, several model programs have been developed. The Public/Private Ventures’ Young Unwed Fathers Pilot Project targeted 155 fathers at six sites located throughout the United States. In Maine, 53 clients were assisted by the Maine Young Fathers Project, a statewide service program. The Maximizing a Life Experience (MALE) Group was administered to 8 students enrolled in a suburban high school. And a nationwide demonstration project called the “Teen Father Collaboration” served 395 adolescent fathers at eight different settings. Typically, a range of services were offered through these programs, including parenting and job skills training, educational planning, supportive counseling, life skills training, and legal advice regarding such matters as child support and establishing paternity. Analyses of the effectiveness of these programs suggest that young fathers appreciate receiving assistance with the transition to fatherhood. In addition, the participants experienced numerous positive benefits, such as increased school enrollment rates, employment rates, and knowledge regarding child support laws, legal rights and responsibilities, and birth control and pregnancy resolution options. Fathers who participated in these programs tended to get more involved in the prenatal care of their infants and in parental skills classes, interacted more often with their children, and provided greater financial support for the child. They were also more likely to establish paternity. Their children tended to have higher infant birth weights, and both father and child used wellness/sick care services more. In general, these boys seemed more likely to make use of avail-
able support systems, including public aid, job-readiness training, and fatherhood preparation curricula. They implemented plans to manage financial affairs more frequently and increased their declarations of paternity and child support payments. These boys also made more responsible use of birth control. Their interpersonal relationships improved, and their attitudes toward the child support system became more positive. In summary, most young fathers want help with the challenges of early paternity, and the provision of father-oriented service programs increases the odds that teen fathers will be caring, effective parents and productive members of society. Society owes it to these young men to think more carefully about boys who become fathers during their teenage years and to assist them with the challenges and responsibilities associated with adolescent fatherhood.

Mark S. Kiselica

See also Adolescence; Fathers; Sexuality; Transitions

References and further reading
Achatz, Mary, and Crystal A. MacAllum. 1994. Young Unwed Fathers: Report from the Field. Philadelphia: Public/Private Ventures.
Barth, Richard P., Mark Claycomb, and Amy Loomis. 1988. “Services to Adolescent Fathers.” Health and Social Work 13: 277–287.
Brown, Sally. 1990. If the Shoes Fit: Final Report and Program Implementation Guide of the Maine Young Fathers Project. Portland: Human Services Development Institute, University of Southern Maine.
Children’s Defense Fund. 1988. Adolescent and Young Adult Fathers: Problems and Solutions. Washington, DC: Children’s Defense Fund.
Elster, Arthur B., and Michael E. Lamb, eds. Adolescent Fatherhood. Hillsdale, NJ: Erlbaum.
Fagot, Beverly I., Katherine C. Pears, Deborah M. Capaldi, Lynn Crosby, and Craig S. Leve. 1998. “Becoming an Adolescent Father: Precursors and Parenting.” Developmental Psychology 34: 1209–1219.
Huey, Wayne C. 1987. “Counseling Teenage Fathers: The ‘Maximizing a Life Experience’ (MALE) Group.” School Counselor 35: 40–47.
Kiselica, Mark S. 1995. Multicultural Counseling with Teenage Fathers: A Practical Guide. Thousand Oaks, CA: Sage.
———. 1999. “Counseling Teen Fathers.” Pp. 179–198 in Handbook of Counseling Boys and Adolescent Males. Edited by A. M. Horne and M. S. Kiselica. Thousand Oaks, CA: Sage.
Klinman, Debra G., Joelle H. Sander, Jacqueline L. Rosen, Karen R. Longo, and Lorenzo P. Martinez. 1985. The Teen Parent Collaboration: Reaching and Serving the Teenage Father. New York: Bank Street College of Education.
Pirog-Good, Maureen A. 1996. “The Education and Labor Market Outcomes of Adolescent Fathers.” Youth and Society 28: 236–262.
Robinson, Bryan E. 1988. Teenage Fathers. Lexington, MA: Lexington Books.
Stouthamer-Loeber, Magda, and Evelyn H. Wei. 1998. “The Precursors of Young Fatherhood and Its Effect on Delinquency of Teenage Males.” Journal of Adolescent Health 22: 56–65.
Thornberry, Terence P., Carolyn A. Smith, and Gregory J. Howard. 1997. “Risk Factors for Teenage Fatherhood.” Journal of Marriage and the Family 59: 505–522.
Films

Boyhood has been an essential theme in film since the medium’s earliest days, symbolizing the hopes, dreams, and values that a culture associates with its children. From the comedies and melodramas of the silent era to the action-oriented, special-effects-laden extravaganzas of the twenty-first century, boys have found a place. Occupying a variety of roles and images, they serve as entertaining movie characters as
well as serious commentators on family, community, and social issues. In 1895, Auguste and Louis-Jean Lumière, brothers whose handheld camera enabled them to leave the confines of a studio, capturing daily life and creating the world’s first documentaries, presented Watering the Gardener. In this film, a boy plays a trick on a gardener by placing his foot on the gardener’s hose to stop the water flow and then lifting it quickly when the gardener raises the nozzle to investigate, causing a sudden burst of water to hit the gardener’s face. The gardener then runs after the boy, catches him, and spanks him in what is played as a comic gesture. This short, simple film underscores the playful, mischievous quality of boyhood that figures prominently in later onscreen images. The innocence of childhood—as seen in the portrayal of the happy, spirited boy and its opposite, the troubled, searching boy—permeates boyhood characters in American cinema. In silent films, girls were more populous than boys, reflective of the era’s predilection for the innocent child in peril in classic melodramas and for the plucky, tomboy roles of “America’s Sweetheart,” Mary Pickford; nevertheless, there were notable images of boys. In 1921, Charlie Chaplin released The Kid, his first feature-length film, which costarred Jackie Coogan. In this film, an unwed mother carefully places her infant son in the backseat of an expensive automobile, hoping to give her child a life that she cannot provide. Chaplin’s Tramp happens upon the boy and, after unsuccessfully trying to get rid of him, takes the child home to raise as his own, offering him love and guidance. The Tramp plays the doting father, tenderly teaching the child his own strategies for survival
until the authorities, believing that he is an unfit parent, try to take the child away. The Kid in this film functions as a pint-sized version of the Tramp; he is both spunky and good-hearted, ready to fight the town bully, break windows so the Tramp can earn a living by following along to repair them, and prepare his father’s breakfast. Reflecting Chaplin’s own concerns following his childhood experiences in a London workhouse, the film introduces biting social commentary. At a time when a controversy raged regarding the benefits of institutionalism versus parental custody for impoverished children, Chaplin takes a stand, denouncing dogmatic social service practices and espousing the importance of the father-son bond. The Kid’s success paved the way for comedy-short producer Hal Roach to begin a child-centered series of his own, Our Gang, featuring some of the most memorable boys in film: Spanky, Alfalfa, Stymie, and Buckwheat. Roach’s inspiration for Our Gang was a group of children playing in a lumberyard across the street from his studio. Engrossed in the banter of this typical neighborhood gang, he vowed to make a show with real kids—not actors—confronting in a humorous way the day-to-day adventures and trials of childhood, such as dealing with a new teacher. The Our Gang series benefited from the naturalness and spontaneity of its multicultural ensemble cast and likable dog and stressed that poor children, unrestrained by the decorum of the rich, could really have fun. This theme proved popular with both Depression and wartime audiences; thus Our Gang, which began in 1922, ran until 1944, traversing both the silent and sound eras and accumulating a total of 221 episodes. It jump-started the acting
careers of Spanky McFarland, Jackie Cooper, and Dickie Moore and provided the prototype for a later television show, The Little Rascals. During the Depression years, approximately 60 to 75 million Americans flocked to the movies every week to escape the uncertainty of their lives and find powerful messages of hope and survival, and child-centered productions often filled the bill (Bergman 1971, xi). Although the remarkable oeuvre of Shirley Temple dominated what came to be regarded as the child-star era, films featuring American boyhood attracted audiences as well. Finding comfort in the past, moviegoers admired the cinematic versions of several boyhood literary classics, among them Little Lord Fauntleroy (1936), David Copperfield (1935), and Kidnapped (1938), all starring British actor Freddie Bartholomew as the proper, perfect child, loyal and kind to those he loved. In Little Lord Fauntleroy, Bartholomew plays the role of Ceddie of Brooklyn, who upon the death of his father finds himself transported across the ocean to England to be renamed Lord Fauntleroy and groomed as the next Earl of Dorincourt. His grandfather, the reigning earl, banishes Lord Fauntleroy’s beloved mother from Dorincourt Castle because she is from a lower class, and it is up to the child to teach him acceptance. Eventually, the earl acquiesces, proclaiming his love for Lord Fauntleroy, asking his mother’s forgiveness, and inviting her to live with them. Representative of Bartholomew’s other roles, in this film he plays a fixer who solves the problems of those around him. His story also underscores the mother-son bond and the resolution of class differences, which figure prominently in many later films.
Jackie Coogan with Charlie Chaplin in The Kid, 1921 (Kobal Collection/First International/Charles Chaplin)
In his most successful role, in the 1937 film Captains Courageous, Bartholomew plays Harvey, a spoiled, self-centered, rich boy who, following the death of his mother, is left to be raised by his workaholic father. Following an accident at sea, a kind fisherman named Manuel (Spencer Tracy) rescues Harvey and becomes his friend, teaching the boy kindness and compassion. The film addresses the boy’s changing value system and his finding a hero. When Manuel dies, Harvey, now transformed, reunites with his father but remembers the lessons in life that the simple fisherman taught him. This theme of hero worship characterizes other boyhood movies of the 1930s, some of them featuring popular child star
Jackie Cooper, known for his tough exterior and inner sensitivity. In The Champ (1931), Dink (Cooper) rejects a lavish lifestyle with his mother in order to be with the father he idolizes, a former boxing heavyweight champion (Wallace Beery) now down on his luck and living a seedy life in Tijuana. In Treasure Island (1934), Cooper, as Jim Hawkins in Robert Louis Stevenson’s literary classic, befriends pirate Long John Silver and refuses to abandon him even when he discovers his treachery. In these films, the image of boyhood is marked by innocence and loyalty to one’s heroes, a fitting message for Depression families struggling through hard times. Boys in films of this period also reinforced their innocence by their affinity with animals, especially dogs. Playing the role of Skippy in the 1931 movie of the same name, which was nominated for an Academy Award for best picture, Cooper tearfully mourns the death of his friend’s dog and later trades his bicycle for another dog. As the first major child star of the talking era, Jackie Cooper established the persona of a child seeking love and meaningful relationships with parents, substitute parents, and animals. Perhaps no film better underscores the importance of boyhood connections than Orson Welles’s masterpiece, Citizen Kane (1941). The flashback scene showing Kane as a boy occupies only a small part of the film, but it is central to the film’s message. Although Kane has achieved the American dream of wealth and success, at the time of his death he remains searching and unfulfilled. His last word, “Rosebud,” which a journalist tries unsuccessfully to understand, refers to Kane’s boyhood sled, a symbol of his childhood. As a young boy, Kane is thrust from his home in Colorado to find greater
opportunities in the East, but his separation from his mother and home causes permanent damage, preventing Kane from ever being able to love or commit. As World War II broke out, the wholesome teen escapades of Mickey Rooney in the Andy Hardy films and the mean-spirited pranks of a group of juvenile delinquents called the Dead End Kids attracted American audiences, suggesting that movie tastes were growing up. Nevertheless, cinematic images of boys continued to appear, although tinged with a deeper seriousness that reflected the times. In How Green Was My Valley (1941), directed by John Ford, Roddy McDowall as Huw Morgan reflects on his idyllic childhood in a small Welsh mining town. However, times have changed. After mining disasters claim the lives of his older brothers and father and his family becomes impoverished, Huw realizes that he must leave the traditional life that generations before him have known and expected and begin anew. The movie addresses themes of nostalgia, loss, and maturity, emphasizing the life-altering events that signal the end of one's childhood. The Yearling (1946), starring Claude Jarman, Jr., as Jody and Gregory Peck as Pa, reflects similar ideas. Set in rural Florida in 1878, The Yearling begins with Pa's being bitten by a venomous snake and having to kill a doe in order to use its heart and liver to draw out the venom. Concerned about the fawn the doe has left behind, Jody insists on taking it home and raising it. Lonely himself, Jody becomes inseparable friends with the fawn, which he names Flag, until it becomes apparent that the wild creature cannot be prevented from destroying the family's crops and livelihood. Pa reluctantly orders that Flag be shot, and in an act of defiance, Jody runs away. When he returns,
Pa remarks to Ma (Jane Wyman) that the boy is "not a yearling anymore." He has matured and is ready to assume his father's sense of responsibility in caring for the land and the family. Further, Jody's mother, who had always been cool and distant toward him out of fear that she might someday lose him, is finally able to demonstrate her love. In addition to The Yearling, other key films of the period link boys to nature and animals. The Jungle Book (1942), starring Sabu, recounts Rudyard Kipling's classic tale of the Indian boy raised by wolves; it inspired an animated version released by the Walt Disney Studio in 1967. In 1943, Roddy McDowall appeared in two films, Lassie Come Home, in which a boy's love for a dog overcomes all barriers, and My Friend Flicka, in which a child learns about life by taming and caring for a rebellious horse. In these films, boys' relationships with animals underscore the innocence of childhood and the oneness that children share with nature before they are tamed by society. Boys in 1940s films also reflect wartime sensibilities. Hitler's Children (1943) alludes to Hitler's belief that training young boys in the Nazi ideology is the key to his country's future and poses the chilling question of what such a society would be like. The antiwar film The Boy with Green Hair (1948) features Dean Stockwell as a war orphan who encounters cruelty before those around him learn acceptance, tolerance, and the devastating effect of war on children. This film, like many others, uses children to explore prejudice and how it is learned. Between 1946 and 1964, postwar Americans produced the greatest baby boom the world has ever known, resulting in the births of 76 million children. Although more Americans were busy
“Spooky Hooky”: an episode from the Our Gang series (Photofest)
having children, youthful images on the screen declined, marking the end of the child-star era. However, in 1953, an extraordinary film appeared, one destined to become a classic of American boyhood. Shane, based on the western novel by Jack Schaefer and directed by George Stevens, expresses the point of view of Joey Starrett (Brandon de Wilde), an only child whose parents have moved to Wyoming with the intention of building a homestead and farming the land, much to the dismay of local ranchers who try to drive them away. From out of the distance, a tall stranger dressed in buckskin appears, identifying himself simply as “Shane” and establishing himself as a
hero in Joey’s eyes. Shane ultimately defends the Starretts’ claim to the land by defeating the ranchers and their hired gunfighter in a shoot-out. However, Shane then must leave, realizing that as a gunfighter, he can neither escape his past nor fit into the homesteaders’ domesticity. Quiet and observant, Joey watches Shane move in and out of his life, reflecting on all Shane teaches him about friendship, honor, and the civilization of the West. Establishing the child as an observer and interpreter of adult life remains one of Shane’s greatest contributions to American film. This strategy proves equally effective in a later film, To Kill a Mockingbird (1962), in which a little girl
named Scout (Mary Badham) recounts the events in Maycomb County, Alabama, in the summer of 1932 when she and her brother Jem (Philip Alford) learn important lessons about racism, prejudice, and their father's special brand of heroism. By the 1950s and 1960s, perhaps no other studio had done more to popularize images of boyhood than Walt Disney. In animated feature films as diverse as Pinocchio (1940), Peter Pan (1953), The Sword in the Stone (1963), and The Jungle Book (1967), Disney created boyhood characters who grew up, learning moral lessons along the way. The same could be said of Disney's live-action output of this period, including Treasure Island (1950), Old Yeller (1957), The Shaggy Dog (1959), Toby Tyler (1960), and Mary Poppins (1964). In all these films, set in earlier, simpler times, traditional portrayals of boyhood innocence and playful mischief predominate, and the importance of family and friends is espoused. At a time when the population of children in the United States was at an all-time high, movies painted a picture of boyhood saturated with adventure. By the 1970s, this image gave way to a disturbing trend, the boy as monster, apparent in horror films such as Village of the Damned (1960), The Omen (1976), Damien—Omen II (1978), The Final Conflict—Omen III (1981), The Other (1972), Halloween (1978), and Children of the Corn (1984). In marked contrast to the sensitive, searching children of earlier times, these boys kill and feel no remorse. This sinister portrayal of boyhood reflects tensions in American society regarding changing gender and family roles and pressures of childrearing, increased technological dependence and the dissolution of community, and awareness of the possibility of a nonexistent
future brought about by nuclear annihilation. Inherent in the cinematic family, then, is the catalyst for evil. Although most films of the 1980s do not portray boys negatively, they nevertheless create a more serious, probing trend, often emphasizing the stresses inherent in the breakdown of the nuclear family. In Kramer vs. Kramer (1979), seven-year-old Billy Kramer (Justin Henry) feels abandoned when his mother leaves to "find herself" and pursue a career in another state and his father attempts to raise him. Gloria (1980) features a street-smart but innocent little boy who loses his entire family following a mob bloodbath and goes to live with a gangster's ex-moll. Stanley Kubrick's The Shining (1980), based on the best-selling novel by Stephen King, presents the horrific image of a father gone mad who chases his son through a frozen garden maze with an ax. Finally, Stand by Me (1986) recounts the experiences of four troubled youths with various family problems who discover a dead body in their little town of Castle Rock, Oregon, during the summer of 1959. Many movies by Steven Spielberg, arguably the most popular director of recent decades, also address family disintegration and its effects on boys' lives. In The Sugarland Express (1974), Spielberg tells the story of a mother with a criminal record, deemed an unfit parent, who stops at nothing to get her son back. This theme of mother-son separation is reinforced in Close Encounters of the Third Kind (1977), where three-year-old Barry Guiler (Cary Guffey), attracted to the lights and sounds of visiting aliens, is spirited away in a spacecraft, and his mother tries to retrieve him. In E. T., the Extra-Terrestrial (1982), Elliott (Henry Thomas), the middle child in a family recently abandoned by
A scene from Kramer vs. Kramer with Dustin Hoffman and Justin Henry, 1979 (Kobal Collection/Columbia)
the father, feels isolated and sad until he meets E. T., an extraterrestrial child left on Earth by mistake when his spaceship takes off without him. Recruiting his siblings and other neighborhood children, Elliott helps E. T. to "phone home" and defies authorities by transporting the alien to the spaceship sent to rescue him. This film, a variation on the child-animal tale that has become a classic in its depiction of boyhood, emphasizes mother-child and sibling bonds, friendship, and the magic of childhood, exemplified by Spielberg's signature image of Elliott on his bike racing through the darkened sky across a glowing moon. A later Spielberg film, Empire of the Sun (1987), based on J. G. Ballard's autobiographical novel, recounts the World War II story of a boy who lets go of his mother's
hand in a crowded train station and ends up stranded in Japanese-occupied China. If girls attained popularity in the early decades of film for their depictions of harmony, by the turn of the twenty-first century, boys, providing images of conflict, predominated. Several of the highest-grossing films of the 1990s featured boys in key roles. Most notable are Home Alone (1990) and its sequel Home Alone 2: Lost in New York (1992), both starring Macaulay Culkin in the role of Kevin McCallister. The plot of both films is the same: Kevin, who feels neglected in the midst of his large, bustling, affluent family, is inadvertently left behind and must fend for himself, fighting off thieves and intruders in a slapstick-comedy style. This movie underscores two themes of modern life, both prompted by changes in the American family due to divorce, single parenthood, and two-career couples: first, the child at home alone and second, the empowered, savvy child. Although street-smart children in movies were nothing new, Kevin struck a chord with American audiences, making him one of the most recognizable boys in film history. Other popular 1990s films with central boy characters include Jurassic Park (1993), the adventure of a brother and sister in peril in a dinosaur theme park; Star Wars: Episode I—The Phantom Menace (1999), the prequel to the Star Wars trilogy featuring Darth Vader as a child; The Sixth Sense (1999), a horror tale showcasing a boy with extrasensory perception who sees dead people; Angela's Ashes (1999), the story of a boy's impoverished childhood in Ireland; and My Dog Skip (2000), a boy's nostalgic memory of growing up in Mississippi with his beloved dog. Reflective of each era's preoccupations, boy-centered films from the child-star era
often addressed the social issue of class, whereas later films focused on the family struggle of a child growing up and trying to find a secure place despite change and separation. As movies enter their second full century, audiences continue to be enthralled by images of boyhood in their many dimensions, representative of the American obsession with youth and hope for the future.

Kathy Merlock Jackson

See also Horror Films

References and further reading
Aylesworth, Thomas. 1987. Hollywood Kids: Child Stars of the Silver Screen from 1903 to the Present. New York: Dutton.
Bergman, Andrew. 1971. We're in the Money: Depression America and Its Films. New York: New York University Press.
Cary, Diana Serra. 1979. Hollywood's Children: An Inside Account of the Child Star Era. Boston: Houghton Mifflin.
Goldstein, Ruth M., and Edith Zornow. 1980. The Screen Image of Youth: Movies about Children and Adolescents. Metuchen, NJ: Scarecrow Press.
Jackson, Kathy Merlock. 1986. Images of Children in American Film: A Sociocultural Analysis. Metuchen, NJ: Scarecrow Press.
Kincheloe, Joe L. 1997. "Home Alone and 'Bad to the Bone': The Advent of a Postmodern Childhood." Pp. 31–52 in Kinderculture: The Corporate Construction of Childhood. Edited by Shirley R. Steinberg and Joe L. Kincheloe. Boulder, CO: Westview Press.
Sinyard, Neil. 1992. Children in the Movies. New York: St. Martin's Press.
Fire Companies

Boys have idolized firemen since fire companies were first organized in the
United States. The contemporary relationship between fire companies and boys, however, has its roots in the nineteenth-century struggle over the municipalization of firefighting in the urban United States. Before the late 1850s, all fire companies were composed of volunteers. These firemen generally owned their own firehouses and behaved in ways considered indecorous by many by midcentury. The attraction of boys to these sometimes rowdy volunteer fire companies helped produce widespread support for the change to paid firefighting in large U.S. cities, starting at the close of the 1850s. Newly professionalized fire companies carefully monitored both the behavior of their paid firemen and the access of boys to their firehouses. Firefighting was therefore made safe for boys by becoming a future adult career rather than an immediately accessible activity. The popularity of firefighters and fire engines among children is nearly universal. When queried about their future career goal, young boys regularly place firefighting at the top of their list. Children thrill to the appearance of firefighters in parades, and both families and elementary school classes tour neighborhood firehouses. There are currently more than fifty fire- or firefighting-themed children’s books in print, and early examples of this genre can be dated to the late nineteenth century. The mass appeal of firefighting to children is attributable to a number of factors. The size, color, speed, and sound of the fire engine attract the attention of children, as does the regular appearance of the firehouse dog, the Dalmatian, in parades and at the firehouse. Children are impressed by the uniform of the firefighter, and especially by the firefighter helmet, small versions of which are often distributed by fire companies to
Dennis Ellis, age three, and his father sit at the wheel of the Harlow Fire Brigade fire engine, 1935. (Hulton-Deutsch Collection/Corbis)
young fans. The most significant attraction of firefighting, however, is the nature of the work firefighters do. In most cases, children simply reflect, in amplified form, the general respect accorded by society to the heroic actions and extreme bravery of the firefighter. Although some public ambivalence over the performance of the police is standard, firefighters are almost always beloved by their communities. Most young boys find firefighters to be truly heroic figures worthy of emulation. Whether the increasing integration of female firefighters into fire companies will result in a desire by young girls to likewise emulate firefighters remains to be seen. As long as gender divisions among children continue in their current form, the physical exertion central to firefighting will probably retain a stronger appeal to boys, who are generally raised to value physical prowess to a greater degree than are girls. The earliest fire protection in the United States was provided by homeowners, who were required under town law to keep buckets on hand and help out in the case of fire. By the late eighteenth century in cities including Philadelphia and New York and by the early nineteenth century in most other established municipalities, volunteer fire companies
had taken control of fire protection out of the hands of the general citizenry. These fire companies quickly attracted the praise and regard of neighbors for their unpaid and apparently selfless efforts to control the frequent fires that afflicted towns. Fire companies appeared regularly in parades and often opened rooms of their houses to the community, providing ample opportunity for boys to see the firemen both at work and leisure. In the 1850s, Nathaniel Currier and James Merritt Ives produced several popular series of prints documenting the heroic efforts of urban volunteer firemen, including two prints, both titled "The Little Fireman," which celebrated the boyhood infatuation with the fire department. In each print the "little fireman" sports a miniature version of the firefighter uniform, complete with a hat identifying the wearer as "Chief Engineer." These prints testify to the long history of the tight bond between boy and fire company. The attraction of boys to fire companies and especially to the firehouse has not always resulted in the approval of parents. Complaints about the behavior of volunteer firemen, including accusations of public drunkenness, racing of fire engines, and fighting, became widespread by the 1830s and 1840s in large cities and helped catalyze efforts at municipalizing this public service. The frequent appearance of boys at fires and in the firehouse in the middle decades of the nineteenth century helped cast the firemen's indecorous behavior in a particularly ominous light. Most companies had young followers who ran with the engines to fires and socialized at the firehouses. Legally, they were allowed to do no more than tag along because volunteer fire departments have historically stipulated that no one
under eighteen, or in some cases no one under twenty-one, could join a company. These rules were not fully enforced, however. In 1854 a member of San Francisco's Columbia Engine Company was killed when a dog tripped him and his head and body were crushed under the engine. In their testimonial, the company mourned the loss of a youth "scarce eighteen years of age," but a local newspaper reported his actual age as sixteen (Alta California, March 12, 1864). The 1856 membership list of St. Louis's Franklin Fire Company included several eighteen-year-old members who had already belonged to the company for a year. One eighteen-year-old, Aug. Hefner, was formally listed as having joined the company at age sixteen. Anecdotal evidence shows that younger boys also socialized at the firehouse and "ran with the engines" to fires (Franklin Fire Company 1856). As parenting grew in importance among evangelical Christians in the nineteenth century, the seeming threat posed by the fire department also increased. Men in this period spent more time at work and away from the home, and childrearing patterns in America changed, becoming more gender-specific and subject to greater attention and concern for both husband and wife. Critics like Theodore Dwight bemoaned the new neglect of childrearing duties among fathers, and Catharine Beecher, in her guidebook for new brides, American Woman's Home, advised overbusy fathers to conscientiously subtract time from their business in order to interact with their children. Domestic mothers more than made up for any loss of paternal attention their children may have experienced. Childrearing was approached by Christian women with a new fervor
and intensity. Indeed, their foremost duty within the domestic sphere was to guide and shape the morals and character of their children. Thus the presence of a child in the firehouse rather than by the family fireside became a matter of greater concern by the mid-nineteenth century than it had been in earlier times. Corruption of youth was one of the most frequent and powerful accusations made against fire departments. Critics of volunteer fire companies claimed that youth gangs were attracted to and harbored by some companies and that these young gang members sought out engine houses as locations for staging crimes against innocent bystanders. Firemen denied these accusations but had more difficulty refuting the claim that any association with firemen negatively influenced the moral character of minors. Fire companies tried to stave off criticism by passing redundant laws banning "boys" from the firehouses and by attempting to keep them from running with the engines, but this legislation appears to have been as ineffective as the other midcentury laws designed to police the personal behavior of the volunteers. Volunteers worked without salary and were largely autonomous from municipal oversight. Furthermore, many firemen had themselves grown up around engine houses and believed that such exposure was ideal for training boys to become prepared firemen. One St. Louis fireman who remembered his "apprenticeship" as a member of the youth group "Slowline" No. 2, which was attached to the Union Fire Company in the 1840s, claimed that organized groups of boys within the companies "were the flower of the organization to which they belonged, and no efforts were spared to make them all that practice could do as
regarded speed, vigilance and efficiency” (Lynch 1878, n.p.). Most parents did not believe these were the best skills a boy could learn in his teenage years, and the immoral influence of the volunteer firehouse on the young was widely condemned in cities across the country. The criticism of boys in the firehouse became a key factor in the rise of paid firefighting in cities. The new paid departments that emerged at the close of the 1850s and 1860s replaced privately owned firehouses with public firehouses open to public scrutiny. The moral behavior of paid firemen, including their language, was strictly regulated, as were minimum age requirements for membership. The new paid fire department was, by design, both morally upright and child-free. The chief engineer of Cincinnati’s new paid fire department in the late 1850s provided typical praise of his new paid department when he stated in his annual report that “under the present control, the engine-houses are no longer nurseries where the youth of the city are trained up in vice, vulgarity, and debauchery” (Latta and Latta 1860, 24). The moral, politically accountable, and professional public servants that compose fire companies today are an outgrowth of these late-nineteenth-century paid fire companies, themselves formed in reaction to the “vulgar” and “debauched” volunteer companies. The contemporary relationship between boys and fire companies can therefore be seen as the result of a deliberate distancing of boys from association with firemen in the mid-nineteenth century. Parents support the childhood infatuation with the fire company secure in the fact that their sons can join those companies only upon reaching adulthood. Boys may wear fire hats today, as did Currier
and Ives' junior firemen, but parents need no longer fear that the "little fireman" might become a reality.

Amy S. Greenberg

References and further reading
Burger, Jim. 1976. In Service: A Documentary History of the Baltimore City Fire Department. Baltimore: Paradigm Books.
Cannon, Donald J. 1977. Heritage of Flames. New York: Doubleday.
Franklin Fire Company. 1856. "Minutes." Missouri Historical Society, St. Louis Volunteer Fireman Collection.
Greenberg, Amy S. 1998. Cause for Alarm: The Volunteer Fire Department in the Nineteenth-Century City. Princeton: Princeton University Press.
Hazen, Margaret Hindle, and Robert M. Hazen. 1992. Keepers of the Flame: The Role of Fire in American Culture, 1775–1925. Princeton: Princeton University Press.
Holzman, Robert S. 1956. The Romance of Firefighting. New York: Bonanza Books.
Latta, Alexander Bonner, and E. Latta. 1860. The Origin and Introduction of the Steam Fire Engine Together with the Results of the Use of Them in Cincinnati, St. Louis and Louisville, for One Year, also, Showing the Effect on Insurance Companies, etc. Cincinnati: Moore, Wilstach, Keys.
Lynch, Tom. 1878. "St. Louis: The Volunteer Fire Department, 1832–1858." National Fireman's Journal (August 3).
Fishing

The sport of fishing has often occupied a prominent place in the lives of American boys. Fishing is, in essence, a form of play, and as such, it has historically served as the means by which millions of American boys establish a lasting relationship with the world of nature. It has, furthermore, often provided the basis for camaraderie between boys and men. For many American boys, fishing trips have
been a rite of passage in which the challenges of outdoor life, the company of other males, and the thrill of catching fish contribute to the formation of a masculine identity (Sjovold 1999, 76–81). Historians have argued that as the United States moved toward a modern industrial economy over the course of the nineteenth century, American men struggled harder to keep pace with the demands of a highly competitive workplace. Artisans, mechanics, farmers, and white-collar workers labored feverishly in an attempt to master the market before it mastered them. The "market revolution" transformed the American conception of manhood during the nineteenth century by fostering the myth of "the self-made man," the individual who successfully directed all of his aggressive impulses toward the goal of social and economic advancement (Sellers 1991, 237–268). This new conception of manhood was accompanied by a new conception of boyhood. If American men were now expected to channel their energy into the serious work of market production, American boys were allowed (and perhaps even expected) to engage in activities that were not particularly serious. Over the course of the nineteenth century, boyhood came to be seen as a phase of life defined by fun and frivolity, not grim purpose (Rotundo 1993, 20). Before they entered the adult world, where men vigorously competed with one another in their vocations, boys developed their sense of self by participating in some of life's more innocent contests. Before the responsibilities of adulthood started to command the bulk of their time and energy, boys were allowed to indulge themselves in activities that delighted and amused them. And nothing seemed to delight and amuse American boys so much as the sport of fishing.
A young boy proudly shows off the fish that he just caught. (Shirley Zeiberg)
This separation between the seriousness of adulthood and the frivolity of boyhood is aptly captured in much of the angling literature published during the nineteenth century. In The American Angler's Book, for example, Thaddeus Norris, one of the nation's most popular angling writers, recalled how he and his boyhood friends used to head into the surrounding countryside to catch trout and panfish. "In my school days, a boy might have been envied, but not loved for proficiency in his studies," Norris wrote. He and his friends had far more respect and admiration for those boys "who knew the best fishing holes" (Norris 1864, 28). Another popular angling writer, Fred Mather, told his readers that as a boy, he "had no taste for anything like the harness of civilization." Like many other boys of his generation, Mather "had no further object in view than to be in the woods and on the waters" (Mather 1897, 7). Norris and Mather both describe boyhood as a time in their lives when they deliberately avoided the tedium of schoolwork and the "harness of civilization" to indulge themselves in an activity that provided them with a full measure of joy and liberty. Norris, Mather, and their boyhood friends valued play more than they valued the successful performance of those purposeful tasks that inevitably led to college, careers, and the crushing burdens of adulthood. Nineteenth-century American angling writers regarded their boyhood fishing experiences as part of an important phase in their development as individuals. Fishing excursions provided boys with an opportunity to explore the wonders of the world outside—a place where a young boy's sense of self took form as he scrambled around the rocks and trees, cooked fish over a campfire,
and slept out under an open sky. Like most other angling writers, Mather recalled his boyhood fishing excursions with much fondness, for these experiences always reminded him that there was more to life than the duties of trade and commerce. Men who skipped the pleasures of boyhood were, in Mather’s mind, a very sorry lot indeed: There are men who never could have been boys—engaged in boyish sports and had a boy’s thoughts. Every one has known such men. Men who must have been at least fifty years old when they were born—if that event ever happened to them—and have no sort of sympathy for a boy or his ways; crusty old curmudgeons who never burned their fingers with a firecracker or played hookey from school to go afishing. They may be very endurable in a business way, but are of no possible use as fishing companions (Mather 1897, 11). Boys have typically approached the sport of fishing with a more playful attitude than adults, who sometimes viewed the sport as a form of competition. During the nineteenth century, as the sport of angling became more popular with urban middle-class men, fishing excursions frequently developed into elaborate displays of masculine prowess and social prestige. For example, some men purchased expensive rods and tackle as a means of differentiating themselves from those anglers who used simpler equipment. Many nineteenth-century angling writers also distinguished between “sportsmen” and “potfishers,” a distinction that emphasized the social and cultural distance between the genteel angler who regarded fishing as a form of art and the crude
backwoodsman who fished for food. Finally, adult anglers often projected these social divisions upon the fish they caught. One historian has argued that it has always been common for people to try to find within nature a confirmation of their own social values (Thomas 1983, 61). This observation is well supported by the thousands of angling narratives published in the nineteenth-century American sporting press, in which anglers commonly distinguished between the highly desirable “game” fish and the less attractive “coarse” species. Trout, bass, and salmon, for example, were typically coveted as the noblest species, but other varieties such as carp and catfish were regarded with scorn and contempt by “gentlemen” anglers. There was, in the minds of many adult anglers, a world of difference between catching a wily trout out of a pure mountain stream and catching a carp that lazily cruised the canals and ditches of the lowlands. To catch a “smart,” “strong,” and “hard-fighting” fish such as the trout was one way in which an angler could appropriate those qualities for himself (Sjovold 1999, 75–118). This association between game fish and gentlemen anglers was cogently summarized by Charles Hallock, the publisher of Forest and Stream, who declared: “Define me a gentleman, and I will define you a game fish” (Hallock 1873, 25). American boys generally did not invest the sport of fishing with these social meanings. Before they learned how to divide the world into social and cultural categories, boys went fishing for the sole purpose of fun. Before they were socialized into the practice of conspicuous consumption, boys were content to head into the woods with nothing more than a cane pole, a simple hook, and a can of
worms. “All boys, whether born with a horn or a silver spoon in their mouths, are pretty sure to have been endowed with a fish hook,” one angler wrote in 1873. “And their first essays at angling were with pin hooks and thread tackle, with which they hurried away to pond or rivulet, in search of horn-pout, roach, eels or the beloved perch” (“Perch Fishing” 1873, 161). Boys did not particularly care whether they caught “game” or “coarse” fish. Any fish that took the hook provided some measure of amusement and for most boys amusement was the main purpose of life. Besides, boys were probably more fascinated than disgusted by the head of a horn-pout (a variety of catfish) or the shape of an eel, two species of fish that a gentleman angler would never deign to catch. Boys had not yet learned to graft class distinctions onto the sport of fishing, for they were probably too preoccupied with the great mysteries of the outside world. Did worms feel pain when they were impaled on the hook? Why did catfish have whiskers? In time, of course, boys would learn the answers to these questions, and as they did they would also learn that there was more to the sport of angling than “pin hooks and thread tackle.” They would learn that an expensive rod worked better than a cane pole, that a gentleman used English flies instead of live worms, that trout were “nobler” than carp, and that angling was an art in which some people excelled but others did not: The veteran angler, perfect in all his appointments, provided with costly Conroy rod and patent reel, and flies of cunningest contrivance, began his career as a pond fisher, solicitous only for the capture of the commonest fish
of the waters. He now only condescends to the taking of lordly salmon, and royal trout, or princely bass, yet the remembrance of the pastimes of early days, in the little brook that sparkled through the paternal meadows, or in the undimpled mill pond fringed and shaded with alders and willows . . . is still dear and fresh in his heart. He pursued the noble sport of his maturer years, with a quiet serenity, but he idolized the humbler sports of old, and enjoyed each moment with immeasured delight ("Perch Fishing" 1873, 161). It was common for many adult anglers to look back upon their boyhood fishing excursions with such fondness. Many nineteenth-century anglers remembered how, as boys, they walked through a landscape that had not yet been transformed by the forces of progress. The fields, forests, and streams of the United States provided the ore, coal, lumber, and water power required for the development of an industrial civilization. As the pace of industrialization increased over the course of the nineteenth century, the nation's lakes and streams suffered accordingly. Dams blocked the spawning paths of anadromous fish, manufacturers dumped toxic chemicals into the water, and lakes and streams silted up as a consequence of soil erosion. Anglers were among the first to notice these changes in the land. "Who that has practiced the art of angling . . . has not been led to mourn the destruction of his favorite fishing stream by the saw dust and other debris of lumbering establishments?" one angler asked in 1882 ("How Six . . ." 1882, 325). Another angler wrote that his boyhood haunts in New England had been destroyed by the "heavy feet and rough hands" of civilization; the forests
had been cut down, the streams had been dammed, and the land had been sold for building sites ("The Trout Brook" 1847, 195). One sportsman wearily concluded that "trout and progress are . . . incompatible" ("Concerning Black Bass" 1884, 241). Many anglers thus associated boyhood with a time in their lives when the woods were thick and the water was clear. Anglers often linked the innocence of boyhood with the innocence of a nation that had not yet experienced the growing pains of modernity. Although their boyhood haunts had been transformed by growth and development, it was possible (and perhaps still is) for adults to periodically reclaim the spirit of their youth with a fishing trip. The wilderness of their boyhood might have passed, but there were still many places in the countryside where weary city men could find fresh air and clear water—places where they could swim, spit, play cards, and sleep to their heart's content. And when they had had enough of those activities, they could rise from their cots, grab their rods and tackle, and walk into those places in their imagination where they kept all of their memories of what it was like to be a boy. By the end of the twentieth century, the processes of industrial development and urban growth had transformed the American landscape in ways that Fred Mather and Thaddeus Norris would not have imagined. Although it might have been possible for a boy in the 1870s to find a good creek or pond within walking distance, most boys in the United States today go fishing only when they accompany family or friends on a long car ride to the countryside, where good fishing waters may still be found. Many anglers regard the future with some degree of apprehension; they worry whether the stream they
fish with their children today will still run clear in twenty years. In many parts of the United States, these fears have taken the form of an organized movement dedicated to the protection of our nation's lakes and streams. Many members of Trout Unlimited and the Izaak Walton League, two organizations that play a significant role in preserving our watersheds, for example, started fishing when they were boys. Such organizations have worked very hard in recent decades to strengthen pollution controls, maintain water quality standards, and restore some of those lakes and streams damaged by urban and industrial development. Although Americans commonly associate the growth of such a civic spirit with the transition to adulthood, perhaps this spirit derives from another source. Perhaps this spirit to protect the woods and waters for future generations derives from a very strong desire among men to indulge in innocent diversions and play as they did when they were boys.

Carl-Petter Sjovold

See also Hunting

References and further reading
"Concerning Black Bass." 1884. American Angler 5, April 19.
Hallock, Charles. 1873. The Fishing Tourist: Angler's Guide and Reference Book. New York: Harper and Bros.
"How Six Mauch Chunkers Spent a Fourth of July." 1882. American Angler 1, May 20.
Mather, Fred. 1897. Men I Have Fished With. New York: Forest and Stream Publishing.
Norris, Thaddeus. 1864. The American Angler's Book. Philadelphia: E. H. Butler.
"Perch Fishing." 1873. American Sportsman 3, December 13.
Rotundo, E. Anthony. 1993. American Manhood: Transformations in Masculinity from the Revolution to the Modern Era. New York: Basic Books.
Sellers, Charles. 1991. The Market Revolution: Jacksonian America, 1815–1846. New York: Oxford University Press.
Sjovold, Carl-Petter. 1999. "An Angling People: Nature, Sport and Conservation in Nineteenth-Century America." Ph.D. diss., University of California at Davis.
Thomas, Keith. 1983. Man and the Natural World: Changing Attitudes in England, 1500–1800. London: Allen Lane.
"The Trout Brook." 1847. Spirit of the Times 17, June 19.
Football

Soldiers and peasants have played football, a kicking game involving an animal head or bladder, for thousands of years. By the medieval period it had become an annual ritual in Europe, with the blood spread over the fields as a symbol of fertility. Contests between married men and bachelors or between opposing villages often produced violence and mayhem that caused English authorities to call for a ban on the game by the sixteenth century. In English schools, a regulated version of the game with formal rules evolved into soccer and rugby, which allowed running with the ball. By the late nineteenth century, American schools took up the formal game, and at annual rules meetings, male student representatives gradually changed the regulations to allow more running with and, eventually, throwing of the football. The decreased emphasis on kicking the ball produced a distinct version of the game played only in the United States and Canada, in which the object is to run with or catch a thrown ball in the opponent's goal or to kick the ball through goalposts located behind the goal in order to score the most points; this version attracted great crowds of spectators in commercialized spectacles on college campuses
or urban sites. High school teams soon adopted the college rules, and by the beginning of the twentieth century, park districts, settlement houses, religious groups, and youth agencies also fielded teams. By the 1970s football had replaced baseball as the national sport in the United States, and organizers exported the game to Europe, Asia, and Mexico, though it has not replaced soccer as the primary sport in other regions of the world. Football gained some popularity in the United States by the 1820s, when Harvard students participated in intramural games on "Bloody Monday" to open the school year. By the 1840s, Yale students engaged in similar affairs, which school authorities banned in 1860. The game continued, however, among Boston schoolboys throughout the 1860s. In 1868, Princeton students adopted the London Football Association rules (for soccer) and engaged Rutgers in the first intercollegiate contest the following year, with teams fielding twenty-five players each. In 1872, Harvard began playing by the rugby rules, which allowed the ten to fifteen players per side to advance the ball by running with it. Intercollegiate contests and the rivalries such events fostered required agreement on the rules of play. The student-organized teams met annually but often failed to reach agreement, until the emergence of Walter Camp, Yale team captain, as leader of the rules committee. Known as the "father of American football," Camp dominated the governing body from 1878 to 1925. The introduction of the scrimmage line in 1880 separated the teams into offensive and defensive units and distinguished the American game from British soccer. Further elaboration of the rules, which eventually required a
Two boys in football uniforms, 1940s (Archive Photos)
team to gain 10 yards in four tries (known as downs) in order to maintain possession of the ball, further distanced the sport from its European influences. Schoolboy teams copied their collegiate counterparts, and high schools in Pennsylvania and Illinois followed the lead of Boston schools and fielded contingents in the 1870s. By the 1880s, football had spread to other areas of the Midwest, West, and South from its origins in the Northeast. Organized and administered by students, these teams challenged local and regional rivals for supremacy. In 1885, Chicago high school teams founded the Cook County Football League with five teams that competed to determine the municipal crown, and Baltimore, Boston, and Michigan high schools organized
Boys playing football, 1998 (Photodisc)
similar associations by 1888. A New York interscholastic league began in 1892. Harvard, Yale, and Princeton Universities battled for national intercollegiate honors. A brutal game often featuring massed plays and military strategy, football appealed to young men eager to assert their masculinity and to an aggressive young nation anxious to fulfill its Manifest Destiny by taking its place among world leaders. The quest for victory and prestige led teams at the high school and college levels to recruit top players and professional coaches. Students at colleges and their alumni even began subsidizing star players who entertained the best offers for their services. By the 1890s intercollegiate football had become a commercialized spectacle that brought attention and profit to colleges. National championships, held in New York, attracted tens of thousands to the Thanksgiving games and their accompanying festivals. In 1902, New York challenged Chicago to a national championship of high school teams. Chicago humiliated the visitors 105–0 with its "open" style of play that featured end runs and speed rather than the massed formations practiced in the East. A different Chicago team won the 1903 rematch in New York, 75–0. Despite the lopsided victories, regional and national challenges only increased as top high school teams traveled across the country to promote institutional and civic pride. That trend has continued throughout the twentieth century. Many boys joined less organized neighborhood, sandlot, and town teams by the 1890s, but issues of brutal play and numerous deaths brought adult intervention. The Young Men's Christian Association, settlement houses, and religious
organizations began sponsoring teams to promote their own particular values and to gain greater control over male youth. College presidents organized a governing body in 1906 that eventually became the National Collegiate Athletic Association. High school faculty members and administrators in New York and Chicago pioneered the public schools’ athletic leagues, which brought reform measures and adult supervision to interscholastic contests. Between 1906 and 1910, more innovative play and new rules that allowed for more forward passing of the ball somewhat diminished the brutal consequences of massed play; fans’ zeal and boosters’ attempts to gain a winning advantage continued unabated. Games between towns, clubs, and rival companies often elicited considerable gambling, and some teams, unsupervised by the school authorities, became semipro and eventually fully professional units. Independent athletic clubs began paying college or high school stars to play by the early 1890s. The Morgan Athletic Club, a Chicago neighborhood boys’ team, began in 1898, and along with the Dayton Triangles, which started as the St. Mary’s Athletic Association, became early members of the National Football League. To uphold religious commitment, the Knights of Columbus and Jewish settlement houses fielded boys’ football teams. Without forsaking religious values or pride in diversity, Catholic high schools gained greater assimilation in American society through their sporting relationships. Influenced by the success of Notre Dame in the 1920s, “Catholic” football took on the appearance of a religious crusade in games in which Catholic schools played against public and presumably Protestant high schools. The 1937 Prep
Bowl, which served as the Chicago city championship between winners of the Catholic and Public Schools Athletic Leagues, drew 120,000 fans to Soldier Field—the greatest number of spectators at an American football game of any kind. By the 1940s, thanks to the Catholic Youth Organization, even Catholic elementary schools were fielding teams that produced well-seasoned boy players for their high schools and colleges. By that time, municipal park districts had begun local play for youth teams organized on the basis of age and weight class divisions to minimize competitive advantages. In 1929, a national organization, the Pop Warner youth football association, took the name of an innovative and successful college coach. Operating under the same principle of age and weight class divisions, Pop Warner today offers competition for more than 300,000 football players and cheerleaders aged five to fifteen. The Pop Warner enterprise has since become an international one, encompassing Mexico, Russia, and Japan as well as thirty-eight U.S. states. In addition to sponsoring a national championship, Pop Warner recognizes academic All-Americans at the youth level by providing annual scholarships for higher education. The Pop Warner football program has spawned numerous organizations with similar emphases and goals. Among them, American Youth Football is an international coalition that promotes competition through a national ranking system, tournaments, and individual skills contests, as well as flag and touch football contests. Municipal recreation programs, park districts, and other organizing agencies throughout the United States also conduct flag and touch football competitions, which limit both the
physical contact and equipment expenses of tackle football. Both flag football and touch football are less dangerous adaptations of the tackle version of the game. Whereas tackle football requires an opponent to stop a ball carrier from advancing by bringing the runner to the ground, flag football allows an opponent to simply grab either of two flags attached to a runner's belt. Touch football requires only hand contact with a runner's body. Some smaller, mostly rural high schools or community programs unable to field eleven players for tackle football have maintained the tackle rules but play with six or eight players per squad. Since the 1970s, urban high school teams, once the mainstay of youth football in the United States, have faced increasing problems because of insufficient funding for extracurricular programs, crime in the neighborhoods surrounding schools, and rising costs. Football continues to flourish among boys in suburban schools and park districts and many rural communities. Many coaches, parents, and players extol the courage, aggressiveness, teamwork, and physicality required to play the game as building character for boys. Others see football as promoting the inherent competition, civic pride, and emphasis on winning that characterize American culture.

Gerald R. Gems

References and further reading
Gems, Gerald R. 1996. "The Prep Bowl: Football and Religious Acculturation in Chicago, 1927–1963." Journal of Sport History 23, no. 3: 284–302.
———. 2000. For Pride, Patriarchy, and Profit: Football and the Incorporation of American Cultural Values. Metuchen, NJ: Scarecrow Press.
McClellan, Keith. 1998. The Sunday Game: At the Dawn of Professional Football. Akron: University of Akron Press.
Foster Care

Foster care is a method of providing care in family homes for boys and girls whose biological parents cannot provide for them, either temporarily or permanently. Unlike apprentices or boys indentured or placed out, children in foster care are not expected to work for their keep. When foster care first began in the late nineteenth century, some foster parents took in children without charge. Since the 1930s, virtually all foster parents have been reimbursed for their expenses in caring for youngsters, partly by local and state government and, since 1961, partly by the federal government. Foster care exceeded placing out in popularity by the 1930s, and by 1958, there were more children in foster care than in orphanages. Approximately half the children in foster care are boys. Over time, child welfare agencies have grown more careful about selecting and training foster families and about checking up on children in them. Nonetheless, abuses continue, and foster care is far from being an ideal program for all impoverished or abused boys and girls. In the seventeenth and eighteenth centuries, when needy children's families could not adequately support them, local public officials commonly removed the children and indentured them to live with other families in the community. An indenture was a contract between the local government and a family that required the family to feed, house, and board a boy until age twenty-one in return
for his labor. Some needy children also ended up in almshouses (local public institutions that cared for poor people of all ages), from which they were sometimes indentured. In the nineteenth century, orphanages maintained by public officials or private agencies became the most popular method of caring for needy youngsters. Boys and girls usually stayed a year or two in an orphanage before being indentured out to work for families living nearby. Private and public agencies developed foster care programs gradually in the late nineteenth century. The New York Children's Aid Society was the first to place out youngsters with families without indentures. However, most of the families that accepted boys and girls placed out by the Children's Aid Society wanted the youngsters only for their labor. As Children's Aid Societies developed in other cities, child welfare workers became concerned that boys and girls were being too much exploited in their new family homes. Partly to end such exploitation, the Boston Children's Aid Society under the direction of Charles Birtwell began paying some families to board children in the 1880s. In Philadelphia in the 1890s, Homer Folks pushed the Children's Aid Society in that city to do the same. In Massachusetts, government officials converted the Monson public almshouse into a child welfare institution from which youngsters were to be placed out promptly to live with families. Most families were willing to take in only older boys and girls because they were able to work. In the 1880s, to ensure that all needy Massachusetts children could enter foster families, the state authorized payments to families willing to board boys and girls younger than ten who were
A father watches as his adopted son gives their sixteen-month-old foster daughter a kiss, 1994. (Stephanie Maze/Corbis)
unable to work. The expense of paying foster families to board children discouraged other states from following the lead of Massachusetts. New Jersey was the only other state to do so in the nineteenth century. Concern about abused children surfaced in the late nineteenth century and led to some youngsters being placed in foster homes. In 1874, members of the New York Society for the Prevention of Cruelty to Animals discovered that there were no laws protecting children from abuse, and so they founded the first Society for the Prevention of Cruelty to Children (SPCC). Concerned citizens in other
cities soon did the same. The founders of SPCCs were largely middle-class and very aggressive in policing lower-class families. SPCC officials defined neglect and abuse broadly. They assumed impoverished parents to be neglectful and abusive if they did not dress their children warmly enough or if they permitted youngsters to beg or peddle objects in city streets. The SPCCs encouraged family members, neighbors, friends, and employers to report neglect and physical abuse and frequently solicited the aid of police as well. When cases came to court, judges often accepted the advice of SPCC officials and appointed the SPCC as the guardian of an abused child. The agency then placed the youngster either in an orphanage or in a foster home. In 1909 the White House Conference on Dependent Children gave foster care a push. Public and private officials who attended the conference endorsed the idea that the best place for a needy boy was a family home, preferably his own. However, if parents could not properly care for a boy, the next best place was a foster family home. By 1910, 61,000 children were in foster care, but institutional care of poor children was still more common: in 1910, orphanages housed 111,514 boys and girls. Foster care grew in popularity in the 1920s, and the number of boys in institutions for children declined. States ended the practice of indenturing boys and girls in the 1920s. Nonetheless, not all children were eligible for foster care: agencies were often unwilling to place out African American children. Relatives usually took in black boys and girls whose parents had died or could not support them. Even as it became more popular, foster care was far from an ideal program. Methods of selecting foster homes were
somewhat haphazard, and once children were placed with foster families, they were not checked up on very frequently. Agencies counted on foster parents to report on the children and sometimes on neighbors, teachers, and pastors to notify the child welfare agency about how foster children were being treated. Agents went out to check on youngsters only once or twice a year. Siblings often ended up in different foster homes. Agencies frequently moved foster boys and girls from home to home as officials sought to find a family that was willing to care for the child for free. During the Great Depression, thousands of children became homeless as their families proved unable to support them adequately. In 1933, 120,000 boys and girls needed foster care. Yet in the 1930s, public and private child welfare agencies, both those that institutionalized children and those that placed them in foster care, suffered from a lack of funds. Fewer Americans could afford to pay taxes or donate money to private charities. Moreover, in tough economic times, few families were willing to take in foster children for free, although they were quite willing to take children for a fee. Child welfare agencies struggled; in 1931, they were able to find enough money so that 74 percent of foster children were in homes. In the 1930s, agencies removed youngsters to foster homes largely because of poverty. Interest in taking boys and girls from their biological parents due to neglect and abuse declined during the Depression and World War II. Officials believed that most families had so many economic problems to deal with that there was no point in bothering them about abuse. In addition, financial stringency meant that agencies hired fewer child social workers to look into the issue of child abuse.
In 1935, Congress passed the Social Security Act. Although it did not then provide any money for foster care, the law created Aid to Dependent Children (later renamed Aid to Families with Dependent Children, or AFDC), which extended monetary grants chiefly to single mothers to help them keep their children at home rather than place the boys and girls with foster care agencies or orphanages. Placements of children in foster care declined from 59 for every 10,000 youngsters in 1933 to 38 for every 10,000 in 1960. Nonetheless, foster care became the most accepted method of caring for needy children outside their natural homes by 1958, when orphanages had so declined in number that more children were in foster care than in institutions. In the 1960s there was a renewed interest in foster care. The prosperity of the decade led many Americans to believe the nation could conquer poverty. After campaigning in impoverished Appalachia in 1960, newly elected president John F. Kennedy took a particular interest in ending poverty, and his administration supported antipoverty services. In 1961 Congress authorized AFDC funds to be used for fourteen months to assist boys and girls removed from “unsuitable homes” to foster care. Then in 1962, Congress prolonged foster care payments indefinitely and authorized federal funding for counseling and other service programs to help biological families keep their sons and daughters at home. The number of children in foster care rose substantially in the 1960s and 1970s, partly as a result of the new federal payments for foster care. But other factors contributed to the increase as well. There were fewer alternatives to foster care because orphanages had all but disappeared. More births to single mothers who found
it difficult to care for their youngsters and a greater willingness of child welfare workers to find foster homes for minority children also brought more boys and girls into the system. Finally, the development of pediatric radiology, which allowed doctors to detect signs of physical abuse more accurately, and new theories about the battered child syndrome resulted in huge public concern about child abuse, tough new state laws on abuse, and the removal of many children to foster homes because of abuse. By this time foster care programs had been improved: foster parents were carefully checked on and licensed, and they received fixed monthly fees for the expenses involved in raising foster children. Child care agencies also worked to place siblings together, to visit boys and girls frequently while they were under care, and to try to keep boys and girls in foster homes only until their biological parents could again support them. Ironically, government began to invest more in foster families than in helping needy parents support their own youngsters. Government payments to foster families were larger than AFDC payments to impoverished mothers and fathers, and states spent more on foster care than on preventive programs that helped biological families stay together. Nonetheless, foster care had not become, nor would it ever be, a panacea for needy boys and girls. Although one-half of foster children returned to their own homes, the other half never did. Older youngsters were particularly likely to be permanently separated from their biological families and to be in foster care for many years. Despite the best efforts of social workers, some foster families were abusive. And although African American children were no longer
discriminated against, minority boys and girls stayed longer in foster care and were less likely to be returned to their biological parents than were white children. Finally, Native American boys and girls were especially likely to be removed from their natural families and placed in foster care. Between 1969 and 1974, almost 35 percent of Native American children were living with foster or adoptive families, most of whom were not Indians. In 1969 in sixteen states, almost 85 percent of Indian children in foster care were living in non-Indian families. Some efforts were made in the 1970s and 1980s to correct the inequities in foster care. Child welfare agencies began to adopt formal kinship care (which meant paying relatives, usually low-income grandmothers, to care for children), especially for African American boys and girls. Of course, blacks had long relied on extended family to care for needy youngsters, but now agencies recognized that relatives deserved payment for their parenting. Moreover, because black youngsters were so likely to stay in foster care for years and so unlikely to live again with their biological parents, kinship care made foster care more palatable by enabling grandparents, aunts, and uncles to take black boys and girls into their homes. The Indian Child Welfare Act of 1978 discouraged removal of Native American boys and girls to non-Indian families. Tribal governments had control over Native American children on reservations and shared control with states for Indian children living off reservations. The goal was to keep Indian youngsters within their tribe and culture, so that both could continue to thrive. Finally, in 1980, Congress passed a law called the Adoption Assistance and Child Welfare Act, which amended the Social Security
Act and was intended to discourage extended stays in foster homes and to encourage boys and girls to return to their biological families or enter permanent adoptive homes. The federal government promised more money to the states to help them devise “permanency plans” for each child entering foster care, plans that called for youngsters to remain in foster homes a short time. “Permanency plans” were to be reviewed and possibly revised every few months. In the early 1980s, the number of boys and girls in foster care dropped but then resumed its upward spiral. The 1980 legislation was not as effective as it might have been because the Reagan administration did not provide full funding for it. In addition, abuse of illegal drugs and alcohol, along with the spread of human immunodeficiency virus (HIV), disrupted families, caused physical problems among infants, and led to an expanded need for foster care that showed no signs of diminishing.
Priscilla Ferguson Clement
See also Abuse; Adoption; Indentured Servants; Orphanages; Placing Out
References and further reading
Ashby, LeRoy. 1997. Endangered Children: Dependency, Neglect, and Abuse in American History. New York: Twayne Publishers.
Blacher, Jan. 1994. When There’s No Place Like Home: Options for Children Living Apart from Their Natural Families. Baltimore: P. H. Brookes Publishers.
Child Welfare League of America. 1994. Kinship Care: A Natural Bridge. Washington, DC: Child Welfare League of America.
Gittens, Joan. 1994. Poor Relations: The Children of the State in Illinois, 1818–1990. Urbana: University of Illinois Press.
McKelvey, Carole A., and JoEllen Stevens. 1994. Adoption Crisis: The Truth Behind Adoption and Foster Care. Golden, CO: Fulcrum Publishing.
4-H in the Midwest
A youth program administered by the U.S. Department of Agriculture (USDA) Cooperative Extension Service and primarily serving rural youth, 4-H has existed for almost a century. First known as boys’ agricultural clubs (along with girls’ agricultural clubs), the organization widely adopted the name 4-H in the 1920s. Throughout most of the twentieth century, 4-H has been an important social and educational institution for American boys. In the rural Midwest during the first half of the twentieth century, 4-H clubs offered one of the few out-of-school organized activities for young people. Today, 4-H continues to be a major program for midwestern young people: since the early 1960s, boys’ and girls’ 4-H have been integrated, and 4-H clubs have also been established in urban areas. The 4-H colors are green and white, and the club’s emblem is a green four-leaf clover with a white “H” on each leaf. The four letters stand for head, heart, hands, and health. The club’s motto is “To make the best better.” Each club had and continues to have considerable autonomy but is supervised and promoted by both county and state Cooperative Extension officials. Today, 4-H activities are open to all young people everywhere from the ages of nine to nineteen. The creation of boys’ and girls’ agricultural clubs grew out of a widespread concern about the inadequacies of rural life. Long before 1900, farm families had been leaving the countryside because towns and cities offered greater economic and social opportunities, and that exodus continued after 1900, raising fears about a depleted countryside incapable of feeding a
Guthrie County, Iowa; a boy named Burkhart with his 4-H livestock project, ca. 1920 (Iowa State University Library/Special Collections Department)
nation and about the loss of an independent class of farm people. In 1908, President Theodore Roosevelt, himself concerned about the “flight to the cities,” formed the Country Life Commission to determine ways that farm life could be more attractive and more profitable, thus keeping more people on the land. Reporting in 1909, the commission proposed numerous changes, including the formation of an extension service that would bring scientific and technological information to farm families and would also work to meet other agricultural, domestic, and social needs of rural residents. At the same time, school administrators and farm leaders across the country were also responding to problems of rural life by setting up boys’ and girls’ agricultural clubs to help improve farming practices, provide farm youngsters with social opportunities, and create in farm youth a sense of pride in their work. Those orga-
nizations, the forerunners of today’s boys’ 4-H, began in several midwestern states, including Ohio, Illinois, Indiana, and Iowa, in the early 1900s. In Ohio, School Superintendent Albert B. Graham asked young boys to test the soil on their farms and to conduct experiments with seed corn. In Illinois, Farm Institute president William B. Otwell created interest among boys by offering a $1 prize to boys who grew the best yield of corn; before long, Otwell’s corn-growing contests had attracted 1,500 boys. Later, farm implement manufacturers began offering premiums on plows and cultivators to winners of corn-growing contests. Soon school administrators in other midwestern states began sponsoring corn-growing contests. In Indiana, J. F. Haines initiated such contests, as did Cap C. Miller in Iowa. Also in Iowa, Page County superintendent of schools Jessie Field, believing that farm children needed to develop pride in agri-
culture, organized both girls’ and boys’ clubs, with the latter carrying out experimental projects on seed selection, milk testing, and road improvement. By 1907, the basic principles of what would later become 4-H had been tried and successfully tested. Organizers had shown that farm youth responded to clubs that introduced them to agricultural science and technology and that young people were motivated by incentives. In 1914, the forerunner to 4-H clubs would find a permanent home. In that year Congress passed the Smith-Lever Act, which created the Cooperative Extension Service as part of the USDA. Previously, states like Iowa had set up state extension services. The Smith-Lever Act created a partnership between the state and federal agencies. Each state college accepting federal funds had to set up an administrative division to supervise extension work in agriculture and home economics. Within these two fields, specialists in subject matter such as agronomy and human nutrition worked at the state level, but county-level personnel formed the heart of the program, interacting directly with the farm population. Accordingly, each county would hire a male county agent to work with local farmers and to pass along new findings from the state’s agricultural college. Female home demonstration agents in the county had the responsibility to work with farm women in the fields of nutrition, child care, clothing, and home management skills. As a part of their responsibilities, extension personnel at the state and local levels developed and supervised boys’ and girls’ club work in agriculture and home economics. 4-H remains part of the USDA today. A major part of the initial motivation to form both boys’ and girls’ agricultural
clubs was the belief that to bring improvements to the farming sector, officials had to direct their programs for change at young people as well as at their parents. For members of boys’ clubs especially, the result was that the projects mirrored their fathers’ work. For example, a major lament of extension personnel from the service’s founding through the 1920s was that farmers did not keep records. Accordingly, in 1921, Iowa extension officials established a farm record club for farm youths. Officials assigned an extension specialist to work with the young people, who then kept track of the business records for the family farm. Officials also assisted the youngsters in making an analysis of the records at the end of the year. As part of an additional effort to train future farmers, extension personnel helped introduce simplified farm accounts in the seventh and eighth grades as part of arithmetic assignments. Both boys and girls had opportunities to take part in 4-H, but the number of boys’ clubs lagged behind that of girls’ clubs. A vital part of any club was the volunteer leader. Each 4-H club depended on a mother, father, or some community person to serve as leader. Typically, men served as boys’ club leaders and women as leaders for girls’ clubs. Traditionally, women were more willing to give of their time in this capacity, with the result that girls’ clubs were more numerous than boys’ clubs. Fathers often believed they were simply too busy with farmwork to take time for either a leaders’ meeting or the 4-H meeting itself. The effort to improve farming and farm living by training future generations of farmers was carried out mainly through local clubs. By the early 1920s, a minority of counties had professional club leaders who handled most of the youth activities.
If there was no club leader, then the county agent and the county home demonstration agent divided responsibilities for the county’s 4-H clubs. Boys’ and girls’ clubs shared some activities, but basically the two groups had separate meetings and separate types of projects. This arrangement would prevail into the early 1960s, after which time one state 4-H organization served both boys and girls. A major part of each 4-H member’s activity was a yearly project. For boys, it was usually related to agriculture, especially livestock. In Iowa in 1921, boys’ clubs included a Corn Club, Baby Beef Club, Purebred Calf Club, Dairy Calf Club, Market Pig Club, and Sow and Litter Club. Given extension officials’ belief that a major way to improve farming was to train 4-H members in proper farming techniques, boys’ clubs typically focused on projects with economic value. Taking part in the Market Pig Club clearly had economic ramifications for members, and sometimes a state director actually put a price tag on the projects carried out by 4-H members. In effect, boys were being trained to be good farmers, and in keeping with that tradition, they typically worked individually rather than in groups. Moreover, by carrying out a yearlong project—whether it involved raising corn or a purebred calf—youngsters were also learning responsibility. By contrast, girls’ clubs had sewing and cooking projects but also had programs on citizenship, music, and art, areas typically missing from the boys’ club agendas. From the beginning, 4-H work included activities outside the local club meetings. A part of these activities was the submission of projects for judging at the county fair. Winners at the county level then entered their projects at the state fair. Often 4-H members presented
demonstrations in connection with projects or showed livestock, providing members with experience in public speaking and making appearances before large audiences. Carrying out yearlong projects also meant learning new skills, gaining valuable information about selected projects, and assuming responsibility for long-term commitments. 4-H members attended state cattle meetings and the National Dairy Show, and boys participated in judging contests, usually of livestock projects. Members typically attended a week of activities at the state land-grant universities each year, and clubs also selected members to attend the National 4-H Congress in Chicago. In many states, 4-H officials created camps where members could enjoy recreational outdoor activities and at the same time develop citizenship and leadership skills. An international 4-H program was created after World War II, in which exchange programs with rural families in other countries might enable a Minnesota youth to live with a German farm family and then bring a German youth to live with a Minnesota farm family. In this way, 4-H members came to know about and appreciate foreign cultures. Although boys’ 4-H clubs remained basically the same from the 1930s through the 1950s, important changes took place in the 1960s. Of major importance was that 4-H shifted from mostly project-oriented education to developmental education and experiences. The latter area has come to include the study of personal development and leadership, citizenship and civic education, environmental issues, science and technology, career development, and diversity. Iowa extension officials described the change as one in which the 4-H leaders came to be more interested in
the boy who held the rope than in the animal at the other end. Along with this change, members gained greater freedom to select their own activities. From the 1920s on, state and local leaders had created lists of acceptable projects accompanied by appropriate lists of literature, but in the 1960s members began to select their own projects, even if they were not on an approved list. The organization also restructured itself. Originally, boys’ and girls’ club work had been separate, but in the 1960s gender lines began to blur, and state 4-H officials combined clubs for the two sexes. Another development was the creation of a new constituency: urban members. Activity in that area had actually started in the 1950s, when extension officials created a program for urban youth in Chicago. In 1961 the Chicago Board of Education asked 4-H to organize clubs in the city’s public housing projects, where school dropout rates were high. Sixteen clubs were organized in the housing projects, and soon urban 4-H programs spread outside the Midwest. Urban 4-H work in Iowa includes both 4-H community clubs and 4-H special projects. Community club members in urban areas select the same type of projects or activities as do 4-H members in rural areas. With regard to special projects, the goal is somewhat different: 4-H personnel go into the public schools to present programs on subjects such as health, science, and nutrition. In the 1960s, 4-H clubs also began to work with the Expanded Food and Nutrition Education Program (EFNEP), which provided federal funds to develop nutritional programs for low-income Americans. EFNEP originated with extension home economists, but a 1969 federal appropriation made it possible to carry out some EFNEP activity through 4-H pro-
grams in depressed urban areas. 4-H members and personnel soon helped develop nutrition programs for young people, including games that emphasized nutrition information and nutrition day camps. In the last three decades of the twentieth century, 4-H worked to redefine itself for each new generation, but one aspect of the organization has not changed: volunteers remain crucial to the program. Paid aides and paraprofessionals are used in certain situations, but generally volunteers continue to serve as the backbone of the organization. In 1976, Cooperative Extension published a study that both documented change and laid out goals for the future. The publication, 4-H in Century III, noted that 4-H membership had stabilized and indicated that the organization would have to reach out to new constituencies and do a better job of publicizing itself. The publication also made clear that 4-H had developed new programs in the 1970s. Rather than listing the individual projects that had been so central to boys’ 4-H, the publication dealt with boys’ learning experiences. Instead of baby beef and market pig projects, the publication listed programs in areas such as economics, jobs and career exploration, environmental and natural resources, citizenship, education and community development, and creative and performing arts. Today 4-H remains one of the most successful parts of Cooperative Extension work. The organization continues to change as it redefines itself for each new generation of youth. In doing so, 4-H now serves urban as well as rural youth, and in contrast to earlier times, 4-H today includes African American, Native American, Latino, and Asian members. In the
Midwest, particularly in rural areas, 4-H is still a major program for youth, and the majority of members live on farms and in towns with populations under 10,000. The organization continues to depend on volunteer leaders. Young people have a wide array of programs and activities from which to choose, as well as the freedom to construct individual activities. At present, 4-H’s alumni total about 45 million. Today, although the membership base has broadened to include both rural and urban youths and activities have largely replaced projects, the goal remains to help young people develop the skills, knowledge, and attitudes that will make them productive, contributing members of society.
Dorothy A. Schwieder
See also Clubs; Farm Boys
References and further reading
Rasmussen, Wayne D. 1989. Taking the University to the People: Seventy-Five Years of Cooperative Extension. Ames: Iowa State University Press.
Reck, Franklin M. 1951. The 4-H Story. Ames: Iowa State College Press.
Schwieder, Dorothy. 1993. 75 Years of Service: Cooperative Extension in Iowa. Ames: Iowa State University Press.
Wessel, Thomas, and Marilyn Wessel. 1982. 4-H: An American Idea 1900–1980. Chevy Chase, MD: National 4-H Council.
Franklin, Benjamin
The model for success in the eighteenth and nineteenth centuries, Franklin was renowned not only as a diligent artisan, philanthropic citizen, and wily diplomat of the American Revolution but also as the author of his Autobiography, which became a manual of self-education and personal independence for ambitious boys. Although he began as an appren-
tice, Franklin both worked hard and mastered the appearance of doing so, attracting the attention of influential men who could help him. As editor of the Pennsylvania Gazette, he sponsored journalism throughout the colonies and persuasively promoted his favorite causes. Publication of Poor Richard’s Almanack not only made his fortune but also provided practical instruction for ordinary colonials who had little other material to read. And his founding of an early subscription library and clubs to promote enlightened conversation helped other young working men and boys educate themselves. In 1771, when he began his Autobiography for the edification of his son, and after the American Revolution in 1784, when he more consciously fashioned it for future citizens, Franklin presented his own life as an example of the diligence and hard-won virtue that would bring due rewards. Benjamin Franklin was born in Boston in 1706, the youngest son of his father’s seventeen children. Josiah Franklin had emigrated from England with his first wife, Anne, and their three young children in 1683. Unable to support himself in the colonies by his trade of dyeing cloth, he became a candle maker and soap boiler. After bearing three more children, Anne Franklin died in childbirth, and within a year, Josiah married young Abiah Folger of Nantucket, who would bear ten more children, including Benjamin. Franklin grew up in this large artisan family. Later in his life he could remember when thirteen children, all of whom would survive to adulthood, sat at his father’s table. Josiah made a point to include in family dinner conversations sensible friends or neighbors who would introduce topics to improve the minds of his children. Ben was an alert boy who learned to read early and loved books, and
Franklin selling his ballads on the streets of Boston (Library of Congress)
his father sent him at the age of eight to Boston’s grammar school with the hope that he would become a minister. The boy thrived at the Latin school, rising to the head of his class and proceeding to the next above it; the following year he would have been ready to enter the third class. But his artisan father feared that after the expense of a college education, Ben would be unable to earn a living. After less than a year, Josiah took Ben out of the grammar school, which would at least have enhanced his status, and placed him with a teacher of writing and arithmetic. Although he excelled at writing,
the boy was hopeless in arithmetic, and at the age of ten he joined his father’s business as an apprentice, where he cut wick, filled the dipping molds for candles, attended the shop, and ran errands. Unhappy in his father’s shop, Ben threatened to go to sea. A leader among other boys, he was adept at managing a boat in the waters and marshes surrounding Boston. He also was a strong swimmer and studied motions and positions to teach himself strokes. At one point, hoping to improve his speed, he made oval pallets to hold in his hands and strike the water and a kind of sandal for the soles of
his feet but soon abandoned these devices because they made him tired. Another time, after enlisting a friend to take his clothes to the opposite shore, he managed to pull himself across the water hanging on to the string of a kite. He also led friends into scrapes and later recalled building a wharf for fishing on a salt marsh with stones intended for a neighbor’s house. He was sternly corrected by his father, however, when workmen discovered the missing stones. When Ben was twelve, Josiah concluded that the lively boy should be settled in a trade that would hold his interest, and took him to observe workmen building houses or laying bricks. One option was placement with a cousin, who was a cutler and made knives. But Ben’s relish for reading finally determined the choice of a trade, and his father persuaded him to sign an indenture binding him to his brother James until he reached the age of twenty-one and had learned to be a printer. Ben learned the trade quickly and welcomed his new access to books. Not only did his brother, who had served an apprenticeship in London, stock the most current imported books and periodicals, but Ben also made friends with booksellers’ apprentices, who provided him books on overnight loan. His brother, who had the Franklin propensity for turning a profit, encouraged the twelve-year-old to write ballads that could be sold on the street. Ben wrote The Lighthouse Tragedy, which related the dramatic drowning of a lighthouse keeper and his wife and daughter, and A Sailor’s Song on the Taking of Teach or Blackbeard the Pirate. Both sold well, delighting the boy until his father informed him that poets generally were beggars. Ben then determined to learn to write good prose and began by setting out his side of the argu-
ments he engaged in with a friend. This time his father encouraged him, urging him to develop elegance of style. For this Ben turned to the Spectator, a London periodical of 1711 put out by Joseph Addison, who, renowned for his ease of manner, eloquence, and wit, popularized the cosmopolitan culture of clubs and coffeehouses. Ben read Addison’s essays, then set them aside and rewrote them in his own words, comparing his work to the original and correcting his faults. To improve his vocabulary, he turned essays into verse and back to prose. Laying them aside, he jumbled up his notes and then put the essays back in order, trying to learn to arrange his thoughts. While apprenticed to his brother, Ben studied when he could, catching moments before and after work and on Sundays, when he skipped church. By the age of sixteen, he was boarding away from home and became a vegetarian, using money he saved on food to buy books and time he saved by eating only a light lunch for further study. At this time he finally mastered arithmetic and boned up on navigation. He also read John Locke’s 1690 work, An Essay Concerning Human Understanding, which only recently had reached the colonies, and studied rhetoric, practicing the Socratic method in conversations with his friends. In 1721 James Franklin began to print a newspaper, the New England Courant, modeled on the Spectator and containing essays written by his friends, “the Honest Wags,” who sought to pique the dominant New England clergy. Sixteen-year-old Ben had taught himself to write well enough to be eager to try his hand at a contribution. Assuming the persona of a young widow, Silence Dogood, he slipped under the shop’s door anonymous letters that combined the style of the Spectator
with the leather apron satire of his brother’s friends. Ben’s authorship was never discovered by his brother, who printed fourteen of Silence’s letters, arguing in favor of female education and ridiculing sons of the wealthy who were able to attend Harvard College but gained little from the experience. When James’s criticism of the Massachusetts Assembly provoked its censure and he was briefly imprisoned, management of the paper fell to Ben. On James’s release, the assembly insisted that the paper not be continued under his name, and the Honest Wags decided to continue publishing under the name of Benjamin Franklin. The agreement was made that Ben would be publicly released from his indenture, although secretly a new contract was drawn up, still binding him until the age of twenty-one. But seventeen-year-old Ben chafed under this extended apprenticeship to James, who was prone to assert his shaky authority with an occasional beating. Secretly he planned to steal away on a ship to New York, where he hoped to find work in his trade. Thus Franklin, the quintessential runaway apprentice and self-made man, recorded the story of gaining his personal independence. William Bradford, a printer in New York, could not give the promising boy work but recommended that Ben move on to Philadelphia, where Andrew Bradford printed a newspaper, the American Weekly Mercury. On Ben’s sea journey his small craft rocked in an October gale. Landing on the New Jersey shore, he discovered that he would have to walk in the rain 50 miles to Burlington in order to catch another boat to Philadelphia. When that boat stalled due to the lack of wind, Ben helped to row, arriving at his destination wet, dirty, and extremely hungry. With only a Dutch dollar and copper
shilling to his name, for three pennies he bought three large puffy rolls, which he carried under each arm, walking for the first time the streets of his adopted city. Joining a crowd of people, he entered the Quaker meeting house, where during the silent meeting, he promptly fell asleep. Although Andrew Bradford did not have work for him, Ben was taken on by a rival printer, Samuel Keimer, who welcomed his skills. As a promising young artisan, Ben attracted the attention of William Keith, governor of the province, who promised to set him up in business should his father agree. A few months later, eighteen-year-old Ben took a ship to Boston and was greeted glumly by his brother but fondly by his father, who nevertheless felt him too young and irresponsible to have a shop of his own. Josiah promised, however, that Ben could return to Philadelphia, and if he behaved respectfully and worked industriously, he would set him up in business when he reached the age of twenty-one. On Ben’s return to Philadelphia, he continued to work for Keimer and attract the attention of distinguished gentlemen, who enjoyed his conversation based on the wide extent of his reading. William Keith still hoped for a printer who would take his side in his battles with Pennsylvania’s proprietors and urged Ben to travel to England to purchase a press and printing supplies on the governor’s letter of credit. Not until the youth arrived in London did he fully realize that Keith was liberal with such promises and had no credit. But the eighteen months Ben spent there served to continue his apprenticeship. Easily finding work, he read widely and attended the theater for the first time, steeping himself in the cosmopolitan coffeehouse culture he had only glimpsed at the New England Courant.
Forming friendships with other artisans, he delighted them with his antics and skill while swimming in the Thames River and briefly considered staying in London to open a swimming school. But once again, this bright and personable youth attracted the attention of a helpful mentor, and Thomas Denham, a Quaker merchant and shipboard acquaintance, invited him to use his skills in writing and arithmetic as a clerk in his mercantile business. Ben returned to Philadelphia with Denham and began work at his Philadelphia store. Very likely, he would have become a merchant had not he and Denham taken ill, resulting in Ben’s slow recovery but the Quaker’s death. Franklin returned to his trade and found work once more with Samuel Keimer. In 1726, Benjamin Franklin was twenty, working for a printer whose skills he considered inferior to his own. Within three years he had gained the confidence of the colonial government by printing public projects, weakened his employer’s newspaper by writing witty essays for that of a rival, and received the support he needed to set up his own shop, from which he began to publish the Pennsylvania Gazette in 1729. With other artisans he formed the Junto, a club of mutual improvement, in 1727. Soon he married Deborah Read, an artisan’s daughter of frugal and industrious habits, who was willing to live plainly and help him with his business. He also engaged in his first public project, founding a subscription library. Having shed his Presbyterian background, he abandoned the deism he had flirted with as a youth and settled into faith in a divine being served by good works, although he still kept Sunday as his studying day. Later in his life, he self-consciously recorded how he worked carefully at this time to acquire
virtue. Listing the moral virtues he had encountered in his reading, he worked on one each week until it became a habit and he could go on to the next. In a red-ruled ledger he recorded a black speck for each lapse in each virtue; his goal was to keep each line clear of spots. Franklin had early acquired habits of temperance in food and drink, but he found order very difficult and probably never succeeded in gaining humility. But, he concluded with good humor, although the reality of virtue may have eluded him, at least he had achieved the appearance of it. And the industry and frugality he worked so hard to display eventually did bring him the success he so assiduously sought.
Jacqueline S. Reinier
See also Apprenticeship
References and further reading
Clark, Ronald W. 1983. Benjamin Franklin: A Biography. New York: Random House.
Franklin, Benjamin. 1959. Autobiography and Selected Writings. New York: Holt, Rinehart and Winston.
Wright, Esmond, ed. 1989. Benjamin Franklin: His Life as He Wrote It. Cambridge, MA: Harvard University Press.
Fraternities
A fraternity is an organization or society for young men or women often found on college campuses in the United States and Canada. These organizations contribute to the social, communal, professional, or honorary aspects of student life at a college or university. The term fraternity comes from the Latin word frater, meaning “brother.” Women’s fraternities are called sororities, a term derived from the Latin word soror, meaning “sister.” These terms are representative of the
familial bond that is formed between members of these organizations. Because most fraternities, such as Sigma Phi Epsilon, take their names from Greek letters, fraternities are sometimes referred to as Greek letter organizations, and members of fraternities are sometimes called Greeks. The use of these letters is traditional, dating back to 1776 and the founding of what is considered the first American fraternity, Phi Beta Kappa. These letters can take on a deeper significance for members, since they often represent a hidden meaning such as the founding values of the organization. Phi Beta Kappa, for example, stands for the motto of the fraternity, “Love of wisdom, the guide of life.” Some other fraternal values include honor, diligence, service, scholarship, integrity, loyalty, and respect. Fraternities typically have other symbols that take on significant meanings for members. These symbols can include pins or badges, crests or coats of arms, secret handshakes or signals, mottoes, creeds, and ritual ceremonies. Fraternities have evolved into several classification types. Professional fraternities select members from students engaged in a particular field of study, such as engineering or education. They are generally coeducational organizations that focus on providing their members with structured social, academic, or networking opportunities to further themselves in their professions. The first professional fraternity was the Kappa Lambda Society, founded at Transylvania University in Lexington, Kentucky, in 1819 by medical students. Honor societies recognize a student’s academic, leadership, or service accomplishments. Some of these organizations coordinate events for their members, the university, or the surrounding community. Phi Beta
Kappa started as a general fraternity but officially became an honor society recognizing outstanding academic achievement in arts and sciences in the late 1890s. Another honor society, Mortar Board, recognizes outstanding service, scholarship, and leadership among college seniors. General or social fraternities are arguably the most secretive, most identifiable, and most prevalent on college campuses. The emphasis of a general fraternity varies from group to group or campus to campus, but these groups tend to be values-based organizations. Some are local organizations existing solely at one campus, but many are tied to a national organization with chapters in various colleges and universities across the country. General fraternities and sororities often take on a family atmosphere. Many of them have houses where members live together like a family, often with house mothers. Members eat meals together, share rooms, and live in an environment of companionship, camaraderie, and friendliness. They frequently call themselves brothers or sisters. For young men who are away from their families for the first time, there is a strong appeal to join a general fraternity. General fraternities try to provide their members with a supportive environment that helps them adapt to the college lifestyle. They also offer opportunities for members to grow and develop their talents and leadership skills. General fraternities are also very popular because of their single-sex composition and their ability to help young men and women mature while still retaining a connection to their youth. These organizations provide countless opportunities for members to explore their independence, establish their identity, define and push their boundaries, and come to grips with their emotions,
and they do so mostly in an all-male or all-female environment. They also provide a link to boyhood or girlhood through competitions; an emphasis on games, pranks, and playfulness; and social activities. Indeed, sometimes these groups are accused of placing too much emphasis on the juvenile aspects and not enough on the maturing, supportive aspects of the organization. Since the founding of Phi Beta Kappa at the College of William and Mary, general fraternities have gone through many cycles of prosperity and difficulty. The early years were tentative because colleges and universities often thwarted efforts of students to create fraternities. In early America, colleges could be found only in a few places on the East Coast and in the Midwest. Students at these schools were monitored from their waking hours until bedtime, and their activities were restricted mainly to studies, nourishment, prayer, and sleep. Fraternities attempted to liven up the rigid life of students by providing a social forum for the discussion of issues and ideas not taught in class. Although the social and intellectual outlet grew popular with students, the faculty did not always look favorably upon such behavior. Members often had to hold meetings, recruit members, and conduct other activities covertly, possibly spurring the secretive nature of fraternities today. The Civil War also hurt the fraternity effort because many college-age students went to battle. After the war, fraternities expanded and prospered as colleges and universities began to change. Colleges became more popular, more diverse, and more accessible. The Morrill Land-Grant Act of 1862 helped states create and develop a number of colleges and universities, which taught a wider variety of sub-
jects and became more accessible to the general population. As the number and size of universities grew, so too did fraternities. Existing fraternities expanded their membership to campuses all over the country, and new fraternities emerged and followed suit. Although the college population was becoming more diverse, the makeup of many fraternities was not necessarily so. They rarely strayed from their original membership of white, Christian males. Students who were unable to join or who wanted to create a different experience often founded their own organizations. The first recognized sisterhood can be traced back to the Adelphean Society (later known as Alpha Delta Pi), founded in 1851 at Wesleyan Female College. Sororities became more popular as the number of women admitted to colleges began increasing during this period. In 1906, at Cornell University, Alpha Phi Alpha was the first general fraternity founded for black college men. Two years later, the first general sorority for black college women, Alpha Kappa Alpha, was founded at Howard University in Washington, D.C. General fraternities founded by Jewish members and based on Jewish beliefs or ethics, such as Zeta Beta Tau (1903, City College of New York), Sigma Alpha Mu (1909, City College of New York), and Alpha Epsilon Pi (1913, New York University), were also established at this time. Men’s and women’s general fraternities continue to exist as single-sex organizations, although they have formally eliminated exclusionary practices with respect to race or sect. The traditions, values, and beliefs that would predominantly attract a person of a particular faith or culture still exist, however. The first half of the twentieth century witnessed as many ups and downs for fra-
ternities as the latter half of the nineteenth century, starting during World War I and continuing through the Great Depression and World War II. At the end of World War II, the GI Bill, or Servicemen’s Readjustment Act of 1944, provided college funds for those who had been in the armed forces. These veterans took advantage of the opportunity, and the population of colleges and universities grew. Fraternity growth was also remarkable because these organizations provided an atmosphere of camaraderie similar to that GIs remembered from the armed forces. The good times continued until the late 1960s, when student unrest mushroomed. Since many college students were seeking freedom from institutional structure and “the establishment,” fraternities suffered devastating losses in membership as well as decreases in numbers of chapters throughout the 1970s. General fraternities seemed to lose their focus on values and concentrated on the social aspect of campus life as their selling point. Movies such as Animal House (1978) popularized the stereotype of fraternities at this time as wild, out-of-control organizations. The 1980s and early 1990s marked another period of rapid growth for fraternities as college and university enrollments increased. National and international fraternities as well as universities emphasized educational programs and adopted strict regulations on alcohol use and hazing in an attempt to refocus the fraternity movement on its founding values. Membership in fraternities has remained relatively stable since that time, although slight increases and decreases are common from year to year. A development of note in recent years is the increase in formation of Asian, Latino and
Latina, and Native American local and national fraternities and sororities. Members of general fraternities are usually recruited in their first few years in college, although some fraternities recruit upperclassmen or even have large graduate chapters comprising men who are out of college or never went to college but are contributing members of the community. Some fraternities hold recruitment periods at the beginning of an academic semester. These periods have traditionally been called “rush,” when interested students would rush to fraternity chapters to meet the members of those organizations. Once a match is made between a fraternity and an interested student, that student becomes affiliated with the organization. Many fraternities have a set amount of time for the new members (sometimes called pledges or associate members) to learn more about the fraternity, participate in its functions and activities, and get to know its members. At the end of this period, if the new member has met the requirements of the organization, he generally goes through some type of ritual ceremony signifying his initiation into the fraternity. This period varies in length from group to group, and some groups have done away with it altogether. When a young man is initiated into a general fraternity, he typically cannot choose to become a member of another general fraternity. He takes an oath to become a member of that particular organization for life, which is one characteristic that separates these organizations from many others. When an undergraduate member of a fraternity graduates from college, he becomes an alumnus of that organization. Many alumni members help their undergraduate chapters or international organizations by donating
money, talent, time, and advice. Studies also show that alumni members of fraternities tend to be quite loyal and generous to their alma maters as well. The connections between alumni fraternity members are often as strong as when they were in college. Many members form bonds with each other in college that last a lifetime, and it is not uncommon to see fraternity brothers at each other’s special occasions decades after their college years have passed. Despite their popularity among some students, general fraternities have many critics who point to hazing, the misuse of alcohol, and the departure from founding values as reasons for their abolition. Hazing is the abusive or demeaning harassment of an individual or group seeking membership in an organization or acceptance from others. The practice of hazing can be traced back to early English colleges and a time before fraternities as we know them existed. Still, many fraternities have traditionally adopted hazing practices as a rite of passage in which pledges prove their worthiness to become members of the organization. There have been documented cases in which hazing incidents have caused injury or death to new members of a fraternity. There have also been several cases of students becoming sick or dying from the misuse of alcohol related to a fraternity function. Other critics claim that the organizations have lost sight of their founding values and have become organizations that perpetuate elitism, sexism, separatism, and racism. Local, national, and international fraternities are continually battling these issues and attempting to better the fraternity experience and change their public image. All national and international general fraternities have adopted antihazing policies
and attempt to deal with hazing situations swiftly and severely. Furthermore, many fraternities have introduced “substance-free initiatives,” banning alcohol from their chapter houses or at fraternity functions altogether. Relationships among fraternities are often facilitated by umbrella organizations. These organizations aid communication between fraternities; provide a forum for the exchange of ideas, policies, and resources; and promote a stronger Greek community. There are several organizations that serve this purpose for national and international fraternities. The National Interfraternity Conference (NIC) serves sixty-four national and international general fraternities. The National Panhellenic Conference (NPC) serves twenty-six national and international general women’s sororities. The National Pan-Hellenic Council (NPHC) oversees nine predominantly black national fraternities and sororities. On the campus level, an interfraternity council, Panhellenic council, or another umbrella group often oversees the relationship between general fraternities and sororities. The Professional Fraternity Association (PFA) and the Association of College Honor Societies (ACHS) provide guidance to national, international, and campus professional fraternities and honor societies, respectively.
Tom Jelke
References and further reading
Anson, J. L., and Robert F. Marchesani, Jr., eds. 1991. Baird’s Manual of American College Fraternities. 20th ed. Indianapolis: Baird’s Manual Foundation.
Lucas, Christopher J. 1994. American Higher Education: A History. New York: St. Martin’s Press.
Rotundo, E. Anthony. 1993. American Manhood: Transformations in
Masculinity from the Revolution to the Modern Era. New York: Basic Books.
Whipple, Edward G., and Eileen G. Sullivan, eds. 1998. New Challenges for Greek Letter Organizations: Transforming Fraternities and Sororities into Learning Communities. New Directions for Student Services, no. 81. San Francisco: Jossey-Bass.
Winston, R. B., Jr., William R. Nettles, III, and John H. Opper, Jr., eds. 1987. Fraternities and Sororities on the Contemporary College Campus. New Directions for Student Services, no. 40. San Francisco: Jossey-Bass.
Frontier Boyhood
Both boys and girls played essential roles in the history of the American frontier. They performed much of its labor and played prominent parts in its social life. The demands and opportunities of the frontier in turn shaped young people’s lives profoundly. What boys and girls did on the frontier and how they were influenced by it contributed greatly to the shaping of the United States. A frontier might be defined as an area where European and Euro-American societies entered country new to them and where they and the peoples of that country encountered and were changed by one another—an encounter that ended with the new arrivals dominating the native peoples. A frontier was also a place where there was an enormous amount of work to be done and too few people to do it. To meet this need, the family became a flexible and effective laboring mechanism, and sons and daughters were the family’s most versatile workers. From the crude farms on the Appalachian frontier of the eighteenth century to the homesteads, mining towns, ranches, and newborn western cities of the nineteenth-century frontiers, boys performed a prodigious amount of labor.
Farms dominated the eastern frontier, and on them children did much of the work. As young as four or five they were helping with household chores—gathering fuel, sweeping and cleaning, and feeding animals. When a boy was just a little older, he took part in the first planting of corn, done by making a hole in the soil with a digging stick and dropping in seed, then covering it with a brush of the foot. During the summer he tended the crops and in the fall took part in the harvest. By age ten or twelve, a boy was helping feed his family by hunting smaller animals such as squirrels and game birds as well as gathering wild plants, and not long afterward he was stalking deer and larger game. Young males also helped at slaughtering, both of what was hunted and of domestic animals. All this and more helped prepare boys for the heaviest tasks they began to assume in their teens—girdling and felling trees, splitting rails and making fences, and constructing log buildings. On the western farming frontier, boys continued to take on most of these jobs, plus others. Youths as young as ten could take command of more effective steel-bladed plows in the crucial task of breaking the prairie sod for planting. Cattle raising was relatively more important in the Far West, encouraged by both terrain and a hungry eastern market for beef, and boys spent hundreds of hours per year in the saddle watching over the animals and retrieving them when they wandered away. Except during the time of the California Gold Rush in 1849, when westward migration was dominated by adult men, children were a prominent part of the third of a million persons who crossed to the Pacific Coast along the overland trails. Here they also carried out important tasks. Boys helped care for the oxen
A trade card showing a family outside their home in the Far West, ca. 1880. The labor of boys was essential on frontier farms. (Library of Congress)
that drew the wagons and the other animals that would be a crucial part of a new life in Oregon, California, or Utah. They fetched water and gathered wild greens and the distinctive type of western fuel, the dried dung, or “chips,” of bison and cattle. In towns and cities along the western frontier, boys also played important economic roles. Most work in gold, silver, and copper mines was too difficult for males under the age of sixteen or so, but boys could tend mules, care for the miners’ tools, and help with other odd jobs. In towns they also peddled newspapers, served as messengers and delivery boys, worked in liveries, and assisted with a variety of other work. As they approached adulthood, boys began to work more outside the home to provide extra income for their families. Older sons on the farming frontier often
accompanied their fathers as they sought summer employment in towns, on railroads, at army posts, and in other enterprises. When a father died or abandoned his family, the economic responsibilities of a son expanded still more. While barely in their teens, some boys took on a father’s roles. Not all a boy’s time was spent at work, however. Frontier sons engaged in a wide range of play. They brought with them many games learned in more settled areas. When thrown together with other children from many backgrounds, they found these games a means of forming instant youthful communities. Some, like ring-around-a-rosy and “How many miles to Miley Bright?” had roots centuries deep. Others were invented on the spot, often reflecting the peculiar conditions of frontier life.
Boyhood amusements sometimes allowed boys to rehearse for adult life. A west Texan recalled “branding” dead antelopes with a stick in imitation of older males at a cattle roundup. “Hunting” with stick guns and building mock houses were other preparations for adulthood. Parents encouraged such rehearsal with gifts reflecting masculine roles: toy guns, hobbyhorses, and playtime tools. Much play, however, was not so organized or oriented toward specific functions. Rather, a boy spent hours exploring the countryside or town, looking for spontaneous fun wherever it might be found. This, too, was an important part of the youthful frontier experience. Through it a boy came to understand his world and formed emotional attachments to it that were in many ways deeper than those of his elders. Parents often feared that the children they took to the frontier would be lost to such dangers as attacks by Native Americans and wild animals. Although some boys were killed or kidnapped by Native Americans, such incidents were extremely rare, and although there were some injuries and deaths from poisonous snakes, those too were highly unusual. Other animals posed virtually no threat at all. Instead, boys were most at risk from accidents and, above all, disease. On the overland trails, rambunctious sons sometimes fell beneath the wheels of the wagons. Farmwork was full of potential for injury, especially as more machines such as threshers were introduced. Hunting accidents and drowning also took a toll. In mining towns, boys at play or work might fall down abandoned shafts, and others lost fingers and eyes while amusing themselves with discarded blasting caps. The many hours spent on horseback could include disastrous spills. Dan-
291
ger on the frontier came not so much from conditions there but from the working patterns, equipment, and way of life that families brought with them. And so it was with disease. By far the greatest physical threats to pioneer boys were illnesses carried to frontier settlements. On overland trails, travelers lived in close contact with others from across the country and the world. Conditions were primitive and often filthy, and contagions were easily passed from one camp to another. Childhood diseases such as measles were common. In 1849 and 1852 cholera killed perhaps a few thousand persons, and children were especially susceptible. In frontier towns conditions were also crowded and appallingly unsanitary. Water supplies were frequently contaminated, and before the establishment of basic public services, heaps of garbage and dead animals and other breeding places of disease could be found in towns. Nor were even farms immune. Contagion could be transmitted by watercourses and contact with visitors. Among the most common illnesses were measles, whooping cough, dysentery, pneumonia, and scarlet fever. Accidents and insect bites might lead to blood poisoning and threatening infections. Diphtheria was perhaps the most frightening disease. Spread easily from child to child, it brought high fever, a slow constriction of the throat, and in a tragically high percentage of cases, death by suffocation. Nevertheless, the vast majority of frontier boys passed into maturity without ever facing a serious threat to their lives or safety. As they grew toward adulthood, boys were also gradually integrated into their society. As elsewhere, education was crucial to that process. Frontier parents were pulled in two directions with
regard to schooling their children. They badly needed their children's help in the family's work, yet they also valued education as a means of ensuring a better future for their sons and daughters. Consequently, schools typically were founded as soon as a handful of children had settled in an area. Often this was done by parents taking the initiative in hiring a teacher and finding a place to hold classes, rather than by local governments. In fact, most frontier areas had more schools per school-age child than the rest of the country, and the West spent more per capita on public education. Yet the frontier West trailed the nation in the number of days each year students spent in school. Weather and isolation were partly to blame for spotty attendance, but more important was the tremendous amount of work to be performed on frontier farms, much of which children could do. Especially during planting and harvest times, parents kept their children home. Boys beyond the age of eight or ten were more able than their sisters to handle heavy work and so were more necessary for the annual tasks associated with a new frontier settlement. Boys' education, then, tended to proceed according to the rhythms of farming, yet boys stayed in school longer than girls. Daughters were often pulled out of school and into domestic work in their early teens, but sons, when not working at home, might attend classes until young adulthood. As in much of the United States outside the cities, frontier children attended one-room schools. All ages and levels met and learned together under the tutelage of a single schoolmaster or schoolmistress. Conditions were crowded and facilities usually primitive, but those who passed through this system typically spoke
highly of it. Schools also played important roles in frontier community life. School programs were occasions for large gatherings of locals. The boys who performed were a living celebration of what adults hoped would be the transplanting of culture to a new country. On other social occasions the community came together to raise funds for support of education. These public efforts emphasize how frontier boyhood ultimately was inseparable from the adult society around it. Sons and daughters played prominent parts in other public occasions such as parades, which is again evidence of their symbolic importance in a community's values and its faith in the future. At dances and other festivities, boys were welcome. For all its unique aspects, then, boyhood was also intimately entwined with the larger frontier experience. The boys who grew into men on the frontier were, however, set apart from the generation before them by a deeper identification with the places they knew in their youth. In this way, the frontier continued to influence the nation well after its passing.
Elliott West
See also Cowboys; Farm Boys; Jobs in the Nineteenth Century
References and further reading
Baur, John E. 1978. Growing Up with California: A History of California's Children. Los Angeles: Will Kramer.
Faragher, John Mack. 1979. Women and Men on the Overland Trail. New Haven: Yale University Press.
Garland, Hamlin. 1926. Boy Life on the Prairie. Boston: Allyn and Bacon.
Hampsten, Elizabeth. 1991. Settlers' Children: Growing Up on the Great Plains. Norman: University of Oklahoma Press.
Moynihan, Ruth Barnes. 1975. "Children and Young People on the Overland Trail." Western Historical Quarterly 6 (July): 279–294.
Peavy, Linda, and Ursula Smith. 1999. Frontier Children. Norman: University of Oklahoma Press.
West, Elliott. 1983. "Heathens and Angels: Childhood in the Rocky Mountain Mining Towns." Western Historical Quarterly (April): 145–164.
———. 1989. Growing Up with the Country: Childhood on the Far-Western Frontier. Albuquerque: University of New Mexico Press.
G

Gambling
Gambling is a universal cultural phenomenon, and Americans have been called a "people of chance." American boys, even in their play, have been socialized into a culture of competition, and because all are not endowed with equal shares of physical prowess or skill, their games often allow for success through the vagaries of chance or luck, creating a mirror of the "democracy" of gambling where anyone can win on a particular occasion. Although most common gambling behaviors do not occur until adolescence or young adulthood, boys can be seen to be culturally, if not biologically, predisposed toward gambling—even in the way they play as children. Play and games function as an important dimension of culture, providing release from the restrictions of everyday life while simultaneously reflecting the very myths that hold a culture together. The world of play and games offers a varied menu of rituals, myths, icons, and heroes that articulate, reinforce, and transmit cultural values. Johan Huizinga (1955) calls play an independent activity that is "senseless and irrational"—by definition it is "make believe"—but it is an important part of the process by which humans are socialized into their culture. Erving Goffman (1961) regards play as a "world building activity," and the world of play often mirrors the "real" world that contains it. Roger Caillois (1979) defines play as a free and voluntary activity, a source of joy and amusement that is essentially a separate occupation, carefully isolated from the rest of life and bounded by precise limits of time and place. It is governed by a set of arbitrary, specific, and unexceptional rules. Play has no meaning other than its own intrinsic meaning; and the playing of games is also uncertain, since neither the precise course of events nor the outcome is known to the participants beforehand. And finally, play is unproductive in that no wealth or goods are produced, although they may be exchanged in the course of the game. Within such a broad definition, we can find many types of play and games, among them the various forms of gambling. Most American youngsters have grown up in a society with decidedly contradictory views regarding gambling. For some, gambling is portrayed as evil, a sin that parents and other authority figures condemn and forbid. For others, games of chance are relegated to the world of grownups who have the money and discretion to participate in either spontaneous or commercial gambling ventures. A few children may be exposed to gambling games at an early age, usually playing "for fun" with adults or peers, perhaps
as part of the celebration of holidays or special occasions. The latter situation has deep cultural roots. In Europe from the seventeenth through the nineteenth centuries, many references in the visual and literary arts confirm that children as well as adolescents used to engage in games for money played with cards, dice, or other implements. Even when the tide of opinion turned against gambling during the Victorian era, there were still defenders of the practice who argued that not only did betting make economic sense for those who bet small amounts to make greater sums, but it also was a means to develop the character and composure needed throughout one's life. A person does not magically become a gambler at a certain age. Various cultural signals actually condition the would-be gamblers. The media participate by romanticizing stories about gambling and gamblers, frequently showing daring, larger-than-life heroes thriving on risk, and also by giving enormous publicity to game show contestants or gamblers who win substantial prizes. American cultural myths and values reinforce the materialism, the longing for material reward, and the excitement of pursuing dreams that characterize many gamblers' fantasies. The ritualized play of several childhood games provides training for future gambling activity, and in some cases such play may be seen as a kind of gambling. In this way, American culture may actually predispose children to gambling behavior. One way to understand play and gambling is to see the activity as a social fiction. In the world of make-believe, games do not replace reality, but they do suspend the consequences of real life for the duration of play. Often the games people play mirror, if only obliquely, their real
lives, and in the context of play the suspense, conflict, and uncertainty of real life become easier to manage. Roger Caillois suggests that playing games stimulates ingenuity, refinement, and invention as it teaches people loyalty and builds character. Further, there is a remarkable coincidence between the principles that rule various kinds of games—whether they are games of chance or games of skill—and those that characterize all human interaction:
The need to prove one's superiority
The desire to challenge and overcome an obstacle
The hope for and the pursuit of a reward
The desire to test one's strength, skill, endurance, or ingenuity
The need to conform to rules, the duty to respect them, and the temptation to circumvent them.
In the context of American society, games provide everyone, including boys, an important outlet for these human tendencies. There are several classes of games, including games of competition, chance, simulation, and vertigo. For the purposes of this article, the most important games involve the complementary concepts of competition (skill) and chance. Although there are games of pure chance, such as a lottery, and games of skill, such as chess, many popular games played by children and adolescents blend these two characteristics in a formula that allows some measure of equal participation. A game of skill will be dominated by participants of superior ability, whereas a game of chance may be too capricious. But a game that combines both allows skill to influence the outcome without predetermining it.
Less skillful players may compete knowing that they have at least some chance of success, and a win is more significant than merely a chance outcome. Competitiveness and aggressiveness are traits that are cultivated in males as part of their socialization process from the time they are children. Young females may play roles of cooperation and caring responsibility in many of their games, but young boys act out patterns of competitive and daring achievement, as can be seen in the popularity of athletics and rough-and-tumble play among boys. However, since not all young males are endowed with equal shares of strength, endurance, and coordination, traditional contests that are games of competitive skill may not offer satisfactory avenues of play to each. Games that combine a measure of skill and talent with the impartiality of blind chance offer the most favorable conditions to test a boy's mettle. They account for the popularity among young boys of certain games that resemble gambling and for the prevalence of gambling behavior among adolescent and adult males. As young males are socialized into a world in which individual achievement is tempered by forces beyond their control, it is little wonder that sports and other childhood games have such an attraction; and it is equally clear why the fascination with the safe competition of games does not diminish when the youngster becomes an adult. Gambling encounters involve face-to-face interaction in terms of a specific sequence of ritualized events: squaring off (the challenge under a set of rules), determination (the play producing an outcome), disclosure (the completion of the game), and settlement (the acknowledgment of the outcome). The scenes of such action are primarily male-dominated
(Goffman 1967). They are occasions on which boys display forms of stereotypically male character traits: courage, integrity, gallantry, and composure. The participants risk these character traits when they engage in a game or contest, and if they make a good showing they can enhance their ego while acting in culturally approved ways. Relatively few children have either the inclination or the financial resources to wager money on an uncertain outcome in the hope of monetary reward. Some children who do may be reprimanded and forced to return wagers if the gambling activity becomes known to adult authority figures. Adolescents may find greater opportunity to gamble and have more inclination to bet, especially if they have their own money and are part of a receptive peer group. Gambling is very likely to occur in male groupings, whether at recess in schoolyards; at camps, clubhouses, and caddyshacks; or later at college or during military service. Nevertheless, a number of childhood and adolescent games bear a strong resemblance to gambling behavior and in some cases are actually gambling experiences. Not surprisingly, the majority of participants in these games are boys, although there is usually no formal prohibition against girls entering the game. Predictably, these games are perceived by the players as combining luck and skill, with the emphasis perhaps on skill. Yet there is no necessary correlation between success in these games and success on the athletic field or in the classroom; in fact, many times the nonathlete and academic underachiever gains status among his peers by virtue of his success in the game. The games are treated as contests, competitions in which there are clearly winners and losers. The prize is gain, either
in substance or at least in status. Such games played by boys have included marble shooting, baseball card flipping, card games, pinball, and video games. As toys, marbles date back at least as far as ancient Egypt and Rome, and although they may have fallen from favor in the twenty-first century, for much of the nineteenth and twentieth centuries they were an important part of American boyhood rituals. Playing marbles was unlike organized sports based on adult versions of similar games, such as Little League Baseball; adults generally played no part in setting up the rules for marble play. The games developed spontaneously among the participants, and there were many variations of games played. Some basic rules survived, and in the eighteenth century boys shot at marbles in a ring drawn on the ground in much the same way as boys did two centuries later. The object of the game was to claim opponents’ marbles by hitting them with one’s own marbles, risking them in the process; not playing marbles “for keeps” was not really playing. In this sense, players risked their own stake of marbles in the hope of acquiring an additional stake. And while marbles had relatively little monetary value, they were forms of currency and status among players. Moreover, a boy was not supposed to buy marbles; he was supposed to win them. Since marbles were portable, youngsters could carry them anywhere and strike up a game whenever the opportunity arose. Schoolyards were a popular venue for the activity: not only was the game a pleasant diversion from the work of a school day, but it could be played within the time frame of a recess, taking advantage of whatever surface was available. In fact, for one form of the game,
“chase,” players did not even need to draw a ring on the ground; shooters simply aimed at one another’s marbles where they lay. Squaring off involved a simple challenge between two or more players, and any variations in accepted rules were negotiated in advance. Occasionally a weaker opponent was given a handicap, perhaps “first up” or a favorable starting position. The outcome of the game was determined by the victor’s marble striking the loser’s (in chase) or knocking the loser’s marble out of the ring. Settlement involved the winner’s taking possession of his prize. Because varieties of marbles held status, as did recognized champions, a prized “shooter” or other valuable marble could be spared if an agreement was made to ransom the marble with an appropriate number of less valuable ones. Marble etiquette among children developed as surely as gambling etiquette among adults. Cheating, though possible, was frowned upon, and winners and losers were expected to finish the match with grace and sportsmanship. Although the game did not necessarily have a gender bias toward boys, the informal structure of elementary schoolyard play reinforced the idea that marble play was usually for boys. The courage needed to face a schoolyard champion was no small matter for a new boy or a previous loser, but he knew that immediate recognition and status could be obtained by virtue of his success in the fateful encounter of the game. To be sure, a player risked his marbles, but he also risked his pride. As in the case of all games, outside the specific arena of marble competition, the prize had comparatively little monetary worth. Baseball cards, perhaps, had more immediate collector value than marbles. A boy could buy marbles, choosing the ones he most desired, but baseball cards
originally came as premiums with other products: first with cigarettes and later with bubble gum. Youngsters collected cards at random, hoping to fill out teams, leagues, or the ranks of their favorite players. As collectors obtained duplicate cards or became willing to sacrifice cards in their collections for other desirable pieces, games involving card wagers evolved. Again, this evolution took place without adult intervention, though a father might show his son some of the finer points of card flipping. As time passed, card subjects were expanded to include players of other sports such as football and movie characters such as the Pokémon figures. As with marble games, there were many variations of baseball card flipping that could accommodate two or more players. In each case, the cards to be risked were selected from the player's store, constituting his stake for the game. Substitution or ransom of favorite cards was rare. In simple card flipping, the object of the game was to flip the card in such a way as to match the face of the opponent's card. One player flipped first, and the card landed heads or tails on the ground. The second player had to match the face with his card. If successful, he would claim both cards; if not, his opponent would. Another version had players scaling the cards at an array of cards propped against a wall. Skillful players could have an edge at this game in which the object was to knock down the propped cards, with the player who knocked down the last card claiming all the cards. Chance and skill combined in baseball card games. Cards could have subjective value as a measure of their rarity or popularity, and successful players were admired by their peers for flipping skill or
the size of their collection. Cheating by taping cards together or weighting corners was possible but so obvious as to be useless except against the most naive opponents. The game followed the ritual patterns of a gambling encounter, and unless players had an ample store of duplicate cards or played only with undesirable ones, they risked something of value in the hope of adding to their store. Card playing does not become popular among children until they are somewhat older than the typical marble shooter or baseball card flipper. The obvious reason for the delay is the comparatively sophisticated sets of rules accompanying the various card games other than war and fish. Hearts, crazy eights, pinochle, rummy, and varieties of poker may be easily found in all sorts of male groupings—scout trips, summer camps, informal gatherings—and later especially among groups of college students or servicemen. Again, boys were more likely to gravitate toward card games because of their predisposition to games with rules and a clear advantage to those with the skill to remember the play of the cards. For many youngsters, playing cards for money or another stake is their first exposure to adult gambling behavior. In twentieth-century American popular culture, technological icons have had undeniable appeal to children and adults alike. This attraction can be seen in the popularity of pinball, video games, computer games, and handheld game devices. Not only do the playthings have a level of cultural significance in themselves, but the playing of the games is yet another ritualized encounter with varying combinations of chance and skill in the context of a fateful encounter. Although the gambling roots of pinball games have been well documented, the popularity of these
and other mechanical or electronic games does not rest exclusively on winning a bet. In fact, the “win” is usually additional playing time and the chance to post a high score, marked with the player’s initials or nickname, which remains for others to see until the score is exceeded or the machine is reset. Playing the game has a hypnotic appeal, even if provision is made for two or more players to compete against one another simultaneously. But more often than not, the play seems to be a contest between the human and the technological adversary. When considering the question of why children gamble, the most obvious answer is that they are greedy; they always want more than what they have, particularly if someone else has something they desire. Studies have shown that when young children are placed in gambling situations, both boys and girls play with equal attention, although boys are seen to bet more aggressively and believe that their skill at the game can directly influence the outcome. Childhood games bearing similarities to adult gambling almost always fall into the pattern of competition. Even if the flip of a trading card is as random as the toss of a coin, the players believe that they are exercising a skill as they try to match the face of a card in play. In the gambling games of boys there is a balance of skill, which makes the victory honorable and worthy of admiration, and luck, which makes victory possible for anyone. Such games are steeped in ritual and surrounded by a particular subculture in which the game is celebrated or the prize valued. A marble game may be a rite of passage for the boy in the schoolyard as he tests his skill against another player. The knot of students flipping baseball or Pokémon cards celebrates a rite of unity, sharing not only
the moment but the desire to build their personal collections. A card game with friends or a session in a video arcade may serve as a rite of reversal, a time of release from the repetition of daily tasks as players become so immersed in their play that nothing else matters. Character can be displayed and even developed through these rituals, and honor can be won along with material or recreational rewards. Historically, gambling behavior has been found earliest and most consistently among boys because of social and cultural bias, not genetic predisposition. It remains to be seen whether the incidence of gambling play will increase among girls to mirror the recently documented parallel gambling behavior among adult males and females.
James F. Smith
See also Baseball Cards; Competition; Games
References and further reading
Aries, Philippe. 1962. Centuries of Childhood: A Social History of Family Life. Translated by Robert Baldick. New York: Vintage.
Caillois, Roger. 1979. Man, Play, and Games. Translated by Meyer Barash. New York: Schocken Books.
Derevensky, Jeffrey L., Rina Gupta, and Giuseppe Della Cioppa. 1996. "A Developmental Perspective of Gambling Behavior in Children and Adolescents." Journal of Gambling Studies 12, no. 1: 49–66.
Frank, Michael L., and Crystal Smith. 1989. "Illusion of Control and Gambling in Children." Journal of Gambling Behavior 5, no. 2: 127–136.
Goffman, Erving. 1961. Encounters: Two Studies in the Sociology of Interaction. Indianapolis: Bobbs-Merrill.
———. 1967. Interaction Ritual: Essays on Face-to-Face Behavior. Garden City, NY: Anchor Books.
Hughes, Fergus P. 1999. Children, Play, and Development. Boston: Allyn and Bacon.
Huizinga, Johan. 1955. Homo Ludens: A Study of the Play Element in Culture. Boston: Beacon Press.
Ide-Smith, Susan G., and Stephen E. Lea. 1988. "Gambling in Young Adolescents." Journal of Gambling Behavior 4, no. 2: 110–118.
Packard, Cynthia, and Ray B. Browne. 1978. "Pinball Machine: Marble Icon." Pp. 177–189 in Icons of America. Edited by Ray B. Browne and Marshall Fishwick. Bowling Green, OH: Popular Press.
Games
Games played by boys may be classified in various ways, but all systems recognize distinctions among games of physical ability, mental skill, fantasy, and chance—examples are tag, checkers, cowboys and Indians, and bingo—though most games involve combinations of these elements. All games can be played by both boys and girls, but certain games have at one time or another been viewed as primarily masculine. Marbles, various stickball games, cops and robbers, kite flying, leapfrog, and the more violent forms of tag are examples. Since girls willingly play boys' games, but boys vigorously resist girls' games, it is somewhat easier to list what boys will avoid playing. Dolls (unless they are toy soldiers), jacks, jump rope, and ring games of young girls fall in this category. In the twentieth century, board games and electronic games have altered the nature of boys' play. Today, boys spend less time in large-group out-of-door games and more with small-group or individual games indoors. Boys' games contribute to their physical and mental development, teaching them to organize and play by rules. There are, however, two major problems in discussing the meaning of boys' games. First, they are ephemeral, and since much of
their meaning is contained in the specific ways they are played, there is little or no historical record. Boys can play a game without ever learning what it is called. Thus, lists of games, even when they include instructions for playing them, are always incomplete. Names and styles of play change from year to year and neighborhood to neighborhood. For example, the game known as “anti-I-over” throughout most of the western United States is also called “Andy-over” or “Annie-over” in the Midwest, “Anthony-over” and “hail-over” in the South, “haily-over” in New England, and dozens of other names throughout the world. Although all variations involve two teams throwing a ball over a small building, catching it, and running around the building to capture an opposing team member, the rules can be adjusted to better suit the size of the structure or the number and ages of the players. A second difficulty arises from the organization of play by adults who usually attempt to teach sportsmanship and limit violence in boys’ games. As a result, books containing descriptions and rules for games are usually bowdlerized, omitting any reference to the cheating, arguing, and protesting that usually accompanies any boys’ game. Daniel Carter Beard, one of the founders of the Boy Scouts of America, even tried to write rules for such spontaneous activities as snowball fights. One of the earliest handbooks, by an author who signed himself “Uncle John,” was Boys’ Own Book of Sports, Birds, and Animals, published in 1848. Because the book is divided into sections on “Minor Sports,” which include marbles, tag, blindman’s buff, and follow-the-leader, and other activities such as gymnastics, archery, swimming, skating, and rowing, it is clear that Uncle
John wanted boys to choose the more organized sports.
"Snap the Whip": boys playing a game in front of a schoolhouse. Engraving after a painting by Winslow Homer, 1873. (Library of Congress)
Some disapproved games were described in 1891 by the anthropologist Stewart Culin. An example is "hide the straw," in which a boy is given a straw to hide while the other players close their eyes. After searching for some time, they accuse the boy of hiding it on his body and force him to open his mouth, which they then stuff with "coal and dirt." Such accounts are rare, and the gap in the knowledge of what boys are supposed to do and what they actually do is wide. On June 23, 1913, a group of city officials in Cleveland, Ohio, spent the day recording what children were doing. The majority of the 8,920 boys observed were "doing nothing" or "just fooling," according to the surveyors, who were also concerned
because most of the boys were seen in streets, alleys, and vacant lots rather than yards and playgrounds. What the officials failed to record is what they meant by “nothing” and “fooling” (Johnson 1916, 49). Because it was early summer, a large number of boys were playing baseball, and others were flying kites, playing tag, or riding bicycles, but their informal games were invisible to the adults. “Doing nothing” and “just fooling” included breaking windows, destroying abandoned buildings, chalking suggestive words on walls, throwing mud at streetcars, touching girls, looking at pictures of women in tights on billboards, wearing suggestive buttons, stealing, gambling, and drinking. These are all activities that the boys themselves might turn into a game.
One thing is clear: the street was the center of play. Despite the hazards of traffic and the opposition of merchants and homeowners, boys play their games where the action is. From the nineteenth century to the present, boys' games formed, dispersed, and reformed to the rhythm of fire trucks, ice cream vendors, and periodic parades. Boys struggled for control of the streets in both cities and suburbs. In cities, stickball, handball, and tag incorporated lampposts, manhole covers, fire hydrants, and the stairs leading into buildings into the game. Hopscotch, hockey or shinny, and, above all, skelly require marking the pavement with chalk. Skelly is played in a space about 10 feet square, marked with small boxes numbered one to twelve around the perimeter, with a box numbered thirteen in the center of the playing area. Boys weight bottle caps with wax or tar and shoot them from one number square to another with a flick of a finger. The first player to move his piece from square one to twelve without landing in the "dead box," or square number thirteen, is the winner. Autobiographies and novels describing childhood in the nineteenth and early twentieth centuries confirm remarkable continuity in boys' games. Chase and capture games such as "I spy," "hare and hounds," and "prisoner's base" were played throughout the country. Local variations arose, of course. African American boys in the South played a version called "chickamy, chickamy," in which an older child plays the role of a witch who chases younger children pretending to be chickens. A version recorded in a village near New York City in the 1890s was called "head off and head on." In this game, two teams, led by the two largest and toughest boys, raced from one base to
another trying to avoid capture. The capturer had to hold his captive long enough to shout, “Five and five are ten; you’re one of my men.” Captured boys could be freed by being slapped on the hand by an uncaptured member of their team. Marble games were the most popular alternative to chase and capture games. Marbles are mentioned in almost every memoir of boyhood and are frequently referred to in the narratives of former slaves. Marble games combined skill, strategy, chance, and an opportunity to win valuable objects. Many boys were proud of their collections and used them to buy other things and services. One former slave recalled that he used his winnings at marbles to hire another boy to teach him the alphabet. William Wells Newell, the first serious American student of children’s games, was impressed by the complicated rules boys employed in their games. When a boy wanted to change his position when shooting at another marble he could shout, “roundings,” but if his opponent quickly countered with “fen [defend] roundings,” the move was prevented. Time-out to argue over rules was invoked by calling “king’s excuse,” or “king’s X,” a rule invoked in other games as well. The playground movement, which was formally organized in 1906, was particularly concerned with getting boys off the street and into supervised recreational facilities. The first public playgrounds were located in poor neighborhoods in large cities with immigrant populations, but middle-class parents soon demanded playgrounds in their neighborhoods as well. Many were built in schoolyards. Well-known landscape architects such as Frederick Law Olmsted, Jr., were hired to plan playgrounds. Fences were an important element of the design, since children
under twelve years old were kept in one area with sandboxes and small swings and seesaws, whereas older boys and girls had larger separate spaces with merry-go-rounds, slides, and climbing apparatuses, usually a jungle gym.
Boys enjoy a game in a kindergarten class. (Shirley Zeiberg)
Reformers believed that the proper playground equipment and a well-trained supervisor could improve the health, initiative, purity of mind, cooperativeness, ambition, honesty, imagination, self-confidence, obedience, and sense of justice of all boys and girls. Thousands of playgrounds were built across the nation with this ideal in mind, but the expense of maintaining them during the Depression and World War II was too much for most communities. Unsupervised playgrounds with broken and dangerous equipment were almost as much fun as streets and vacant lots for older boys, however. The large cast-iron frames for the jungle gym, slides, and swings were sometimes more than 15 feet high. Boys challenged each other to climb to the top and play tag. Two major changes in playground design and apparatus occurred after 1945. First, as families moved to the suburbs they bought or made small swings and jungle gyms for their backyards. Boys could play in their neighbor's yard without facing the threat from gangs of older boys, but the physical challenge was less. Playground equipment manufacturers responded with new designs for jungle gyms—geodesic domes, forts, and rocket
Games ships—intended to stimulate a child’s imagination as much as his muscles. New, brightly colored, and softer materials were used to prevent injuries caused by falling. These playgrounds evolved further in the 1970s and 1980s, when the architect Robert S. Leathers began promoting community-built playgrounds constructed from old telephone poles, truck tires, and planks of splinter-resistant pine. By the mid-1980s more than half the new playgrounds in the United States were built of wood, but it was soon discovered that these structures were too expensive to maintain, and plastic playground equipment became popular again in the 1990s. Plastic is also used extensively in the new indoor playgrounds found in shopping malls. Many of these “soft modular playgrounds,” as they have been called, are privately owned and charge for admission, but they allow boys to pelt each other with soft plastic balls and re-create some of the violence of earlier playgrounds in relative safety. Many playgrounds have basketball courts, some have baseball fields, and a few have space for football. These games can be played informally; the phrase “pickup game” has been part of the American vernacular since the early twentieth century. Whether shooting “hoops” in a driveway or “banging the boards” on a playground court, basketball remains one of the easiest modern sports to play casually, and stickball remains baseball’s unpretentious sibling on city streets. None of these games could withstand the bureaucratization and commercialization of twentieth-century sports, however. Little League baseball, with umpires, uniforms, and volumes of rules and statistics, began in 1939, followed by Pop Warner football leagues, youth soccer, and inner- and intercity basketball
tournaments. Specialized summer camps for these and other games began to make their appearance in the 1970s. The challenge to street and playground amusements from board games and now electronic games is the most remarkable revolution in the history of American boyhood. Cards, checkers, dominoes, chess, backgammon, lotto, and goose (a variant of Parcheesi) were all traditional adult games played by children—jigsaw puzzles were also available from the early nineteenth century—but few children had their own board games until the 1890s. In that decade, boys in Worcester, Massachusetts, indicated a preference for checkers over any other board game but mentioned such new board games as Messenger Boy, Nellie Bly's Trip around the World, and Innocence Abroad. Many of the new games were marketed as educational tools that could teach business strategy or geography. Board games proliferated in the twentieth century. Although board games involving play money had been marketed since the 1880s, when George S. Parker invented a banking game, business expansion in the 1920s and the Depression of the 1930s stimulated new interest. Charles Darrow's game Monopoly became an almost instant success after he sold it to Parker Brothers in 1935. Monopoly spawned hundreds of imitators such as Risk, Boom or Bust, Ticker Tape, Go for Broke, Easy Money, and Careers. Since the 1980s, traditional board games have had to compete for boys' time with a growing variety of electronic games that have become increasingly complex and portable. The fast pace and attractive graphics of the video and handheld games, especially those with violent action such as Mortal Kombat, Street Fighter, and sundry games based on invaders from
outer space, appeal especially to boys. Improvements in technology allow new games to appear monthly, and their resemblance to comic books is not accidental. Although they are much more expensive than comic books, the games are collectable, easily traded, and often have movie and television spin-offs.
Bernard Mergen
See also Basketball; Gambling; Video Games
References and further reading
Champlin, John D., Jr., and Arthur E. Bostwick. 1890. The Young Folks' Cyclopedia of Games and Sports. New York: Henry Holt.
Croswell, T. R. 1898. "Amusements of Worcester Schoolchildren." The Pedagogical Seminary 6: 314–371.
Culin, Stewart. 1891. "Street Games of Boys in Brooklyn, N.Y." Journal of American Folklore 4, no. 14: 221–237.
Ferretti, Fred. 1975. The Great American Book of Sidewalk, Stoop, Dirt, Curb, and Alley Games. New York: Workman.
Johnson, George E. 1916. Education Through Recreation. Cleveland, OH: Survey Committee of the Cleveland Foundation.
Mergen, Bernard. 1982. Play and Playthings: A Reference Guide. Westport, CT: Greenwood Press.
Newell, William Wells. 1883. Games and Songs of American Children. New York: Harper and Brothers.
Sutton-Smith, Brian, Jay Mechling, Thomas W. Johnson, and Felicia R. McMahon, eds. 1999. Children's Folklore: A Source Book. Logan: Utah State University Press.
Uncle John. 1848. Boys' Own Book of Sports, Birds, and Animals. New York: Leavitt and Allen.
Gangs
Although the roots of the word gang are several centuries old and its meanings have varied across time, the term today typically refers to more or less organized
groups of urban youths with a higher-than-average chance of committing crimes. Despite a century of research, no consensus on a more precise definition has been established, leaving open a host of problems for anyone attempting to understand gangs as a social phenomenon. The study of youth gangs is made even more difficult by the fact that young people form strong peer groups that resemble gangs as part of the usual course of adolescent social development. However, groups of urban youths have been labeled "gangs" by authorities in urban areas since at least the early 1800s in the United States. During the twentieth century, media panics about urban youth gangs have occurred repeatedly, often complemented by spectacular representations of gangs within news reports, political statements, and commercial popular culture. Most researchers agree that gangs are endemic to U.S. society, and at the most abstract level, gangs are responses to social-structural inequalities, usually related to the local economy, disruptions caused by migration, or institutionalized discrimination. Gangs vary in their criminal orientation, their ethnic makeup, and their commitment to a particular territory. Although girls have formed autonomous gangs, young men make up the vast majority of gang members. There is widespread agreement that there are now more youth gangs in the United States than ever before, but without better data on gangs in the past, any quantitative differences in their orientations toward crime and violence are matters of conjecture rather than science. The roots of the word gang can be traced back at least to the twelfth century in England, when the word referred to a band of robbers and thieves. The age
Gangs range for these groups is unknown, but gang was used to refer to adults more often than youths until this century. Organized youth gangs engaging in criminal activity of various sorts were noted in London during the fourteenth century, in France during the fifteenth century, in Germany during the seventeenth century, and in the United States in the early nineteenth century. Despite this historical diversity across nations, Malcolm Klein (1995) claims that youth gangs have always been primarily a U.S. phenomenon. The kinds of youth groups that have been called gangs by journalists, reformers, residents, police authorities, and scholars in the U.S. have varied widely during the past 200 years. Some of the earliest youth gangs were auxiliaries of adult criminal organizations. However, rowdy social clubs and unorganized peer groups hanging out together in public areas or engaged in occasional minor mischief have been called gangs as well. The range of phenomena that have been labeled as gangs has made writing a coherent history of gangs a difficult task, which probably accounts for the small number of such histories to date. Most researchers agree that forming peer groups based on age is an expected part of adolescence, particularly in the twentieth century, and until recently there has been almost no research on the behaviors and values that distinguish gangs from other kinds of youth peer groups. Gangs, then, have often existed only in the eyes of their observers. Frederic Thrasher’s 1927 sociological study of more than 1,300 gangs in Chicago is usually credited with being the first reliable scholarly source on U.S. gangs. He noted that gangs did not usually appear of their own accord and typically formed in reaction to some other group, whether a
rival peer group, an existing youth gang, the police, or some other adult attempt to restrict their activities. Klein (1995) estimates that more than half the reliable information on gangs was produced between the mid-1950s and the end of the 1960s, when the academic study of gangs went through a dynamic period of expansion, spurred in part by a climate of moral panic and policy innovation surrounding youth in the post–World War II period. Youth gangs have been the objects of both panic and fascination since the late nineteenth century. The 1890s, 1920s, 1950s and 1960s, and the period from the late 1980s to the late 1990s all witnessed moral panics, increased attention to commercial popular culture, and high levels of reform/research activity around gangs. However, high visibility in the media or attention from academics is not a reliable gauge of gang activity or violence. There is wide agreement that gangs are endemic to the U.S. social structure and are more or less inevitable outcomes of enduring structural inequalities. Also, the lack of reliable data on gangs (there are no national uniform crime reporting procedures for gang-related crimes) and the variability in the local definitions of gangs and gang-related crime make historical comparisons with contemporary gangs difficult at best. The periodic "rediscovery" of youth gangs may be a sign of displaced social anxieties about demographic, cultural, or economic changes or the product of law-and-order campaigns by politicians and police authorities. The desire to sway public opinion during these periods of "rediscovery," which often results in more funding for research and enforcement, means that the information that the public receives on gangs is often distorted. Distortions and stereotypes find reinforcement through various forms of commercial
popular culture. Novels, songs, and films about youth gangs have found a ready audience since the 1950s, reflecting a consistent fascination with youth gangs, their charismatic leaders, and the social struggles they engage. Any accurate portrayal of gangs must address the variety of gang formations that have appeared over time. For brevity, three major distinctions between gangs gesture toward the complexity of their differences. One major difference over time is the degree to which a gang is formally oriented and organized around criminal activity. Gang affiliation is not necessary for young people to break laws; the majority of juvenile crimes are committed by young people who are not members of gangs (Shelden, Tracy, and Brown 1997). Klein, Cheryl Maxon, and Jody Miller (1995) have argued that gang members commit most crimes without the knowledge or cooperation of their gang. Also, most gang crime is of the “buffet variety,” that is, primarily motivated by immediate opportunity rather than planning and specialization in a particular kind of crime. The most common crimes associated with gangs are car theft, robbery, and burglary. Although there is debate on the topic, Klein’s research points toward a lack of organizational and group cohesiveness within gangs that casts serious doubts about the abilities of most gangs to successfully carry out systematic criminal activities that require high levels of trust and secrecy, like drug smuggling and distribution. For instance, his data show that less than 25 percent of all drug sales arrests made during the mid-1980s crack epidemic in Los Angeles were gang-related. By far the most common activity of all gangs is simply hanging out. It is generally accepted that crime makes up a minuscule portion of the total
activities of most gangs, although crimes attract the most attention from the public and from authorities. A second major variation among gangs is their ethnic makeup. Since youth gang formations are closely tied to immigration and rural-to-urban migrations, the ethnic character of gangs has changed over time. In the nineteenth century and the first half of the twentieth century, gangs were primarily made up of young people from the most recent wave of immigration, which originated mostly from eastern and southern Europe. However, increasing numbers of African Americans migrated to cities in the twentieth century, accompanied by significant migrations from Puerto Rico and Mexico. As Euro-American neighborhoods lost population to suburbanization, upward mobility, and "white flight," new arrivals took their place in the formerly homogeneous ethnic enclaves. Schneider (1999) ties the youth gang activity in New York City during the mid-twentieth century (the inspiration for West Side Story) to the demographic shifts and social conflicts that resulted from these migrations in some neighborhoods. Although one ethnic group usually predominates in most gangs, it is not unusual for there to be several different ethnicities within a single gang, depending on its location. The territorial and generational nature of some gangs is a third variation. Until recently, youth gangs have primarily been concerned with defending local areas from "outside" or rival groups. Since Latinos and African Americans were seldom welcomed into urban areas, the youth gangs that originally formed in defense of the newcomers became institutionalized in some neighborhoods and supported a generational pattern so that gang membership passed from father to son.
Teenage members of the "Crips" and "Hustler" gangs make their secret hand signals, Watts, Los Angeles, 1983. (Bettmann/Corbis)
These sorts of "traditional" gangs are often subdivided into age-based cliques, with younger divisions called "peewees," juniors, or other diminutive names. During the last twenty years, some gangs have formed without the same kinds of ties to particular neighborhoods or locations and with a greater integration across age groups. However, these newer gangs often maintain ethnic homogeneity, particularly among newer immigrant groups, as their organizing principle. The psychological and social reasons that motivate individuals to form or join gangs have been the predominant topic of study by gang researchers and social workers, but the results of these studies are more distinguished by their differences, contradictions, and variety than
their agreements. The several competing theories attempting to explain the formation of gangs are derived from general theories of crime and thus are not specific to youth gangs. As noted above, gangs have tended to form in neighborhoods where different ethnic and social class populations are in transition. In these areas, exclusive territorial claims to street corners, public parks, and commercial entertainments like movie theaters and dance clubs are often at the center of disputes between peer groups of older and newer residents. The social stresses of migration and immigration, the lag in the creation of new neighborhood businesses and institutions, and the oftentimes hostile reception given to the newer residents create circumstances that enhance gang formation. These
factors are often emphasized in theories of “social disorganization.” Thrasher is the founder of this theoretical school, although several scholars have since updated his work. Another group of scholars has examined class and racial discrimination as an explanation for why gangs have formed. Although poverty and discrimination in themselves do not fully explain the formation of gangs (or criminal behavior in general), their influence is substantial and undeniable. In this school of thought, gangs are more likely to form when economic opportunities are limited and social mobility is blocked. Gang formation is considerably more likely under these circumstances when more or less “successful” criminal adaptations among unorganized youths and adults have already been established. Thus, if social and economic limitations are enduring, and criminal activity becomes commonplace, as in many inner-city neighborhoods after World War II, youth gangs can be expected to form. But an established, local criminal subculture need not be present for gangs to form. An orientation toward crime can be socially learned or reinforced from a wide variety of other sources. Researchers have variously argued that such learning or reinforcement can come from the family, the peer culture, or the media; from positive experiences with crime or rewards for delinquent acts; or from conflicts with other individuals or groups. Some scholars have also pointed to the ineffectiveness of family, school, and civic organizations in socializing, controlling, and guiding young people toward mainstream values in areas where the factors favoring criminal activity are the greatest. Here, the overlap with the “social disorganization” school of gang studies becomes evident.
Another group takes sharp issue with these explanations and argues for alternative perspectives, although this group also has several different variations. Some scholars argue that the more powerful groups within society, like the Euro-American middle class, are able to use their higher social and economic status and their political influence to label the cultural values and behaviors of other, less influential groups as criminal and deviant. In an effort to highlight differences and distinctions between themselves and others, the more powerful groups promote laws, legal practices, and media representations that institutionalize their own values in a positive way. Other scholars argue that the brutally competitive capitalist system by necessity produces conflicts between different social groups. However, the crimes committed by elites, such as bank fraud and police brutality, are represented as more understandable, less serious, and more individualized (thus casting no moral shadow on the larger social group) than those crimes typically committed by the lower-status groups attempting to adapt to the harsh conditions of their everyday lives. Girls have been part of gang life at least since Thrasher's study in 1927 but were usually considered to be members of auxiliary organizations, primarily interested in sexual and emotional attachments to the core gang of young men. Although these observations of past gangs (all made by male researchers) are now in dispute, there are strong indications that gang formations are intimately related to gender identities. Cultural assumptions about the successful attainment of adult masculinity are part of most of the gang theories mentioned above. For instance, some have argued that limited opportunities for achieving adult masculine status (a well-paying job, a respected place in the community, etc.) are motives for gang formation among young men. Recent work, however, shows that the participation of young women in the primary gang activities is currently rising, as is the formation of all-female gangs. Two contemporary developments among gangs are particularly worthy of note. The first is the development of "wannabe" gangs in some suburban and middle-class neighborhoods. Possibly inspired by the representation of gangs in popular culture and the shifts in the cultural meanings of masculinity, these new gangs are often more focused on violence than on criminal gain per se. The second development is the rapid expansion of gangs across the nation, particularly the formation of gangs in smaller and medium-sized U.S. cities. Klein (1995) estimates that the number of "gang cities" has grown from a handful of the largest U.S. metropolitan areas in the early twentieth century to more than 1,000 cities and towns at the century's end.
Joe Austin
References and further reading
Chesney-Lind, Meda, and John Hagedorn, eds. 1999. Female Gangs in America. Chicago: Lake View Press.
Cummings, Scott, and Daniel Monti. 1993. Gangs. Albany: State University of New York Press.
Goldstein, Arnold. 1991. Delinquent Gangs: A Psychological Perspective. Champaign, IL: Research Press.
Huff, C. Ronald, ed. 1990. Gangs in America. 1st ed. Newbury Park, CA: Sage.
Klein, Malcolm. 1995. The American Street Gang. New York: Oxford University Press.
Klein, Malcolm, Cheryl Maxon, and Jody Miller, eds. 1995. The Modern Gang Reader. Los Angeles: Roxbury.
Moore, Joan. 1991. Going Down to the Barrio: Homeboys and Homegirls in Change. Philadelphia: Temple University Press.
Schneider, Eric. 1999. Vampires, Dragons, and Egyptian Kings: Youth Gangs in Postwar New York. Princeton: Princeton University Press.
Shelden, Randall, Sharon Tracy, and William Brown. 1997. Youth Gangs in American Society. New York: Wadsworth.
Thrasher, Frederic. 1963. The Gang: A Study of 1,313 Gangs in Chicago. Rev. ed. Chicago: University of Chicago Press.
Gays
See Same-Sex Relationships
Gold Rush
“There comes a time in every rightly constructed boy’s life when he has a raging desire to go somewhere and dig for hidden treasure,” Mark Twain wrote in The Adventures of Tom Sawyer (1876). In the American imagination at least, this time was 1849; this somewhere was California. Indeed, there exists a widespread and seemingly natural link between American boyhood and the California Gold Rush. The forty-niners, according to most contemporary and historical accounts of the period running from 1848 into the early 1850s, were for the most part young men and boys from the American Northeast and Midwest. Such a characterization, although it contains a kernel of truth and much cultural meaning, is more an image than a reality. Certainly, there were young boys in California during these years. Yet most of the region’s vaunted boyishness came from men over the age of eighteen. The importance of the Gold Rush for American boyhood lies not in the actual numbers of boys who went to California during its “flush times.”
Young Forty-Niner with rifle and shovel. Quarter plate daguerreotype. Collection of Stephen Anaya (Oakland Museum of Art, 1998)
Rather, it appears to rest in the event’s expansion of the concept of boyhood, its legitimization of the idea that there exists a “boy”—restless, playful, and irresponsible—within the heart, mind, and spirit of every Anglo-American man. In actual fact, boys were rare in Gold Rush–era California, if the term refers to Anglo-American males under the age of eighteen. The mass immigration of Anglo-American forty-niners to California—some 80,000 individuals in 1849, about the same numbers in the following several years—was overwhelmingly male and predominantly young. Yet it
included few children. Records of the overland migration kept at Fort Laramie, Wyoming, indicated the passing of 39,506 men, 2,421 women, and only 609 children by the end of the summer of 1850. In 1850, only 4.4 percent of California’s population was under ten years old, compared with 29 percent for the United States as a whole. Numbers for older children were higher in the same year but still lagged far behind the rest of the country. Children between the ages of ten and nineteen constituted 8.1 percent of the population, compared to 23.4 percent in the rest of the nation. By the mid-1850s the numbers of children in California were growing, as was their percentage of the population. Still, this process was slow. Records at the port of San Francisco for the year 1854 indicate the arrival of 47,811 immigrants by sea, of whom 38,490 were men, 7,131 were women, and 2,190 were children (Baur 1978, 142–143, 205). Of course, some Anglo-American boys and male adolescents under the age of eighteen did take part in the rush, and many non-Anglo children were on the scene from the beginning. For the former, their low numbers appear to have made them something of a novelty. As one boy who was eleven years old when he arrived in 1852 recalled: “San Francisco . . . was a paradise for boys. Men . . . petted them at every opportunity, and gave money to the youngsters with a free hand” (“Newsboys” 1882). Through 1849 and beyond, stories abounded of opportunities for young boys in the cities of San Francisco and Sacramento. Some made as much as $60 per day shining boots and sharpening knives. Others found lucrative employment peddling fruit or newspapers, carrying letters, and saving places in line at always busy post offices.
As for non-Anglo boys, their numbers were much higher in California; yet for them the Gold Rush era was no paradise. Native American boys among California’s many Indian peoples experienced the rush as a disaster, and probably no Indian family was untouched by death from disease, dislocation, and violence. Although death was probably the most common experience of the Gold Rush for Indian boys, others survived, many by entering the households of white emigrants. Here they became domestics and adopted sons because a large number of miners seem to have entered into marriages of convenience with Native American women. Despite the romantic tales of opportunity for Yankee boys, several factors conspired to keep children and young boys in the United States from joining the rush to California. One was distance. In 1849, California could only be reached by an overland journey of some 2,000 miles from St. Louis to Sacramento, by steamship and sail via the Isthmus of Panama, or by a six-month-long sail of more than 13,000 miles around Cape Horn. Other factors included the costs and dangers of such an arduous journey. Best estimates indicate that the typical forty-niner had to invest more than a year’s pay to cover the ocean passage to California between 1849 and 1852. The length of the voyage practically barred young boys from the Cape Horn route, and those who took steamer passage still faced seasickness, scurvy, and cholera along the isthmus. The overland route was less costly in these early years, and more young children traveled that way. Yet here too, parents lived in constant dread because overactive boys frequently fell from wagons, some to be crushed beneath iron-rimmed wheels.
A final barrier against boys wishing to join the rush was the questionable morality of gold seeking. Despite the tendency for later writers like Twain to draw seemingly natural connections between boyhood and treasure seeking, a great many antebellum-era Americans failed to see California as a healthy region for the development of the rightly constructed boy. To be sure, “gold fever” hit American boys hard during the summer of 1849 and fall of 1850. As one alarmed observer recalled, boys who became afflicted with the fever grew “restless at school, inattentive to customers in the store, and incompetent at the workshop or on the farm.” These boys experienced the Gold Rush but nearly always at a distance, through popular songs, published stories, and letters from California. In 1850, gold was still the “root of all evil,” and schoolboys learned the difference between “wealth” (which was earned through hard work) and “filthy lucre” (which, like the gold of an imagined California, was wealth quickly acquired and divorced from the time, energy, and work necessary for its moral production). Letters and published writings from the time indicate that even as they dreamed of California, antebellum boys could spout a veritable catechism of fables and aphorisms against its allurements, from “All that glitters is not gold” to references to the story of King Midas. Yet if the realities of distance, danger, cost, and morality kept the numbers of actual boys in California to a minimum, the imagery of the Gold Rush frequently centered on the forty-niner as an example of “natural” boyish exuberance. The idea that all forty-niners were boys, or even “b’hoys”—the term for the muscular and unaffected urban or western young men who, like Mike Fink and Davy Crockett,
appeared regularly in early dime novels, adventure stories, and almanacs—reflected the literary and political conventions of the time. For antebellum readers, Manifest Destiny was itself a youth movement, a natural result of the growth of an adolescent republic. The United States, in turn, was “Young America,” a term that seemed to resonate with the pace of the nation’s first wave of industrialization, along with its rapid growth in transportation, trade, settlement, and urbanization. In effect, for gold seekers and the observing public alike, the term boy as it was used to describe the forty-niners was more metaphorical than descriptive. Josiah Royce described 1849 as “the boyish year of California” in one of the early histories of the event, a phrase by which he hoped to capture the spirit of optimism and irresponsibility of the rush. Failure and “lost illusions” marked its later years, as expectations always overran realities, and few miners struck it rich. At the same time, throughout their disappointments, according to Hubert Howe Bancroft (1888), emigrants to the region and state would “do what they could to make themselves boys again.” In this context, practically every forty-niner was a “boy,” regardless of age, status, or position in life. As Jared Nash of Maine wrote during his voyage to the Pacific, one night following supper he and his fellow gold seekers “went on deck and there enjoyed our selves in dancing through the evening, and a merry set of boys we were” (Nash 1849). When he wrote this letter, Nash was in his midtwenties, was married, and had two children. Reference to boyhood, in other words, captured the spirit of the time and the event. For generations of Americans in turn, this spirit has imbued the rush with
a great deal of charm. Yet consciously or unconsciously, the largely unquestioned self-image of the boyish argonaut has masked a host of social realities that are far more serious. To begin with, a closer look at actual gold seekers reveals that as many as one-third were married, most were in their midtwenties and early thirties, and nearly all had strong family connections and economic responsibilities in the Midwest or Northeast. These men abandoned parents, wives, and children, leaving family members to deal with debts incurred in raising money for the voyage and to survive as best they could during one of the nation’s three most serious cholera epidemics. Men might not be able to do this sort of thing. Boys could. And so forty-niners became “boys.” The Gold Rush, in turn, became an exposition of youthful male high jinks. The record of these high jinks is vast and constitutes much of the record of the rush in the public imagination. In California and along the way, men in their twenties and thirties, married men with families and children of their own, became boys: they had orange fights between California-bound ships off the coast of Brazil; they engaged in all-male dances and organized bullfights and bear fights; they joined vigilance committees and went in for lynch law; and they threw rocks through the windows of Chinese businesses and shot newly purchased pistols and rifles at Indians just to see them jump. They innocently gave in to the region’s temptations, to the allurements of gambling halls and prostitutes and to the daily violence against non-Anglo peoples in California. With its exposition of these behaviors for eastern observers, along with its gathering of them under the rubric of boyishness, the California Gold Rush was a
watershed for the concept of boyhood as it emerged in the second half of the nineteenth century. The model of boyhood that emerged during the event and era reflected a blend of traits: it fused early modern or Puritan ideals of the innate depravity of children with Victorian concepts of childish innocence; it rationalized irresponsibility and environmental destruction with a heavy layer of youthful exuberance. Even years later, as one “graybeard” from the Gold Rush recounted for John Muir, the forty-niner was still a boy, “always a boy, and [damn] a man who was not a boy” (Kowaleski 1997, 390). This cultural proposition, the idea that “men will be boys,” was still somewhat foreign to many Americans in 1849. During the Gold Rush and in the decades that followed, it would be repeated so often that by the time of Mark Twain’s writing it seemed a fact of nature.

Brian Roberts

References and further reading
Bancroft, Hubert Howe. 1888. California Inter Pocula. San Francisco: History Company.
Baur, John E. 1978. Growing Up with California: A History of California’s Children. Los Angeles: William Kramer.
Haven, Alice Bradley [Cousin Alice]. 1853. “All’s Not Gold That Glitters”; or, the Young Californian. New York: Appleton and Company.
Holliday, J. S. 1999. Rush for Riches: Gold Fever and the Making of California. Berkeley: University of California Press.
Hurtado, Alfred. 1988. Indian Survival on the California Frontier. New Haven: Yale University Press.
Johnson, Susan Lee. 2000. Roaring Camp: The Social World of the California Gold Rush. New York: W. W. Norton.
Kowaleski, Michael, ed. 1997. Gold Rush: A Literary Exploration. Berkeley: Heyday Books and California Council for the Humanities.
Levy, Jo Ann. 1992. They Saw the Elephant: Women in the California Gold Rush. Norman: University of Oklahoma Press.
“Newsboys of Old: How They Flourished in California Thirty Years Ago.” 1882. San Francisco Call, January 29.
Roberts, Brian. 2000. American Alchemy: The California Gold Rush and Middle Class Culture. Chapel Hill: University of North Carolina Press.
Rohrbough, Malcolm. 1997. Days of Gold: The California Gold Rush and the American Nation. Berkeley: University of California Press.
Rourke, Constance. 1928. Troupers of the Gold Coast; or, the Rise of Lotta Crabtree. New York: Harcourt Brace.
Royce, Josiah. 1886. California: From the Conquest in 1846 to the Second Vigilance Committee in San Francisco. Boston: Houghton Mifflin.
Twain, Mark (Samuel Clemens). 1980. Roughing It. Reprint, New York: New American Library.
Graffiti
Graffiti is the plural form of graffito. It is derived from the Italian word graffiare, meaning “to scratch,” and probably first appeared as a reference to the anonymous writings and drawings (“little scratches”) on the stone walls of Roman ruins. The term is currently used to refer to most kinds of spontaneous, informal, unauthorized, or illegal writings and drawings in shared public spaces. Since at least the nineteenth century, graffiti have been interpreted by scholars and collectors in Europe and the United States as important indicators of moral, political, and social trends; graffiti are frequently read as signs of the times. Throughout the twentieth century, public walls have been pressed into service as a mass medium to broadcast all sorts of graffiti, including political demands, slogans, racial slurs, erotic musings, folk wisdom, humor, and proclamations of love.
Men talk in front of a storefront covered with graffiti in Los Angeles, 1992. (Peter Turnley/Corbis)
During the late 1960s, a new kind of artistic name writing, often called “graffiti,” appeared in the United States on the walls and subway systems of Philadelphia and New York City. Despite the controversy surrounding it, this artistic name writing (simply called “writing” by the young people, predominantly boys, who developed it) is unique in human history and categorically different from its predecessors. This new type of graffiti writing became the earliest foundation of the now long-lived hip-hop movement, which also includes rapping, mixing, break dancing, and fashion. Images and markings made by humans on communal walls have been found dating from as far back as the Paleolithic period of human history. These markings, which later included pictographs and words, are among the artifacts found in
almost all ancient monumental structures. However, this sort of writing was probably produced in a more or less formal way by religious figures, political leaders, and others who had access to literacy and writing. As literacy became more widespread, informal and unauthorized writings by the more common people—graffiti in its contemporary meaning—became a possibility. Graffiti of this sort have been discovered in central Turkey dating from before 1000 B.C.E. and were common within the cities of classical Greece and the Roman Empire, including the walls of public latrines. Graffiti have been found in locations across the globe, including medieval European monasteries and churches, the Great Wall of China, and Mayan ruins.
The Italian Antonio Bosio, who collected samples from the Roman catacombs, wrote one of the earliest books on graffiti in 1593 C.E. Influenced by romanticism, collectors, scholars, and journalists of the eighteenth and nineteenth centuries took a rising interest in primitive art and graffiti, motivated by the belief that all creative works reflect the culture and society that produce them. They expanded their focus to include more contemporary graffiti, primarily in public latrines, taverns, and inns. Men and boys probably produced most of these graffiti. During the early nineteenth century, scholars and commentators began to take notice of contemporary urban children’s graffiti. Children were believed to be less corrupted by the adult world, thus possessing a natural innocence and unselfconscious creativity that made their art unique. By the mid-nineteenth century, the study of graffiti was accepted as a legitimate pursuit, and graffiti had also gained a popular audience for its humorous qualities. The interest in the graffiti of boys and young men did not diminish in the twentieth century. In fact, public interest seems to have grown along with the amount of graffiti in public spaces. This increase was caused by at least three historical developments since the mid-nineteenth century. First, the rapid and widespread urbanization that began in Europe and the United States after the mid-nineteenth century concentrated large and diverse populations into smaller areas. Second, mass production made the means of writing (such as crayons, chalk, pens and pencils, and later, ink markers and spray paint) more easily accessible to boys. Third, the slow change in the social role of boys from workers and contributors to a family income to students attending
school created the basis for greater literacy as well as a shared culture that valued and gave new significance to the graffiti they wrote. As young people were gathered into schools and reformatories, graffiti likewise became an institutionalized ritual. The graffiti written by hoboes on freight trains during the Great Depression, many of whom were teenage boys, as well as the chalk drawings of younger children on city streets, were photographed and collected during the 1930s and 1940s. Although the content of graffiti has been understood to reflect larger social trends, the motives of young people who write graffiti have been explained primarily in psychological and developmental terms. The majority of graffiti written by youth in cities is attributed to boys. Girls do create graffiti, but studies of “latrinalia” (graffiti in public toilets) have shown they do so in much smaller quantities. Public and scholarly interest in girls’ graffiti as well as the number of female graffiti artists has increased since the 1970s. The gender differences are typically explained by differences in the ways that boys and girls are socialized for public behavior. Several schemas for categorizing the common forms of graffiti have been developed since the middle of the twentieth century. Robert Reisner (1971) has created one of the more elaborate systems, with five major categories and variations under each one. Reisner’s graffiti categories include agnomina (names and initials); amorous; obscene, taboo, and erotic; intellectual (rhymes, sayings, folk wisdom, humor, and statements of a general sort); and protest. The vast majority of graffiti fall into the first four categories, with the largest number being names and initials. A sixth category, which covers territorial markers made by
gangs and other neighborhood-based groups, was introduced by David Ley and Roman Cybriwsky (1974) and is useful for understanding graffiti since 1950. During the 1960s, anticolonial, civil rights, and student movements around the world used the public walls to broadcast political criticism and demands for social change. Although older forms of latrinalia and public commentary continued, the most noteworthy graffiti of this period was an attempt by activists and the younger generation to communicate with authorities and the general public from outside the boundaries of the established media institutions, like television and journalism. The mass movements of this era highlighted the significant populations whose voices were being left out of the mainstream discussions about social change. Demands for better housing and employment, as well as criticisms of continuing poverty, discrimination, exclusions in the law, governmental deceit, and police brutality were painted on public walls, often accompanied by calls to violence or revolution. These graffiti were meant to sound the alarm for the large populations, including students and younger boys and girls, who were excluded from political power and dissatisfied with the world as it was. Territorial graffiti of gangs, the political graffiti of the 1960s and 1970s, and the proliferation of colorful, commercial signage in public spaces formed the social basis for the newest emergence of public wall writings, which has sometimes been called “graffiti art.” This aesthetically elaborate and usually illegal public script is called “writing” by its originators. Writing first appeared in Philadelphia in the mid-1960s, but its most noted development occurred on the New York City subway system beginning
in the early 1970s. Although the “message” of this public writing is a continuation of a much older tradition of name writing, the aesthetic elements and the widespread cultural organization of boys as well as girls producing the writing are radical new developments in the history of graffiti. These departures from the graffiti of the past and the negative associations that have been attached to graffiti traditionally have convinced many writers to reject this categorization altogether and to call their work art rather than graffiti. The early practitioners of writing in New York City were primarily workingclass African American, Puerto Rican, and immigrant youths between the ages of twelve and seventeen. As with past graffiti, boys were much more likely to participate in writing than girls were, although girls and young women have always been present in some proportion. As the practice of writing evolved over the next decade, the social makeup of writing culture diversified to include boys of European heritage as well as middle- and upperclass boys and young men. Although cartoon characters and urban landscapes become important later, writing has primarily been concerned with the name. Writing began as relatively small signatures (no more than 2 feet long) written with ink markers or spray paint on neighborhood walls but quickly moved to the surfaces of buses, trucks, and the subway system, including the station walls and the trains themselves. Writers rarely used their birth names. As their numbers increased and competition for space and attention grew, writers carefully created new names to add distinction and to maximize their visual and aural impact. Over time, new letter designs were invented that reshaped and fragmented the
alphabet in a swirl of colors, so that the resulting name might be unreadable to the unschooled eye, recognizable only as a logo or symbol. Along with the innovation in letter design came an increase in the scale of the name. By the mid-1970s, writers in New York City were painting “masterpieces” that covered the entire side of a subway car, works that were 8 to 10 feet high and up to 75 feet long. These “whole car” works often incorporated cartoon characters; images of popular performers (John Lennon, Alice Cooper); and representational backgrounds from the urban landscape, like the skyline, the Brooklyn Bridge, or the Statue of Liberty. They also frequently incorporated political criticisms, popular slogans, holiday greetings, celebrations of special occasions (birthdays, the nation’s Bicentennial), and other messages. These messages and other easily recognizable images indicated a desire by writers to communicate with the wider urban public. Beginning in the early 1980s, writing was associated with the emerging hip-hop movement, and writers found work as set designers for rap music videos and break-dance performances and as illustrators for album covers. Writers developed a highly organized culture that provided an appreciative audience for their works, supported creative developments of the form, and passed writing skills from one generation to the next. The illegal nature of most writing means that it has not been welcomed in most cities. Authorities in New York City undertook an expensive “war on graffiti” on the subways that lasted almost two decades. As writing moved away from its original locations on the East Coast and new writing cultures were formed in other U.S. and world cities, similar antigraffiti campaigns have been
waged. These confrontational campaigns have often had the effect of reinforcing those writers engaged in “bombing,” writing their names in a simpler style in as many places as possible, over those writers engaged in creating masterpieces, which require more time to paint in their elaborate, mural-like designs. The public example of writing inspired a number of artists whose works have been recognized in major museums, exhibitions, and collections around the world. These include artists such as Jenny Holzer, Barbara Kruger, Keith Haring, Jean-Michel Basquiat, Jonathan Boyarsky, John Fekner, and Richard Hambleton. A number of the writers themselves have also painted on canvas and exhibited in galleries, first in the mid-1970s and again during the New York art boom of the early 1980s. Although a few have been able to establish themselves within the world of fine art and now have work in major public and private collections, their numbers are tiny compared to the total number of boys and girls who write. In the 1980s, writing moved out from the major East Coast cities of the United States and was taken up by young people all over the globe. By the late 1980s, when the city of New York successfully rendered its subways “graffiti-free,” writing culture had established new centers of activity in several European countries as well as Australia, with cities in Central and South America, Scandinavia, South Africa, and Japan following soon after. As cities successfully remove writing from their subway systems, most writers have moved to work on freight and passenger trains, returning to the site of youth writing that began among the teenage hoboes of the 1930s.

Joe Austin
References and further reading
Abel, Ernest. 1977. The Handwriting on the Wall: Toward a Sociology and Psychology of Graffiti. Westport, CT: Greenwood Press.
Austin, Joe. 2001. Taking the Train: Youth, Urban Crisis, Graffiti. New York: Columbia University Press.
IGTimes, in association with Stampa Alternativa. 1996. Style: Writing from the Underground. Terni, Italy: Umbriagraf.
Ley, David, and Roman Cybriwsky. 1974. “Urban Graffiti as Territorial Markers.” Annals of the Association of American Geographers 64: 491–505.
Powers, Stephen. 1999. The Art of Getting Over: Graffiti at the Millennium. New York: St. Martin’s Press.
Reisner, Robert. 1971. Graffiti: Two Thousand Years of Wall Writing. New York: Cowles Book Company.
Sheon, Aaron. 1976. “The Discovery of Graffiti.” Art Journal 36, no. 1: 16–22.
Stewart, Jack. 1989. “Subway Graffiti: An Aesthetic Study of Graffiti on the Subway System of New York City, 1970–1978.” Ph.D. diss., New York University.
Grandparents
Whether a boy knew his grandparents at all, let alone enjoyed an amicable relationship with them, depended strongly on when he was born, the part of the country in which he lived, and the social class to which he belonged. In the seventeenth century in New England, where Puritans migrated in family groups and lived in towns surrounded by friends and relatives, boys could grow up knowing one or both of their grandparents. Because the sex ratio was comparatively even in New England, it was not difficult for young women and men to find mates, marry, and form families. Mothers could bear as many as eight to eleven children at about two-year intervals over their fertile years, and older children could be leaving home as younger ones were born.
Parents thus could become grandparents at the same time that they were still producing children. Those men and women in inland communities who lived into their early seventies could survive long enough to participate in their grandchildren’s lives. In contrast, in the seventeenth-century Chesapeake region of the South, most of the early immigrants were young men and women who came as indentured servants and would serve four to seven years working in the tobacco fields before they could form a family. Since there were fewer young women than men in the Virginia and Maryland colonies, many men had to wait to find a wife or never married at all. For those young men and women who did marry and form families, the length of the marriage could be as short as ten years because of the high death rate. Malaria, which was endemic in the South, attacked the population randomly but was particularly deadly to pregnant women. Boys who survived infancy and childhood often lost a parent before they grew to adulthood, and few boys had a living grandparent. By the eighteenth and nineteenth centuries, the death rate had diminished in the South but was still higher there than in the North. More parents lived to see their children reach adulthood than previously, but many mothers or fathers still died when their children were young. Half- or fully orphaned boys and girls frequently moved in with relatives, sometimes with a grandparent. Mothers in southern planter families were keenly aware of the importance of older relatives to their children, both as possible surrogate parents, should the need arise, and as mentors to youngsters. Jane Hamilton illustrated such concerns when she wrote about her four-year-old son: “I try to make him behave
himself at the table and be polite to every body, but particularly his Grandfather” (Censer 1984, 48). Hamilton’s remarks also reveal the important role that relatives played in the socialization of children on the eastern seaboard. Sons and grandsons were expected to be obedient and respectful to older relatives. But because the southern planter played the part of patriarch, he could be a particularly stern and distant figure to his grandchildren. Boys in the North in the eighteenth and nineteenth centuries were more likely to enjoy the advantage of knowing their grandparents. Few children actually lived in the same household as their grandparents, but many lived close to older relatives. A well-to-do figure like Benjamin Franklin, who was estranged from his son William, could still play an influential role in his grandson’s life. If parents were impoverished, however, and unable to support needy grandparents, grandmothers and grandfathers might have to move into local almshouses and be supported at public expense. In such cases, boys and girls would not be able to have much contact with their grandparents. After 1800, the high colonial birthrate began to decline in the North as parents relied on various methods to limit family size. Although a mother might bear four to five children at two-year intervals early in the marriage, after the age of thirty-five, she might begin to limit births. Children could be raised more intensively in these slightly smaller families, and grandmothers played important roles in watching over health and teaching values and manners. As the role of patriarch softened in the North, a surviving grandfather could become a more empathic figure. Older people were highly valued in the early-nineteenth-century
United States for their life experience and the wisdom gleaned from their closeness to death. Grandparents could play an important role in the socialization of boys. Although more multigenerational families lived together between 1850 and 1885 than at any other time in U.S. history, in only 20 percent of all households did children live with grandparents (Coontz 1992, 12). Americans expected to live in nuclear families of parents and children and moved in with grandparents or took grandparents into their own homes only when it was economically necessary. In such families, grandmothers might watch the youngest children while mothers worked inside or outside the home. Grandfathers might teach grandsons skills they needed to work on farms or at various crafts. Among families that moved west in the nineteenth century, very few migrated with grandparents. Hence, boys and girls who moved west with their parents to the farming or mining frontiers often lost contact with their grandfathers and grandmothers and perhaps never saw them again. In the twentieth century, most American families were nuclear ones; extended families that included grandparents were rare. However, during the Great Depression when fathers lost jobs, they, their wives, and their children often had to move in with grandparents for economic reasons. During World War II, when fathers went off to fight, wives and children again frequently moved in with grandparents. According to family counselors of the period, multigenerational households experienced increased intergenerational conflict, which often centered on child training. A boy growing up in the Depression or World War II might have a greater chance than boys who lived before or after him to live with and know his grandparents, but he might also experience more familial tension between his parents and grandparents.
A grandfather helps his grandson learn to ride a bike. (Photodisc)
As life expectancy increased in the 1900s and fertility and years with dependent children in the household decreased, more and more adults became grandparents and enjoyed unique intergenerational relations with their grandsons and granddaughters. Thanks to gains in life expectancy, an adult who became a parent at age twenty-five might become a grandparent at age fifty and might reasonably expect to become a great-grandparent around age seventy-five. Contrary to popular myths, middle age, not old age, was the time most persons became grandparents in the
twentieth century. Most first-time grandparents were in their midforties or -fifties and still in the workforce. Half of adults aged forty-five to fifty-nine and 89 percent of those aged sixty and older were grandparents (Hogan, Eggebeen, and Snaith 1996, 39–40). During the twentieth century, increased longevity and differential fertility patterns increased the likelihood that adults no longer had dependent children of their own in the household when they became grandparents and that they lived to enjoy their grandchildren for more years. Modern grandparents are healthier, more financially stable, and more physically active than grandparents were a few decades ago. Grandparents today typically play an active part in the lives of their grandsons and granddaughters. They are financially better off and better able to assist grandchildren than were their counterparts in previous generations, but they also have a greater number of competing interests outside the family network, at work and in the community. A man’s willingness to accept the role of grandparent depends on how well he knew his grandparents as a boy. Those who spent a lot of time with their grandparents generally assume an active grandparenting role themselves and provide a critical factor in family continuity. Women are most likely to embrace the grandparent role and be involved with grandchildren. Men, who are more likely than women to be involved in a career at the time they become grandfathers, are less likely to be active grandparents. Grandparents report easier relationships with granddaughters than with grandsons, especially when grandsons become adolescents. Grandparents are a stabilizing factor in families. The presence of grandchildren, on average, increases contact between grandparents and their children who are
parents, which supports the claim that grandparents are the family “kin keepers” and link resources between generations to those in need, especially in African American families. Regardless of nationality or ethnicity, grandmothers initiate and maintain more contact across generations than do grandfathers. In the United States, where adult children form nuclear families and the grandparent role is not rigidly scripted, most grandparents have minimal control over their grandchildren and enjoy relationships with the youngsters that are friendly, informal, companionable, and indulgent. In the 1950s, researchers identified five major styles of grandparenthood: the formal, the fun seeker, the distant figure, the surrogate parent, and the reservoir of family wisdom. Thirty-three percent of grandparents in the 1950s were formal, and only 4 percent were reservoirs of family wisdom, an authoritarian pattern in which the grandfather dispenses wisdom, special skills, and resources (Neugarten and Weinstein 1964, 199–204). By the 1990s, grandparents were more likely to be fun seekers involved in their children’s and grandchildren’s lives. Although this pattern of closeness may create tighter bonds, it may also increase conflict if the generations disagree on childrearing practices. Regardless of grandparenting style, grandparents are a stabilizing force in the family and a source of family continuity. In the late twentieth century, grandparents suggested that their roles had changed. Fewer grandparents reported relationships with grandchildren as relaxed, and many said they were less apt to take their grandchildren on special outings, act as their confidantes, or bring grandchildren presents the parents would not purchase. Nevertheless, communication between
the grandparent and grandchild generations was high. Two-thirds of grandparents talked with grandchildren on the telephone at least once a month, half of grandparents had grandchildren over for a meal in a month, and about 40 percent of grandchildren spent a night at their grandparents’ house each month. Only 10 percent of grandparents had no monthly contact with grandchildren (Coontz 1992, 14; Hodgson 1992, 204–207). Modern grandparents are indulgent, and manufacturers target grandparents, who spend an average of $650 on their grandchildren per year (Morgan and Levy 1993, 3–4). Grandparents are twice as likely to purchase clothing as toys for their grandchildren, which may indicate that those grandparents assist their adult children’s households. Vacations designed especially for grandparents and their grandchildren are blossoming, and museums offer free admission to grandparents with grandchildren. For many of the current elderly in the modern United States, grandparenthood has become a status symbol. A trend in modern grandparenting is the increasing number of grandparents providing child care for or raising grandchildren. Nearly 40 percent of grandparents in the 1990s provided child care to their grandchildren on a regular basis (Smith 2000, 3). African American families were more likely to use a grandparent for child care than white families. Much of this increase in grandparents providing child care was due to the rise in female employment and female-headed households. Grandmothers who provided child care were likely to have more education, live with their spouses, be more active in their communities, and be younger than those who did not provide child care (Casper and Bryson 1998, 198).
Another modern development is the increasing number of children under age eighteen who live in their grandparents’ homes (Smith 2000, 1). According to the U.S. Census Bureau’s 2000 report, the number rose from 2.2 million children in 1970 to 3.9 million in 1997. Altogether, 8 percent of U.S. households with children have grandparents living within them as well. Moreover, in three out of four of these households that include grandparents, parents, and grandchildren living together, the grandparent is the head. Grandparents who head households are younger (average age fifty-four) than grandparents who do not head households (average age fifty-eight). The percentage of grandparents raising grandchildren with no parent present is growing. In 1995, grandparents had sole custody of nearly 6 percent of African American grandchildren and 2 percent of white grandchildren. More than half of these grandchildren had a mother only, nearly 30 percent had neither parent living, 5 percent had a father only, and 17 percent had both parents living (Casper and Bryson 1998, 198). On average, grandparents raising grandchildren without a parent in the home are less educated, poorer, and more likely to be unemployed than are grandparents who are not responsible for raising grandchildren. Grandchildren raised in grandparents’ households have more behavioral problems and lower academic achievement than children reared in two-parent households, but a positive finding is that these grandchildren are doing better than those raised in single-parent households. Divorce can cause stress in the grandparent-grandchild relationship. For many paternal grandparents, visits with grandchildren decrease after their adult son divorces and custodial rights are given to
the mother. Some grandparents have gone to court to secure visitation rights to see grandchildren, and others have sought other options, such as combining visits when the grandchildren are with their fathers. Regardless, after divorce, grandsons and granddaughters have fewer interactions with their grandparents unless the grandparents make extreme efforts to visit their grandchildren. Little specific information on grandparent relationships with grandsons exists, but longer life expectancies, changes in gender roles, and more grandparent-grandchild interaction in recent years are certain to have an impact on current and future grandparent relationships. More grandparents are caring for or raising grandchildren. Modern grandchildren and grandparents are fortunate that they will have many more active years to nurture their relationships than did grandparents in the past.

Janice I. Farkas

References and further reading
Burton, Linda M., Peggy Dilworth-Anderson, and Cynthia Merriwether–de Vries. 1995. “Context and Surrogate Parenting among Contemporary Grandparents.” Marriage and Family Review 20: 349–366.
Carr, Lois Green, and Lorena S. Walsh. 1979. “The Planter’s Wife: The Experience of White Women in Seventeenth-Century Maryland.” In A Heritage of Her Own: Toward a New Social History of American Women. Edited by Nancy F. Cott and Elizabeth H. Pleck. New York: Simon and Schuster.
Casper, Lynne, and Kenneth Bryson. 1998. Co-resident Grandparents and Their Grandchildren: Grandparent Maintained Families. Population Division Working Paper no. 26. Washington, DC: Population Division, U.S. Bureau of the Census.
Censer, Jane Turner. 1984. North Carolina Planters and Their Children, 1800–1860. Baton Rouge: Louisiana State University Press.
Coontz, Stephanie. 1992. The Way We Never Were: American Families and the Nostalgia Trap. New York: Basic Books.
Hareven, Tamara K. 1982. Family Time and Industrial Time: The Relationship between the Family and Work in a New England Industrial Community. Cambridge and New York: Cambridge University Press.
Hodgson, Lynne. 1992. “Adult Grandchildren and Their Grandparents: The Enduring Bond.” International Journal of Aging and Human Development 34: 209–225.
Hogan, Dennis, David Eggebeen, and Sean Snaith. 1996. “The Well-Being of Aging Americans with Very Old Parents.” Pp. 327–346 in Aging and Generational Relations over the Life Course. Edited by Tamara Hareven. New York: Aldine de Gruyter.
Levy, Barry. 1988. Quakers and the American Family: British Settlement in the Delaware Valley. New York: Oxford University Press.
Morgan, Carol M., and Doran J. Levy. 1993. “Gifts to Grandchildren.” American Demographics 9: 3–4.
Neugarten, Bernice L., and Karol K. Weinstein. 1964. “The Changing American Grandparents.” Journal of Gerontology 26: 199–204.
Nye, F. Ivan, and Felix M. Berardo. 1973. The Family: Its Structure and Interaction. New York: Macmillan.
Rossi, Alice S., and Peter H. Rossi. 1990. Of Human Bonding: Parent-Child Relations across the Life Course. New York: Aldine de Gruyter.
Smith, Kristin. 2000. Who’s Minding the Kids? Child Care Arrangements. Washington, DC: U.S. Department of Commerce, Economics and Statistics Administration, U.S. Census Bureau.
Thomas, J. L. 1990. “The Grandparent Role: A Double Bind.” International Journal of Aging and Human Development 31: 169–177.
Wilks, Corinne, and Catherine Melville. 1990. “Grandparents in Custody and Access Disputes.” Journal of Divorce and Remarriage 13: 36–42.
Great Depression
Between October 1929 when it began and December 1941 when it ended, the Great Depression had a profound impact on the health, schooling, work, and leisure activities of American boys. Although boys from all racial, ethnic, socioeconomic, and geographical backgrounds were affected, African American boys and youths from large rural families were especially hard hit. These boys frequently took on adult responsibilities at an early age, often leaving school for low-paying jobs to help the household economy. With more than one-quarter of the workforce unemployed, however, boys were not always able to find work, and thousands of teenagers took to the road in search of better opportunities. Some of them became migrant workers, traveling with their families, and others became part of the large 1930s transient population who “rode the freights.” Although the Hoover administration did little to address youth problems, under President Franklin D. Roosevelt’s New Deal, organizations such as the Civilian Conservation Corps (CCC) and the National Youth Administration (NYA) helped provide deprived teenagers with jobs, skills, and the financial resources to continue their education. One of the most remarkable ways in which the Depression affected children was by dramatically lowering the standard of nutrition and health care they received. During the 1930s, widespread unemployment and reduced wages placed one-third of the nation below the poverty line and another third at a basic level of subsistence. Because there was no federal and very little local relief, the first three years of the Depression were particularly rough on children’s health.
Boys such as this one picking vegetables worked to support their families during the Great Depression. (Library of Congress)
By 1933 infant mortality rates had risen to 69 per 1,000 live births for all babies, and to 94.4 for African American infants. Although very few children actually starved to death during this period, some poor rural youths did die from malnutrition-induced diseases. In Harlan County, Kentucky, for example, 231 children died from malnutrition-related illness between 1929 and 1931. Food handouts, clothing donations, and other forms of federal relief such as Aid to Dependent Children (ADC; later Aid to Families with Dependent Children, or AFDC) brought important direct assistance to many youths. However, the insufficient health care and nutrition boys received during the early and middle years of the Depression did have lasting results. More than 40 percent of the first million men examined for World War II military
service were rejected because of health problems stemming from the boys’ diets and health care during the Depression. With increased family hardships came additional responsibilities for middle-class and working-class children. Frequently, there was a gender division in the tasks children performed; boys were more likely to secure paid part-time work, whereas girls often took on additional domestic responsibilities such as cooking, caring for younger siblings, and making clothes. In his famous study of Oakland, California, youths during the 1930s, Glen Elder (1974) found that more than 50 percent of adolescent boys worked part-time jobs to help make ends meet at home. Although children’s labor was never well paid during the Depression, part-time jobs were more
plentiful in urban areas. City boys could work as newspaper carriers, shoeshine boys, janitorial assistants, store clerks, messengers, or delivery agents. Paid positions for children in rural areas were rarer, but boys still found ways to earn money. Six-year-old Aaron Barkham and his older brothers, for instance, sold moonshine to miners in West Virginia. Other boys in small towns collected empty medicine jars and soda bottles, which they sold to pharmacists for 2 cents apiece. Despite the large numbers of boys working part-time jobs, school attendance for American children on the whole actually increased during the 1930s. This was due in part to the efforts of the National Recovery Administration (NRA), which passed codes restricting child labor (they were later declared unconstitutional), and the NYA, which offered work-study programs for impoverished students. Mostly, boys stayed in school longer because full-time employment was simply unavailable. Nevertheless, completing high school remained a luxury for many African American boys, especially in the South. There poverty rates among black families were especially high, and African American boys sought work, even at the lowest-paying jobs, rather than remain in school unemployed and hungry. Although the Great Depression caused boys to mature faster than their more affluent 1920s peers, school-age boys still found time to play games and enjoy some forms of popular entertainment. Curtailed family resources often meant that boys had to be more inventive in the games they played. Recalling his youth in Okemah, Oklahoma, Robert Rutland writes that boys played cops and robbers and cowboys and Indians with “homemade wooden guns that shot
‘bullets’ made from cut-up inner tubes” (1995, 85). Boys could also obtain cheap toys by cutting out cardboard soldiers printed on the backs of cereal boxes or by purchasing army figures at the local five-and-dime store. Most other toys and popular pastimes were inexpensive as well, and boys all over the country idled away hours climbing trees, roller skating, or playing games of touch football, sandlot baseball, and marbles. Radio programs also offered boys hours of entertainment—at least for those in areas with electricity. (Until the late 1930s some pockets of the rural South were without running water and electricity.) Boys fortunate enough to have a radio at home could tune in to amusing programs such as Amos ’n’ Andy, Little Orphan Annie, and The Jack Benny Show as well as to Roosevelt’s encouraging “fireside chats.” Boys without home radio sets could also listen occasionally; during the World Series, local radio shops played broadcasts of the games, allowing hundreds to listen outside the store. Sometimes radio stores even put up scoreboards with lights to help listeners follow the game as they rooted for heroes like Carl Hubbell, Lou Gehrig, Joe DiMaggio, and Dizzy Dean. For most American boys, though, an afternoon at the movies far exceeded listening to the radio or, during the late 1930s, reading comic books in popularity. Even very poor families found ways to set aside a nickel or a dime to send a child to the movies. Eight-year-old Slim Collier of Waterloo, Iowa, lived in a house with no running water but still managed to attend the Saturday matinee each week. As Collier recalls, “A dime was a weekly event. It bought me a bag of popcorn and a seat in the third row of the theater where I could see Bob Steele shoot off the
Government agencies found jobs for youths during the Great Depression. (Library of Congress)
Indians. On Saturday—buffalo nickel day, they called it. It provided conversation to my schoolmates for the rest of the week” (quoted in Terkel 1970, 96). For many boys like Collier, the weekly matinee brought hours of entertainment and escape from family hardships; for one price children got to see a double feature (which often included a western or gangster film and a comedy) and a cartoon. Comedies such as the Marx Brothers’ film Duck Soup remained especially popular throughout the 1930s as audiences—young and old alike—went to the movies to be amused rather than challenged by what they saw. While millions of Americans were seeking relief and affirmation at movie theaters, hundreds of thousands of teenage boys turned to the “open road” for escape. In 1932, the National Children’s
Bureau estimated that more than 250,000 youths had already joined the existing transient population. Although girls sometimes took to the road, most of the new transients in the United States during the 1930s were boys, some barely thirteen. Many, like fourteen-year-old Ed Paulsen, “rode the freights,” stowing away on boxcars to travel from one town to the next. Boy “tramps,” as they were called, picked up odd jobs wherever they could find them, but high unemployment rates often meant that there were no more openings in one town than there had been in the last. For homeless boys, survival often meant stealing—milk off back porches, clothes off lines, or bread and crackers from grocery stores. It also meant adopting strategies like traveling in pairs or gangs to protect themselves from police officers and
“wolves,” adult predators who seduced or robbed young homeless boys. Boys caught breaking the law were often turned over to transient camps, where they might enter a federal relief program. Nevertheless, many homeless boys remained in charge of their destinies, and they experienced the freedoms of the road as long as they wished. Not all boy “tramps,” however, enjoyed the same social mobility in riding boxcars or hitchhiking. Jim Crow (segregation) laws in the South and other forms of prejudice throughout the nation presented special obstacles for African American teenagers. The most famous example of the injustices these boys endured was the Scottsboro case, which made international headlines throughout the 1930s. Picked up while riding on a freight train in Alabama, the nine “Scottsboro Boys” were accused of raping two white women who had been traveling on the same train. Despite numerous inconsistencies in the two women’s stories and doctors’ testimony stating that no rape had occurred, the boys were convicted by an all-white jury and, with the exception of the thirteen-year-old, sentenced to death. Although none of the boys was sent to the electric chair, they spent several years in jail while their cases were tried and retried by the State of Alabama. The five youngest were released in 1937 after the charges against them were dropped, but the last one, Clarence Norris, was not paroled until 1946. In addition to the numerous teenage boys who left home in search of better opportunities, thousands of other youths were part of transient families. Some of these were white families like the Joads in John Steinbeck’s The Grapes of Wrath, who lost their homesteads because of
Dust Bowl devastation. Many families, though, were part of America’s nonwhite rural poor—Native American, African American, and Mexican American migrant workers and sharecroppers. Life for boys in migrant families during the Depression was especially hard and uncertain. They spent most of the year “following the crops,” traveling from farm to farm to pick in-season produce. During winter months, they would receive a break from their difficult labor and could attend school for a few months uninterrupted. Most of the year, however, school attendance was irregular. As Cesar Chavez recalls, during the 1930s he and four siblings would “go to school two days sometimes, a week, two weeks, three weeks at the most. . . . We started counting how many schools we’d been to and we counted thirty-seven. Elementary schools. From first to eighth grade” (quoted in Terkel 1970, 56). For boys from migrant families, basic childhood pursuits like school attendance and after-school play were real luxuries. Although most boys from transient families fell through the holes in Roosevelt’s “safety net,” more than 5 million other youths benefited from federal agencies such as the NYA and the CCC. A division of the Works Progress Administration (WPA), the NYA was established in June 1935 to assist impoverished youths between the ages of fourteen and twenty-four through a variety of programs. The NYA student work program helped impoverished high school and college students remain in school by providing them with part-time jobs paying from $6 to $30 per month. The tasks students performed varied greatly according to community need and geographical location, but they generally prepared boys and girls for the types of jobs they were likely to hold in
the future. In addition, the NYA out-of-school work program allowed nonstudent teens to obtain supervised jobs with existing local hospital, school, public, and social agency programs. Boys’ primary areas of employment were conservation, woodworking, construction, and repair, and girls usually worked in child care, health care, and sewing. Unlike the NYA, which helped boys and girls in equal numbers, the CCC was specifically designed to assist teenage boys and young men. Started in April 1933, the CCC put unemployed males to work preserving U.S. natural resources. Initial corps members had to be physically fit, unmarried, male citizens between the ages of eighteen and twenty-five, but later the minimum age for junior enrollees was dropped to seventeen. Boys lived at camps throughout the country, where they served six-month to two-year terms doing outdoor work, calisthenics, and various forms of training. Although sometimes criticized for its military character and exclusion of young women, from 1933 to 1942 the CCC had tremendous benefits for its 3 million members, enrollees’ families, and the national landscape. Members not only learned marketable skills and earned degrees through educational programs, but they also improved their general health and fitness, gaining an average of 12 pounds in weight and 1 inch in height. Families likewise benefited from members’ enrollment; the program required that members send home a minimum of $22 out of their $30 monthly check. Furthermore, the CCC planted 2 billion trees, erected 66,000 miles of firewall, built 122,000 miles of minor roads, stocked streams and ponds with 1 billion fish, and constructed more than 300,000 permanent dams and 45,000 bridges. The CCC also touched the lives
The CCC also touched the lives of numerous boys and girls who visited the hundreds of parks, playgrounds, campsites, and recreational facilities built by corps members.
Christina S. Jarvis
See also Films; Toys
References and further reading
Badger, Anthony. 1989. The New Deal: The Depression Years, 1933–1940. New York: Noonday Press.
Bernstein, Irving. 1985. A Caring Society: The New Deal, the Worker, and the Great Depression. Boston: Houghton Mifflin.
Elder, Glen H. 1974. Children of the Great Depression: Social Change in Life Experience. Chicago: University of Chicago Press.
Hawes, Joseph M. 1997. Children between the Wars: American Childhood, 1920–1940. New York: Twayne Publishers.
Holland, Kenneth, and Frank Ernest Hill. 1942. Youth in the CCC. Washington, DC: American Council on Education.
Rutland, Robert Allen. 1995. A Boyhood in the Dust Bowl. Boulder: University Press of Colorado.
Terkel, Studs. 1970. Hard Times: An Oral History of the Great Depression. New York: Pantheon.
Watkins, T. H. 1999. The Hungry Years: A Narrative History of the Great Depression in America. New York: Henry Holt.
Guns
The controversy over guns and their standing in society is at its most fervent when it centers on children. During the 1990s and early 2000s, the United States witnessed many high-profile shooting sprees in which the shooters were teenage boys. The most shocking occurred in 1999 in Littleton, Colorado, where two Columbine High School students killed twelve fellow students, one teacher, and then themselves. Other shooting sprees occurred in Santee, California; Springfield, Oregon; Pearl, Mississippi; West Paducah, Kentucky; Jonesboro, Arkansas; Edinboro, Pennsylvania; Raleigh, Virginia; and Conyers, Georgia.
Farm boy shooting an air rifle, Utah, 1940 (Library of Congress)
These acts of violence reveal that “youth gun violence” really means “boy gun violence.” They also spawn many questions, the most important of which include: How prevalent is youth gun violence? Is its incidence waxing or waning? What causes the violence? The prominence of the gun culture and its impact on boys have waxed and waned during the history of the United States. Despite popular conceptions in the media and even in many textbooks, most boys before the Civil War had little contact with firearms, even during the colonial period in the seventeenth and eighteenth centuries.
Michael Bellesiles’s review of probate records in New England and Pennsylvania between 1765 and 1850 revealed that only a small fraction of Americans, at most one in ten, possessed firearms. Firearms possession became more popular after the Civil War, as a generation of young men was exposed to guns and became proficient in their use. Many soldiers, in fact, were allowed to keep their weapons after the war ended. During the late nineteenth century the gun culture expanded, fueled by elements in popular culture such as dime novels depicting violence and the Wild West shows of Buffalo Bill Cody and other western heroes (Bellesiles 2000).
In the early twentieth century, firearms manufacturers began targeting boys as a new market for their products. For example, the Remington Company introduced a “Boy Scout Special” rifle. Shooting became popular in many scout troops, and in 1916 the Boy Scouts of America brought out a Marksmanship Merit Badge (Bellesiles 2000, 442–443). Gun ownership by households and by boys peaked in the years after World War II, although it has been in decline during the past thirty years. In the mid-1970s about one in every two households possessed at least one firearm. By 1998, however, only about one in every three households did so (“General Social Survey” 1999). The National Longitudinal Survey of Youth reveals that about one in every eleven boys between the ages of twelve and sixteen currently carries a handgun at some point during any twelve-month period. These boys live in urban as well as rural areas and are as likely to be white as they are to be African American. They are more likely than their non-gun-carrying counterparts to be involved with drugs and gangs (Juvenile Offenders 1999). The most recent government data (1999) reveal that some 1,535 children from the ages of one to nineteen were murdered by guns, representing two-thirds of all those killed in this age group. Most children who are murdered before the age of thirteen are beaten to death by their parents or a close relative. Between the ages of thirteen and nineteen, the period when three-fourths of all child homicides occur, murder is most often gang-related and overwhelmingly inflicted by gunshot (82 percent), with the victims overwhelmingly male (83 percent) and African American (58 percent, although this group constitutes only 15 percent of the youth population).
The statistics on youthful murder offenders are similar: 1,711 youths committed a murder in 1999; 93 percent of them (1,591) were boys, and most of their murders were committed with guns (70 percent), especially handguns. The statistics on youth suicide are similar to those for murder: for every two murders, there is one suicide; these suicides are committed mostly in the teen years (95 percent), mostly by males (80 percent), and mostly by way of firearms (67 percent). U.S. youth homicide and suicide rates far exceed those for other economically developed nations: the U.S. youth homicide rate is five times that of the typical western European nation (2.57 per 100,000 U.S. children versus 0.51 per 100,000 European children), and the U.S. rate of youth homicides involving firearms is sixteen times greater. The pattern is the same for suicide: the U.S. rate is twice the rate of other developed countries (0.55 versus 0.27), and eleven times the rate for suicides involving firearms (Crime 2000; Juvenile Offenders 1999). Murder rates for children, both as victims and as perpetrators, rose rapidly between 1984 and 1993 but declined thereafter. Virtually all the rise and all the subsequent decline were gun-related. More specifically, homicides by family members held constant (most cases involving young children being beaten to death), but homicides by acquaintances—the majority gang-related—increased substantially (Homicide Trends 2001). Most youth gun violence involves teenage boys from poor neighborhoods, which suggests that the schoolyard violence that generates so much media attention is a rarity. And indeed it is: during the 1990s there were about forty-nine schoolyard shooting deaths per year compared to more than 2,000 annually away from school.
The following two charts confirm both of these sets of gun violence facts (Kids and Guns 2000; Juvenile Offenders 1999):
Figure 1: Shooting Deaths Inside School vs. Away from School
Figure 2: By-Gun Homicide Rates of White vs. Black Teenage Boys, Ages 14–17
Most of the boys involved in the high-profile schoolyard shootings grow up in areas where hunting and shooting are common and many youths own their own hunting rifles and shotguns. The inner-city youths involved in gun violence also grow up in areas where guns are commonplace and many youths carry guns for protection. Both kinds of youths are barraged by television shows and movies that depict gun violence as a routine way in which “real men” solve their problems.
Department of Justice studies reveal that after “protection,” the second most common reason youths give for carrying firearms is that “your crowd respects you if you have a gun” (Kids and Guns 2000). Recent psychological research provides a profile of the boys most likely to want to use a gun to solve their problems. Jeremy Shapiro and his colleagues (1997) found that these boys are comfortable around guns; they feel excitement when they touch a gun; they believe that an insult merits an aggressive response; and, finally, their feelings of power and of security increase when they carry guns or when they are with friends who are carrying guns. To many observers, a key reason that too many American boys have become overly involved with guns is the lack of strict gun control in the United States. Most gun violence in the United States, both youth and adult, involves handguns. Handguns are all but barred from personal possession in most economically developed nations. The issue of whether strict gun control would reduce violence is controversial because many of the nations having strict gun control laws also have less inequality and less heterogeneity—both of which are strong predictors of violent crime. A close examination of state-level gun laws in the United States, however, reveals wide differences in the level of controls aimed at keeping guns out of the hands of youths. These laws include whether it is illegal for a minor to own a handgun, whether it is illegal to sell a handgun to a minor, and whether adults are required to store their firearms out of the reach of children. It is noteworthy that such laws correlate with lower levels of youth gun violence.
For example, Massachusetts has laws forbidding minors from owning handguns, forbidding adults from selling handguns to minors, and requiring gun owners to prevent children from gaining access to their guns (e.g., by storing them in locked steel cabinets); its rate of gun deaths due to homicide, suicide, and accidents for children age one to nineteen is 1.4 per 100,000 population. In contrast, Louisiana has none of these gun control laws, and its equivalent gun death rate is 10.8 (Firearm Injuries 2000). Several organizations, including Handgun Control and the Children’s Defense Fund, rate the fifty states on the strictness of their gun laws. All such ratings correlate inversely with the rate of child gun deaths. The following plot shows how one such rating correlates significantly with child gun deaths from homicide, suicide, and accidents for the years 1996–1998 (Mountjoy 2000).
Figure 3: Gun Death Rates of Youths 0–19 by Strictness of Gun Control (youth gun deaths, 1996–1998, per 100,000 population)
The state-level gun control data speak to one solution for reducing youth gun violence—that is, the passing and enforcing of laws aimed at keeping firearms out of the hands of youths. Recent social science research shows two other promising solutions. Since not all youthful gun owners are equally dangerous, it makes sense to zero in on those who are.
In 1996 the city of Boston began a program called “Operation Ceasefire” to do just that. The program involves notifying gang members that carrying firearms will precipitate a swift and severe response—including federal prosecution and the disruption of drug activities. Within two years, homicides of teenage boys and young men under the age of twenty-four fell by two-thirds. Second, research in Norway in the early 1980s revealed that schools that directed their attention toward reducing “bullying” witnessed significant reductions in fighting and violent behavior. Based on this research, a federally sponsored program in South Carolina was begun in 1997 to reduce bullying behavior in a sample of middle schools. The program has shown early success in reducing antisocial and violent behavior (Accessibility of Firearms 2000). The long-term impact has yet to be determined, but the Norwegian research would support optimism that the effects will be enduring.
Gregg Lee Carter
See also Bullying
References and further reading
Accessibility of Firearms and the Use of Firearms by or against Juveniles. 2000. Washington, DC: Office of Juvenile Justice and Delinquency Prevention, U.S. Department of Justice.
Bellesiles, Michael A. 2000. Arming America: The Origins of the National Gun Culture. New York: Alfred A. Knopf.
Crime in the United States 1999. 2000. Washington, DC: Federal Bureau of Investigation, U.S. Department of Justice.
Firearm Injuries and Fatalities. 2000. Atlanta: National Center for Injury Prevention and Control, Centers for Disease Control and Prevention.
Garbarino, James. 1999. Lost Boys: Why Our Sons Turn Violent and How We Can Save Them. New York: Free Press.
“General Social Survey.” 1999. http://www.icpsr.umich.edu/GSS99/index.html.
Homicide Trends in the United States. 2001. Washington, DC: Bureau of Justice Statistics.
Juvenile Offenders and Victims: 1999 National Report. 1999. Washington, DC: Office of Juvenile Justice and Delinquency Prevention, U.S. Department of Justice.
Kids and Guns. 2000. Washington, DC: Office of Juvenile Justice and Delinquency Prevention, U.S. Department of Justice.
Mountjoy, John J. 2000. “Shooting for Better Gun Control.” Spectrum 73: 1–3.
Shapiro, Jeremy, Rebekah L. Dorman, William H. Burkey, Carolyn J. Welker, and Joseph B. Clough. 1997. “Development and Factor Analysis of a Measure of Youth Attitudes Toward Guns and Violence.” Journal of Clinical Child Psychology 26: 311–320.
Spitzer, Robert J. 1999. “The Gun Dispute.” American Educator 23: 10–15.
H

Haircuts
See Clothing

Hip-Hop
See African American Boys

Holidays
A holiday can be defined as a socially recognized day (or series of days) that marks the celebration of a significant past event or person or that marks a transition such as the turning of a season or year. Included in the mainstream cycle of American holidays are New Year’s Day, Valentine’s Day, Easter, Memorial Day, Independence Day, Labor Day, Halloween, Thanksgiving, and Christmas. Contemporary American holidays involve boys in varied respects. Boys celebrate some holidays in school, such as exchanging Valentine’s cards with classmates on February 14 (Valentine’s Day). Boys may also participate in family-based celebrations, such as attending community fireworks with the family on July 4 (Independence Day), eating turkey dinner on Thanksgiving, and staying awake until midnight on New Year’s Eve to make noise and shout “Happy New Year.” Boys who are members of particular religions may celebrate such holidays as Hanukkah (Jewish) and Ramadan (Muslim).
Among the contemporary cycle of holidays, three holidays involve children with particular intensity: Easter (a movable feast day, falling in the span of days from March 22 to April 25), Halloween (October 31), and Christmas (December 25). Anthropologist Anthony Wallace (1966) dubbed this triad of festivals the North American “children’s cult,” since children play a central part in the contemporary practice of these three holidays. The rest of the entry will briefly review the historical evolution and the modern-day celebration of each of the three major American children’s festivals.

Easter
Easter marks the social, if not the climatic, advent of spring. Easter takes place on the first Sunday that follows the full moon after the vernal equinox. By ancient derivation, Easter was timed by the Christian church to “coincide with previously existing pagan festivals for the sake of weaning the heathen from their old faith” (Frazer 1915, 9: 328). It is thought to derive its very name from the Anglo-Saxon goddess of spring, Eostre, or perhaps from the Norse word for the spring season. Easter was celebrated in the United States as early as 1855, although it was neglected by earlier American settlers. Early Puritan colonists, who stifled the celebration of Christmas, also played down the observance of Easter.
Puritans correctly observed that these holidays were not celebrated by early Christians, nor did the dating of these festivals coincide with the actual time of these events in the life of Jesus. During the Civil War period, however, a movement began to reinstate the observance of Easter in the United States as a healing gesture for persons bereaved in the war (Myers 1972, 104). The symbolism of Easter provided an appropriate iconography for beginning life anew after the war. With the revival of Easter as a festival, customs that had emigrated from Europe to the United States began to be disseminated, among them a German tradition of building nests for rabbits, hiding the nests, and encouraging boys and girls to believe that (upon their good behavior) the rabbit would “lay” Easter eggs in the nests. Customs involving Easter eggs had been recorded in western Europe beginning in the fifteenth century (Myers 1972, 104, 111). Boys and girls actively take part in contemporary American Easter celebrations and have a decisive impact on the Easter customs that their parents implement. It is often the boy who reminds the parent to carry out certain festival routines, such as coloring eggs or putting up Easter decorations. Central to the boy’s Easter customs is the nighttime, hidden visit of the Easter bunny, who is often believed to hide the eggs decorated by the boy so he can later “hunt” for eggs. The Easter bunny is also believed to place candy, or sometimes toys, in a basket filled with artificial grass (Clark 1995, 67–81). Boys’ and girls’ belief in the Easter bunny triggers the parents’ involvement in the Easter bunny ritual, since adults maintain the custom of the Easter bunny’s egg hiding and basket giving. Just as Easter is a holiday celebrating the renewal of life, boys’ involvement is instrumental in shaping the celebration.
Halloween
The historical roots of Halloween can be traced far into the past to the prehistoric Celtic celebrations of Samhain, a harvest festival marking the time when the herds were transferred to their winter stalls as well as the first day of the new year in the Celtic calendar (Belk 1990, 2; Santino 1983, 4). The festival was named after the Celtic lord of the dead, to whom human sacrifices were once made. On Samhain night, the ghosts of the dead were believed to return to the living world, even as witches and other malevolent creatures were also at large. Pope Gregory I, who in 601 C.E. promoted a plan to spread Christianity by working in concert with native beliefs, encouraged missionaries to dovetail Christian teachings with preexisting cultural practices. The Feast of All Saints was placed on the Christian calendar on November 1 as a substitute for Samhain, celebrating instead the virtuous saints of the Christian faith. The intent was to draw off loyalty from Samhain, but this effort failed. Later, in the ninth century, the Christian church established another holiday meant to siphon off involvement with Samhain, All Souls Day on November 2, when the living were to pray for the dead. Despite these calendar manipulations, people continued to observe nocturnal celebrations of the wandering dead and evil beings, which came to be timed for the night known as All Hallows Eve or Halloween (i.e., the night before the Feast of All Saints). Accounts of local Halloween celebrations in the United States date from as long ago as the 1880s (Santino 1983, 7, 8; Grider 1996, 3).
Trick-or-treating is now a well-established aspect of American Halloween activities, in which boys as well as girls masquerade in costume and go door to door getting treats. Boys may also wear costumes to school or even to some shopping malls that invite them to visit stores in costume. Boys’ choices of Halloween costume are known to be influenced by gender roles. Boys are more likely than girls to dress as a superhero (such as Superman or Batman), a monster (such as Dracula or a zombie), a scary animal (such as a lion or dragon), or a character portraying conventional male roles (sports player, pirate, and so on). Boys identify with scary or powerful roles in choosing these costumes. Halloween continues to embody themes of death and evil. A series of urban legends have circulated in the United States about the existence of Halloween sadists, who are thought to dispense harmful treats to boys and girls (such as razor blades embedded in apples or poisoned candy). Such legends resurface annually, and warnings are issued by local authorities that parents should check treats for tampering, but investigation of the dangers has revealed the tampering reports to be without basis (Best 1985; Best and Horiuchi 1985). The impulse toward anxiety about evil deeds and physical harm persists in urban legends (among adults as well as children) at a time of year when fears take the upper hand.

Christmas
Like Easter and Halloween, the timing of Christmas is historically significant. During the fourth century, the Christian church timed the celebration of Jesus’ birth to roughly coincide with Saturnalia, a period of feasting and revelry in Roman society (Restad 1995, 4).
In America, Christmas did not coalesce as a festival until the mid-nineteenth century because of opposition to Christmas by Puritans and other colonists of the seventeenth and eighteenth centuries. Puritan-led opposition to Christmas may have been responsible for its secularization, by which religious symbols came to be commingled with nonbiblical symbols (such as Santa Claus being interspersed with the symbols of the Nativity). As an important secular symbol of American Christmas, Santa Claus evolved into his modern form during the mid-1800s. Santa Claus has roots deriving from three sources. First, the festival of Saint Nicholas was brought to America by Dutch immigrants. The festival celebrated the patron saint of children, Nicholas, on his December 6 feast day. Sometimes known as Sinterklaas (a version of “Sint Nicolaas”), this saint was said to fill children’s shoes with gifts, after the shoes were left out on the eve of December 6. Through cross-cultural contact, the Dutch custom evolved into a ritual by which Saint Nicholas visited on other days as well, including December 25 (Clark 1995, 25). A second influence on the modern version of Santa Claus was the 1822 poem “A Visit from St. Nicholas” (Barnett 1954, 26–27). The poem, published anonymously in 1823, was probably written by Clement Clarke Moore, an ordained minister and professor at the General Theological Seminary in New York. “A Visit from St. Nicholas” begins with the line, “’Twas the night before Christmas.” The poem crystallized popular impressions of Santa Claus and established the traditions of the reindeer, the chimney as Santa’s entry point, the stockings hung up to be filled, and much of Santa’s appearance and personality.
A young boy in a Christmas pageant (Skjold Photographs)
A third influence on the modern Santa Claus came from the nineteenth-century illustrator Thomas Nast. From 1863 to 1886, Nast did a series of cartoon drawings for Harper’s Weekly in which Santa evolved from the elflike figure of Moore’s poem into the ample, bearded, fur-attired, jolly persona that has become a fixture of modern times (Myers 1972, 321; Clark 1995, 26). In these drawings, Santa was shown spying on children, pausing atop a chimney, driving his magical sleigh through the sky, and so on. Modern-day sociologists have concluded that Christmas serves to celebrate and affirm family bonds, especially with regard to children.
For example, parents expect to give more valuable and more numerous gifts to their minor children than they themselves receive (Caplow 1984; Caplow, Bahr, and Chadwick 1983). Unreciprocated giving by adults to children is also symbolized by Santa Claus. One modern practice at Christmas among boys and girls is to draw up a written wish list of gifts desired from Santa Claus. Some of children’s written requests are sent as letters to Santa Claus, which are mailed and arrive at various branches of the U.S. Postal Service. The letters show that boys and girls differ in their requests to Santa Claus, with boys making more requests for toy vehicles, sports equipment, items involving spatial construction (such as building sets), military toys, action figures (of monsters, superheroes, or robots), and real vehicles (such as bikes). Boys do not differ from girls, however, in the number of items requested from Santa Claus (Richardson and Simpson 1982, 431–432, 436). Boys’ letters to Santa Claus are also shorter, less polite, and less indirect than girls’ letters (Otnes, Kim, and Kim 1994, 24). Commercial activity has had an impact on children’s modern celebration of Christmas. Through appearances of Santa impersonators at shopping centers and department stores, Santa Claus has come to be associated with commerce. Movies and commercials, such as the classic movie Miracle on 34th Street (in which Santa Claus is linked to Macy’s Department Store) or the depictions of Santa in Coca-Cola advertising (Belk 1987, 94), have encouraged this association. Other modern symbols of Christmas share the pattern of being related to commerce.
In fact, one of the symbols most loved by contemporary children—Rudolph the Red-Nosed Reindeer, who guides Santa’s sleigh—resulted from commercial goals. The story of Rudolph was written in 1939 by an employee of Montgomery Ward, then a mail-order firm. Robert L. May was the writer assigned to write a Christmas animal story. His narrative of the outcast red-nosed reindeer, who is chosen as Santa’s lead reindeer because his nose can illuminate the sky, became a promotional leaflet, of which 2,400,000 copies were distributed in 1939. In 1949 a song about Rudolph was composed by Johnny Marks and quickly became popular (Barnett 1954, 109). Contemporary boys and girls identify with the misfit reindeer. Some children leave an offering of food for Rudolph on Christmas Eve (along with food for Santa).
Cindy Dell Clark
See also Toys
References and further reading
Barnett, James. 1954. The American Christmas: A Study in National Culture. New York: Macmillan.
Belk, Russell. 1987. “A Child’s Christmas in America: Santa Claus as Deity, Consumption as Religion.” Journal of American Culture 10, no. 1: 87–100.
———. 1990. “Halloween: An Evolving American Consumption Ritual.” In Advances in Consumer Research. Vol. 17. Edited by M. Goldberg, Gerald Gorn, and Richard Pollay. Chicago: University of Chicago Press.
Best, Joel. 1985. “The Myth of the Halloween Sadist.” Psychology Today 19: 14–19.
Best, Joel, and Gerald Horiuchi. 1985. “The Razor Blades in the Apple: The Social Construction of Urban Legends.” Social Problems 32: 488–499.
Caplow, Theodore. 1984. “Rule Enforcement without Visible Means.” American Journal of Sociology 89, no. 6: 1306–1323.
Caplow, Theodore, Howard Bahr, and Bruce Chadwick. 1983. All Faithful People: Change and Continuity in Middletown’s Religion. Minneapolis: University of Minnesota Press.
Clark, Cindy Dell. 1995. Flights of Fancy, Leaps of Faith: Children’s Myths in Contemporary America. Chicago: University of Chicago Press.
Frazer, Sir James. 1915. The Golden Bough: A Study of Magic and Religion. London: Macmillan.
Grider, Sylvia Ann. 1996. “Conservation and Dynamism in the Contemporary Celebration of Halloween: Institutionalization, Commercialization, Gentrification.” Western Folklore 53, no. 1: 3–15.
Levinson, Stacey, Stacey Mack, Daniel Reinhardt, Helen Suarez, and Grace Yeh. 1991. “Halloween as a Consumption Experience.” Undergraduate research thesis, Rutgers University School of Business.
Myers, Robert. 1972. Celebrations: The Complete Book of American Holidays. New York: Doubleday.
Ogletree, Shirley Matile, Larry Denton, and Sue Winkle Williams. 1993. “Age and Gender Differences in Children’s Halloween Costumes.” Journal of Psychology 127: 633–637.
Otnes, Cele, Kyungseung Kim, and Young Cham Kim. 1994. “Yes Virginia, There Is a Gender Difference: Analyzing Children’s Requests to Santa Claus.” Journal of Popular Culture 28, no. 1: 17–29.
Restad, Penne. 1995. Christmas in America: A History. New York: Oxford University Press.
Richardson, John, and Carl Simpson. 1982. “Children, Gender and Social Structure: An Analysis of the Contents of Letters to Santa Claus.” Child Development 53: 429–436.
Santino, Jack. 1983. “Halloween in America: Contemporary Customs and Performances.” Western Folklore 42, no. 1: 1–20.
———. 1994. Halloween and Other Festivals of Life and Death. Knoxville: University of Tennessee Press.
———. 1995. All around the Year: Holidays and Celebrations in American Life. Urbana: University of Illinois Press.
Wallace, Anthony. 1966. Religion: An Anthropological View. New York: Random House.
Homosexuality
See Same-Sex Relationships
Horror Films
A genre of American and European filmmaking that has lasted for decades, horror films continue to attract the attention and devotion of both children and adults. Boys have regularly been among the genre’s chief fans, perhaps because of the films’ oftentimes grisly and gruesome scenarios that continually conflate sexuality and violence. Many theorists feel that horror films, like fairy tales, offer viewers a cathartic experience that allows both real and imaginary fears to be explored within a safe or “make-believe” zone. Boys may seek out horror films in order to experience vicariously the sensations of a world gone awry, of “normality” overturned, and “monstrosity” run amok. Boys may thus identify with movie monsters as “unsocialized” or disruptive forces that buck traditional authority. Conversely, boys may also identify with those same forces of traditional authority as they eradicate cinematic deviancy and uphold patriarchal values. Furthermore, the very act of watching a horror film may encourage boys to define and express traditional gender identity roles: sitting through a horror film without appearing frightened may serve as a kind of test or performance of masculinity. Throughout the years, the increasing explicitness of sexuality and violence in the movies has caused concern for many parents, yet horror films have been able to reinvent themselves decade after decade, finding and exploiting new monsters that reflect the cultural concerns of the eras that produce them.
Most classical (1930s–1940s) Hollywood horror films have their roots in the gothic literature of western Europe. Films based on novels such as Frankenstein, Dracula, and The Strange Case of Dr. Jekyll and Mr. Hyde have been box office favorites throughout the history of the genre. In the United States, the first real boom of horror film production coincided with the Great Depression. In 1931, Universal Studios released Dracula and Frankenstein to huge box office revenues. The studio had previously made silent film versions of The Hunchback of Notre Dame (1923) and The Phantom of the Opera (1925). During the early 1930s, Universal became the leading studio in horror film production, making such films as The Mummy (1932), The Invisible Man (1933), The Black Cat (1934), and Bride of Frankenstein (1935). Other Hollywood studios quickly followed suit, releasing King Kong (RKO, 1933), The Mask of Fu Manchu (MGM, 1932), Mystery of the Wax Museum (Warner Brothers, 1933), and Island of Lost Souls (Paramount, 1933). The visual “look” of most of these films was a blend of gothic architecture and German Expressionist style. Indeed, the German Expressionist cinema of the previous decades had produced several seminal horror films, including The Cabinet of Dr. Caligari (1919) and Nosferatu (1922), the latter the first filmed version of Bram Stoker’s novel Dracula. In the classical Hollywood horror film, “normality” is embodied by heterosexual, middle-class, white couples and patriarchal figures and institutions that represent law, science, or religion. For example, Count Dracula menaces the bourgeois newlyweds Jonathan and Mina Harker and is eventually destroyed with science and religion by Professor Van Helsing. In most of these films, the monstrous threat usually comes from abroad—Dracula, Dr. Frankenstein, King Kong, Fu Manchu, the Mummy, the Hunchback, and the Phantom are all pointedly un-American.
As such, the classical horror film might be said to tap into the racist and xenophobic fears of the era. Boris Karloff and Bela Lugosi, both European in name, character, and persona, quickly became the two leading stars of the classical Hollywood horror film. The general xenophobia of the genre might also be extended to include anything the dominant middle-class society defines as “the Other.” Thus movie monsters are frequently tinged with nonmale and nonheterosexual sexuality, different political ideologies, or the markings of the lower or upper classes. In more recent years, boys and girls themselves have become movie monsters, but in classical horror films they rarely appeared and then only as (offscreen) victims. Still, the classical horror film would have served as a powerful tool of acculturation, teaching boys the difference between normality (i.e., white patriarchy and its institutions) and deviance (nonwhite, nonmale Others). The World War II era saw two diverging strands in the Hollywood horror film. More adult and psychological horror films (including Cat People [1942] and I Walked with a Zombie [1943]) were produced by Val Lewton at RKO. Despite their often lurid and sensationalistic titles, these films were crafted like fine short stories and relied more on implied horror and metaphorical fears. At Universal Studios, the mixing and matching of classical Hollywood monsters in films such as Frankenstein Meets the Wolf Man (1943) and House of Dracula (1945) eventually exhausted the classical array of movie monsters. Interestingly, both the RKO Val Lewton films and Universal’s monster matchups became more overtly psychological during this period, perhaps reflective of the nation’s concomitant fascination with psychoanalysis.
Boris Karloff in The Mummy (National Film Archive, London)
In 1948, Universal combined its most successful monsters with its most successful comedy team and released Abbott and Costello Meet Frankenstein. The monsters of Hollywood’s classical era—Frankenstein’s monster, Dracula, the Wolf Man, and so on—were now burlesqued and no longer considered frightening to adults (although perhaps they did remain so for children). Fortunately for Hollywood, postwar American boys had a host of new things to fear, most of them related either to fears of nuclear technology or to the red scare. Horror quickly hybridized with science fiction, and the overwhelming number of horror films from the 1950s might best be thought of as alien invasion films or bug-eyed monster movies.
A scene from Night of the Living Dead (The Museum of Modern Art Film Stills Archive)
In films like the Japanese import Godzilla (1954) and The Amazing Colossal Man (1957), atom bombs or nuclear radiation are responsible for creating new and terrible monsters that threaten to destroy the world itself, not just a white bourgeois couple. Films like Invasion of the Body Snatchers (1956) and I Married a Monster from Outer Space (1958) depict small-town America being overtaken by insidious aliens who look “normal” but in reality are out to destroy the American way of life. Most theorists agree that these films in some way reflect the paranoia and red menace hysteria of the United States during the McCarthy era.
Like most horror movies, these films are perhaps best understood as metaphorical reworkings of real-life fears—in this case nuclear fears and the threat of communist invasion or takeover from within. Several of the films of this era figured children as protagonists—a young girl is traumatized by giant ants in Them! (1954), and in Invaders from Mars (1953), a young boy representing normality must fight off an alien invasion, including his thought-controlled parents. By the decade’s end, teenage boys were being menaced by and turned into monsters themselves in films such as I Was a Teenage Werewolf and I Was a Teenage Frankenstein (both 1957). In these films, teenage angst is conflated with the deviant, even as older homoerotic couples are depicted as ultimately responsible for turning young men into monsters—yet another exploitation of homophobia within the genre and one tied specifically to 1950s ideas about gay men as child molesters.
Horror Films with the deviant, even as older homoerotic couples are depicted as ultimately responsible for turning young men into monsters—yet another exploitation of homophobia within the genre and one tied specifically to 1950s ideas about gay men as child molesters. (This “boys-at-riskfrom-monstrously-erotic-men” formula would also be exploited more directly in 1980s teen horror films such as The Lost Boys [1987] and Nightmare on Elm Street, Part 2: Freddy’s Revenge [1985].) Also in the 1950s, the classical Hollywood monsters of the 1930s and 1940s began to be recycled on television, chiefly through the release of their films to local television stations. Often these local stations would create a special late night or Saturday afternoon “Shock Theater” program, a format in which a ghoulish host such as Vampira or Roland would introduce the films and then comment upon them. Although these shows allowed a new generation of horror movie fans to experience the classical Hollywood horror films, they also tended to situate the films within a comedic, parodic, or campy frame and perhaps further trivialized the “scare” quotient of the classical Hollywood monsters. In the 1960s, television situation comedies such as The Addams Family and The Munsters, which often suggested that their monstrous families (boys included) were just as “normal” as everyone else on the block, also worked to “naturalize” the monstrous. Books, magazines, toys, model kits, games, posters, and trading cards devoted to the genre allowed both boys and girls to play at being monsters, although it appears that boys tended to dominate the fan pages of magazines like Famous Monsters of Filmland. By the early 1970s, cartoon images of the classi-
Although the juvenilization of the classical monsters was occurring on television, horror films became more and more adult in nature and form, in accordance with slackening censorship and the increasing prevalence of 1960s real-life horrors such as political assassinations, police brutality, civil rights unrest, and the Vietnam War. Great Britain’s Hammer Films, which had made full-color remakes of the classical monsters in the late 1950s, continued to mine those narratives into the early 1970s, often in increasingly sexualized and violent ways. In 1960, Alfred Hitchcock’s Psycho pushed the boundaries of acceptable screen violence and would become the template for the late 1970s to 1980s slasher film, the raison d’être of which seems to be the brutal onscreen murder of young seminude women. Gore films like Herschell Gordon Lewis’s Blood Feast (1963) prowled the late night exploitation market, while George Romero’s Night of the Living Dead (1968) broke new ground in terms of both onscreen gore and pessimistic political allegory. Early films by Romero, Tobe Hooper, and Wes Craven (as well as some of the blaxploitation horror films of the 1970s and individual works like Peter Bogdanovich’s Targets [1968]) have been championed by critics as political allegories of the era, often because they dramatized in horror movie terms how the prevailing social order of warmongering patriarchal capitalism had created its own kind of monsters. For example, the young male assassin in Targets is shown to be the logical product of American commercial gun culture, and a nuclear family is destroyed by its zombie daughter in Night of the Living Dead.
Occult horror also became big Hollywood business with Rosemary’s Baby (1968), The Other (1972), The Exorcist (1973), and The Omen (1976), all of which figure children as monsters. Critics have read these films as a reactionary response to the 1960s youth movement because they suggest that children are satanic and must be brought under control by adherence to religious faith and traditional patriarchal values. The B-movie horror films of the early 1970s, however, became increasingly bizarre and campy, with lesbian hippie vampires and mad scientist dandies cavorting through films such as Lust for a Vampire (1971), Dr. Jekyll and Sister Hyde (1972), and Dr. Phibes Rises Again! (1972). The end point of that trend was possibly the 1975 release of The Rocky Horror Picture Show. Based on a successful stage show, the film was a campy horror-musical hybrid that set out to deconstruct both genres by depicting its mad scientist as a bisexual transvestite who creates a blond hunk, seduces both members of a middle-class couple, and puts on a fabulous floor show. The film flopped on its original release but has become the quintessential cult film, still playing at midnight shows in urban theaters well into the twenty-first century. In the late 1970s and 1980s, films such as Halloween, Friday the 13th, and Nightmare on Elm Street (and their sequels) helped to create a new subgenre of horror that lasted into the 1990s—the slasher film. The simplistic formula of most of these films involves a psychotic (but potentially human) killer who stalks teenagers (usually girls) and kills them one by one in outrageous displays of phallic violence, until a “final girl” is eventually able to subdue him.
Critics have argued that these films represent some sort of backlash against the gains that the women’s movement had made in the 1970s; watching the films with young male audiences that cheer on the brutalization of women can be very disturbing. Other critics have noted that the films’ prevalent “have sex and die” message is particularly revealing of a culture in the midst of hysteria about acquired immunodeficiency syndrome (AIDS): often the “final girl” to survive is a virgin, whereas her more sexual classmates end up dead. Like all horror films, the slashers exploit their audience’s desire and ability to identify with the monster as well as his victim. For example, the use of subjective camera shots that tie the audience into the killer’s point of view places the spectator inside the mind and body of the monster, allowing him or her to experience vicariously the violence directed against the teens. Although most 1980s slasher movies were low-budget exploitation films, Hollywood saw that the formula was popular and released its own big-budget, star-studded slasher film in 1991. When The Silence of the Lambs made hundreds of millions of dollars and received the Oscar for best picture the next year, many Americans were shocked that something so violent and disturbing could be awarded such praise. Most recently, postmodern, or self-aware, slasher films like Scream (1996) have attempted to parody the subgenre while still invoking scares. In the 1990s, a few big-budget Hollywood remakes of the classical monster movies fueled interest in horror films once again. Bram Stoker’s Dracula (1992) and Mary Shelley’s Frankenstein (1994) attempted to recount more faithful versions of the classic stories, in so doing softening and romanticizing their monstrous threats.
Interview with the Vampire (1994), based upon Anne Rice’s incredibly successful book, also romanticizes and (homo)eroticizes its vampires to unprecedented degrees. The biggest development in recent horror films would probably be their increasing dependence on special effects for shock and gore and the creation of spectacular monsters. Jurassic Park (1993) and The Lost World (1997) refigure the history of giant monster movies in self-aware ways, and Universal’s remake of The Mummy (1999) used state-of-the-art special effects to create more of an Indiana Jones–type adventure than a classical horror film per se. Around 2000, perhaps spurred by apocalyptic fears of round numbers, a small cycle of occult thrillers was also popular, such as The Sixth Sense, in which a young boy protagonist has the ability to “see dead people.” Still, even as they have evolved and changed over the years, movie monsters (whether they come from outer space, nature, mad science, or the human mind itself) continue to populate American movie screens and remain powerful figures of fear and identification for boys.
Harry M. Benshoff
See also Films
References and further reading
Benshoff, Harry M. 1997. Monsters in the Closet: Homosexuality and the Horror Film. Manchester: Manchester University Press.
Berenstein, Rhona J. 1996. Attack of the Leading Ladies: Gender, Sexuality and Spectatorship in Classic Horror Cinema. New York: Columbia University Press.
Clover, Carol J. 1992. Men, Women, and Chainsaws: Gender in the Modern Horror Film. Princeton: Princeton University Press.
Grant, Barry Keith, ed. 1996. The Dread of Difference. Austin: University of Texas Press.
King, Stephen. 1981. Danse Macabre. New York: Everest House Publishers.
Skal, David J. 1993. The Monster Show: A Cultural History of Horror. New York: Penguin.
———. 1998. Screams of Reason: Mad Science and Modern Culture. New York: W. W. Norton.
Tarratt, Margaret. 1970. “Monsters from the Id.” Pp. 330–349 in Film Genre Reader II. Edited by Barry Keith Grant. Austin: University of Texas Press.
Wood, Robin. 1986. Hollywood from Vietnam to Reagan. New York: Columbia University Press.
Hunting
Throughout the world, rural boys have probably always hunted. Even boys who are not hunters often go through a stage when they take pleasure in capturing or killing small animals. Among Americans of European descent, hunting has been an especially popular pastime for boys. As early as 1624, John Smith commented that Jamestown planters “do so traine up their servants and youth in shooting deere, and fowle, that the youths will kill them as well as their Masters” (Smith 1907, 178). Because game was abundant (particularly in areas where Indians had employed fires to create grassy, parklike habitats for deer) and because no colonial game laws restricted hunting to the elite, American boys soon learned to hunt. In New England, too, older boys and young men sought game, though Puritans tended to frown on hunting as sport. It is hard to say to what degree colonists imitated the hunting practices of Indians, but clearly for Indian youth, even more than for colonists, hunting was an important part of life. Among the Wampanoag, one of the Algonquian peoples of New England, young men underwent an initiation rite in which they were blindfolded and taken into the forest.
A boy holds dead doves in one hand and a gun in the other. (Archive Photos)
Left in a remote corner of the tribe’s territory, these young men were expected to support themselves for a winter by foraging and hunting prior to entering the society of men (Simmons 1986, 47). To be a good hunter among most North American Indian peoples was, quite simply, to be a valued member of a tribe. Hunting was not as highly valued a skill among colonists as among Indians, yet in some frontier areas—especially where Finnish and Celtic immigrants settled—a boy’s first hunt for deer may have served as something of a rite of passage between boyhood and manhood. It would be misleading to say that this was universally true, however, since many backwoods settlers thought of hunting as a mundane affair, a way to obtain food rather than a prescription for manhood.
Though a few settlers seem to have hunted as a way of life, most placed a higher priority on setting up a farm. Indeed, it was cultivation that, according to Protestant divines and Enlightenment thinkers, gave Europeans the right to take land from Indians, who were mere hunters. By extension, Europeans who hunted as a way of life were thought to be savage men who respected neither law nor property. Significantly more evidence for hunting as a rite of passage exists for the early national and antebellum eras. Thomas Jefferson, for instance, wrote approvingly in 1814 that his young neighbor, Meriwether Lewis, had habitually hunted raccoons and opossum when he was a boy. “In this exercise,” Jefferson wrote, “no season or circumstance could obstruct his purpose, plunging thro’ the winter’s snows and frozen streams in pursuit of his object” (Jackson 1978, 593). This experience, according to Jefferson, had prepared Lewis to explore the continent as an adult. Jefferson’s words indicate that old prejudices against hunting and hunters were giving way as new definitions of manliness and empire appeared. By the time of the Lewis and Clark expedition, hunting had begun to compete with farming in the American imagination as a symbolic means of taking possession of the continent. As if taking cues from Jefferson’s encomium to Lewis, any number of nineteenth-century memoirists detailed their boyhood hunting experiences. Henry David Thoreau recalled that, when he was a youth, almost every boy had “shouldered a fowling-piece between the ages of ten and fourteen.” Hunting, he added, had been one of the best parts of his education and had given him his “closest acquaintance with Nature” (Thoreau 1962, 207–208). To parents who asked whether they should allow their sons to hunt, Thoreau responded in the affirmative.
Hunting, indeed, seems to have gained rather than lost popularity as the United States became more “civilized” and affluent. Contributing to hunting’s improving reputation was the rising popularity of hunter heroes like Daniel Boone, who seemed to exist in an autonomous realm, separated from kin and community. One Boone biography—Timothy Flint’s Biographical Memoir of Daniel Boone, the First Settler of Kentucky—went through fourteen printings between 1833 and 1868, despite competition from another half-dozen book-length biographies of Boone that appeared in the same years. James Fenimore Cooper’s Leatherstocking tales, which depicted the adventures of Natty Bumppo, an American frontiersman, and his idealized Native American companions, also sold widely in this era. Judging from the number of nineteenth-century sport hunters who recalled having read this literature in their youth, it appears that tales of Boone and Leatherstocking had a profound psychological impact on white males who grew up during the market revolution (ca. 1820–1850). The proudest moment of his life, recalled the most popular Gilded Age novelist Ned Buntline (pseudonym for Edward Judson), was when he received a gun from his father as a present for his eighth birthday. Young Buntline hoped to follow the “illustrious example of Daniel Boone” by becoming a professional hunter, but his family’s move to Philadelphia prevented this (Pond 1919, 12). Buntline did go on, however, to become an avid sport hunter. Like Boone and Leatherstocking, young men of the market revolution era sought to become independent actors in an increasingly impersonal world.
Individual ambition and perpetual, almost nomadic movement, not geographical and social stasis or duty to community, became the organizing principles of society. It is important to note in this regard that both Boone and Leatherstocking, though they lived beyond the pale of institutions capable of guaranteeing good behavior, displayed internalized virtue. Boone, explained a Protestant minister in a lecture he gave in New York in the antebellum era, was “the one white man who dares to trust himself alone with nature” (Milburn 1857, 28). Like the ideal middle-class man, hunter heroes—with the notable exception of “half horse, half alligator” Davy Crockett—were models of good conduct, not demi-savages of the frontier. The Boone and Leatherstocking literature, coupled with the nascent enthusiasm for sport hunting in the antebellum era, set a pattern for the rest of the nineteenth century and perhaps the twentieth, too. By the time of the Civil War, for white boys in both the North and the South, hunting had become a rite of passage between a dependent boyhood (which had become feminized because mothers were the principal childrearers) and the rugged independence of manhood. “The youth,” explained a young hunter in 1851, “never tastes the enjoyment of absolute independence, just as he does when standing on the mountain’s brow, conscious of strength and exhilarated by strenuous exercise, grasping in his hand a rifle that he can trust, and knows how to use. Let some worthy game lie dead at his feet, and his proud feeling of self-reliance is complete” (Proctor 1998, 82). In the Gilded Age, hunting was a ritual not only of autonomy but also of nativism. Hunting seemed to set apart morally and physically healthy white American boys from boys of other races and nations.
In Thomas W. Knox’s 1881 book The Young Nimrods of North America, for example, two city boys embark on adventures in a dangerous world filled with Irishmen, lumberjacks, and Indians. Hunting and fishing all the while, the boys—like Lewis and Clark—journey from Atlantic to Pacific, escorted by their chaperone, “the Doctor,” who gives them lessons in natural history. Like others of the genre, Knox’s book was not a lurid dime novel; it was intended to “be unexceptionable in point of morals” so that it could “be freely placed in the hands of the youth all over the land” (Knox 1881, Preface). With some 250 illustrations, in addition to a gilded cover and spine depicting a moose, a fish, a hare, an owl, a fox, an antelope, an Indian with a bow, and a charging buffalo, the book was clearly meant for children of parents with means. Other hunting novels for boys included Charles Austin Fosdick’s Sportsman’s Club Series, Rod and Gun Series, and Boy Hunter Series (written under the pseudonym Harry Castlemon); Edward S. Ellis’s Deerfoot Series and Young Pioneer Series; and works by George Bird Grinnell, William Temple Hornaday, Emerson Hough, and Stewart Edward White. Meanwhile, psychologist G. Stanley Hall updated Enlightenment ideas of social evolution by positing that humans, like societies, experience distinct stages of development. In one of these stages, wrote Hall, “the child revels in savagery.” Only by encouraging their children’s “tribal, predatory, hunting, fishing, fighting, roving, idle playing proclivities” could parents assure them of graduating to a higher stage and becoming happy and productive adults. Without these outlets for their youthful energies, children would lose interest in life, developing “weakness of character” and “slowness of intellect” (Bederman 1995, 90).
In order to promote (and capitalize on) the child’s need to revel in “savagery,” Ernest Thompson Seton founded the Woodcraft Indians, and Daniel Beard established the Sons of Daniel Boone. In these organizations, along with the Boy Scouts, which soon subsumed both in 1916, American boys were taught the skills of tracking, trapping, and taxidermy. Elsewhere, boys learned the “elevating” ethic of sportsmanship, which had been “the training school of the greatest nations of ancient and modern times,” according to The American Field magazine. “The man who wishes his boy to get the most benefit from his boyhood, in the way of preparation for later life,” the Field’s editor wrote in 1904, “will . . . give him an insight into its purest and most remunerative pleasures, by putting into his hands a gun, rifle, or rod” (“Early American Impressions” 1904, 389). Putting a rifle in the hands of a boy became ever less expensive as gun prices fell dramatically after the Civil War and again during and after World War I. Between 1910 and 1920, the cost of producing a gun fell by 50 percent. By 1945, about one-quarter of the adult male population of the United States engaged in sport hunting. It is difficult to say how many youths hunted, but the numbers must have been great. Hunting had become a venue for the demonstration of manliness, Americanness, and patriotism. Since the Vietnam War, however, hunting has dropped steadily in popularity. Partly this shift has occurred as sport hunting has become identified with rural and blue-collar men rather than the social elite, and partly it has reflected humanitarian and environmental concern about killing animals. As early as the seventeenth century, popular conduct books for young men had recommended against “excessive indulgence” in hunting, as this was said to create habits of cruelty and idleness.
Hunting teenth century, popular conduct books for young men had recommended against “excessive indulgence” in hunting, as this was said to create habits of cruelty and idleness. Such thinking was fairly common among elite Americans of the eighteenth century, declined in the nineteenth century, and has reappeared again among the American middle class in the twentieth century. As current debates over children and guns indicate, middle-class parents are more concerned today with domesticating their children than with indoctrinating them with hunting lore. The idea that Americans are, or should be, a hunting people and that American boys should know how to hunt, however, continues to appear in sporting magazines and books that, to this day, have a large market in the United States. Daniel J. Herman See also Boy Scouts; Fishing; Guns; Native American Boys References and further reading Bederman, Gail. 1995. Manliness and Civilization: A Cultural History of Gender and Race in the United States, 1880–1917. Chicago: University of Chicago Press. Cartmill, Matt. 1993. A View to a Death in the Morning: Hunting and Nature through History. Cambridge, MA: Harvard University Press. “Early American Impressions.” 1904. The American Field: The Sportsman’s Journal 61, no. 17 (April 23). Herman, Daniel Justin. 2001. Hunting and the American Imagination.
Jackson, Donald, ed. 1978. Letters of the Lewis and Clark Expedition. Vol. 2. 1962. Reprint, Urbana: University of Illinois Press.
Knox, Thomas W. 1881. The Young Nimrods of North America: A Book for Boys. New York: Harper and Brothers.
Marks, Stuart A. 1991. Southern Hunting in Black and White: Nature, History, and Ritual in a Carolina Community. Princeton, NJ: Princeton University Press.
Milburn, William Henry. 1857. The Rifle, Axe, and Saddle-Bags, and Other Lectures. New York: Derby and Jackson.
Petersen, David, ed. 1996. A Hunter’s Heart: Honest Essays on Blood Sport. New York: Henry Holt.
Pond, Fred E. (Will Wildwood, pseud.). 1919. Life and Adventures of “Ned Buntline” with Ned Buntline’s Anecdote of “Frank Forester” and Chapter of Angling Sketches. New York: Cadmus Book Shop.
Proctor, Nicholas Wolfe. 1998. “Bathed in Blood: Hunting in the Antebellum South.” Ph.D. diss., Emory University.
Simmons, William S. 1986. Spirit of the New England Tribes: Indian History and Folklore, 1620–1984. Hanover: University Press of New England.
Smith, John. 1907. The Generall Historie of Virginia, New England and the Summer Isles Together with the True Travels, Adventures and Observations, and a Sea Grammar. Vol. 1. Glasgow: J. Maclehose and Sons.
Thoreau, Henry David. 1962. Walden, or, Life in the Woods. New York: Time.
[Note: Portions of this entry appear in Daniel Justin Herman, Hunting and the American Imagination (Washington, DC: Smithsonian Institution Press, 2001).]
I

Ice Hockey
Ice hockey is known in sports circles as the “world’s fastest game” and is played on an enclosed frozen surface by two teams of six skaters, including a goaltender, two defense players, and three forwards. It has also been experienced, historically, as a game shared by fathers and sons, a symbolic and nostalgic place to spend time together. Hockey emerged in the late nineteenth century as a variation of the older sports of bandy, shinty, Irish hurley, and field hockey. The professional game became very popular in Canada and the northern United States in the early twentieth century; it was rough, sometimes violent, and drew its spectatorship from all classes of people. In both urban and rural Canada and the northern United States, as in other sports of the era such as football and boxing, hockey was imbued with social values associating physical prowess, technical skill, and toughness with a respected sense of manhood. Presumably, hockey trained boys to become men. Playing hockey for a team was a rite of passage for boys, while for towns and cities, hockey team victories became markers of success and urban prosperity. The National Hockey League (NHL) became the dominant hockey league of the twentieth century and successfully weathered the challenges of rival leagues. With the help of communications media such as newspapers, later radio, and then television networks such as the Canadian Broadcasting Corporation, the NHL established a monopoly over the pool of players, the entertainment product, and hockey culture itself. The sport still has the potential to define many father-son relationships. The game remains vital in the cultural fabric of many communities, where it is a Saturday night family ritual to watch professional hockey on television, from the exhibition games of September to the crowning of the Stanley Cup champion in early summer. Satellite television and the marketing of star players have brought some popularity to ice hockey in the southern United States, and the merchandising of hockey paraphernalia has made the sport a multimillion-dollar entertainment industry. The first records of informal games of hockey came from the Canadian cities of Kingston, Dartmouth, and Montreal and from the regions of New England and New York before the middle of the nineteenth century. By the 1880s, when middle-class Canadian men promoted the British model of amateur sport, ice hockey was being played with some regularity in the cities of Montreal, Ottawa, Quebec, and Toronto. Hockey was a game that could be played in both rural and urban areas.
A boy playing ice hockey on a frozen lake (Skjold Photographs)
Indeed, with the growth of leagues, professional associations, and media coverage of the big leagues and championship games, a model was provided for young boys who spent the winter months playing outdoors. Following the traditional values once espoused by British educators, Canadian boys were encouraged to take up sport for fun but, more importantly, for the development of good character. In Canada and the northern United States, frozen ponds, rivers, and lakes provided ready-made ice hockey facilities for boys and young men who already considered professional hockey players to be local and national icons. One only needed a pond, a stick, a puck, and a pair of skates. However, boys also played variations of a
“pickup” game called “shinny,” in which no equipment, referees, or even goaltenders were required. The skill emphasis was on fast skating, stick handling, and passing, and since few restrictions or bylaws were enforced on rural ponds, the boys could play for hours without interruption, until cold feet brought the game to a close. The games played by thousands of boys across North America during the winter months were all about fun and camaraderie, but connections between sport and identity went much deeper. As part of men’s upbringing and part of what was considered normal socializing, sports in the early twentieth century tended to sustain social connections between men. It was common social practice for men to be knowledgeable about professional sports such as hockey and to actively support local teams. Women and girls were welcome to attend, but for the most part, hockey was considered to be a part of masculine subculture, where boys and men shared solidarity in an understanding of rough, physical practices and together celebrated manhood through vicarious participation in the sporting escapades of talented athletes. Sharing the sport of hockey through participation or spectatorship provided a comfortable focal point or common ground in the relationship between father and son. And the same argument can be made for the sport of baseball in the summer months. These important social connections between boys, and between boys and their fathers, lent more social weight to the thrills and dedication of play on the pond. As in the case of other organized sports, which became spectator events in the late nineteenth and early twentieth centuries, town boosterism, civic pride,
and later profit ushered in the era of professional players and leagues. These factors made the outcomes of games even more significant for those who were active boosters in their hometowns and for those who followed the emerging National Hockey League in the sports pages. Town rivalries contributed mightily to the popularity of early hockey, but the game was also popular because of the speed, shooting, and passing of players and the excitement generated by rough play. Hard body checking was common, along with fistfights and the use of the stick to inflict physical harm. Judges preferred to permit ice hockey participants to police themselves. Thus, it was commonly held that the law had no place in the hockey rink. Players were seriously hurt and sometimes killed during games, but generally if players were charged with assault, manslaughter, or even murder, the charges were either reduced or dropped altogether. Provided that both players in an altercation were willing participants, defending oneself or one’s team was considered a point of manly honor, if not an integral aspect of competition. As such, the culture of hockey promoted by the NHL and the media celebrated a tradition of rough masculinity that was considered to be both acceptable and necessary. The NHL was also a model for boys’ hockey. The junior league teams, which were an extensive feeder system to the NHL, were controlled from top to bottom by the professional hockey league. This self-enclosed subculture of men and boys normalized particular social values. Because of the perceived cultural importance of professional hockey heroes who were all men, most boys grew up with dreams of playing hockey in the big leagues, and fathers who shared these values and followed hockey as a social
pastime were certain to encourage such dreams, if not to pressure their sons into playing. Fathers and sons and sometimes entire families followed the fortunes of the “Original Six” NHL hockey teams: the Toronto Maple Leafs, Montreal Canadiens, Boston Bruins, New York Rangers, Chicago Blackhawks, and Detroit Red Wings. These teams vied for the Stanley Cup, which was the hockey championship of North America. Newspaper reports and carefully constructed stories and later the radio broadcasts of NHL games made cultural icons and heroes out of “ordinary” men. Team rivalries were played up in media stories, with the result being greater gate receipts for owners. It was the advent of television, however, and its increasing consumer popularity in the 1960s that enhanced the sport economically and secured the place of professional hockey in consumer culture. NHL expansion through the late 1960s, 1970s, and 1980s brought hockey to many parts of the United States where hockey had not previously been played. Oakland and Los Angeles, for example, did not have the hockey history and the traditional winter following of the northern states and Canadian provinces. However, owners were willing to risk opening up new markets for hockey in places where fathers and sons had been sharing the social experiences of major league baseball, football, and basketball. International hockey has also played a role in fostering interest in participation and spectatorship. Canadian boys of the 1950s and 1960s, thanks to their upbringing, knew that Canada had enjoyed international hockey supremacy since the 1920s in Olympic Games competition and at world championships. Canadian fans generally considered hockey to be
the national sport and expected victory in international ice hockey competitions. Since sports were one of the major cultural signifiers of the Cold War period, a focal point for the symbolic competition between communist and capitalist nations, the Summit Series of 1972 featuring Canada and the Soviet Union was a defining moment for Canadian hockey and for the boys and men who identified themselves with hockey culture for generations. The Soviet challenge to Canadian talent was also a symbolic challenge to Canadian manhood and suggested that Canadian hockey supremacy was, in part, a cultural fantasy that was not entirely consistent with the state of international hockey. The series became a watershed event for the migration of European hockey players to North America to join the professional ranks of the NHL and the minor leagues of junior hockey. The United States also had a very successful record at the Olympic Games, winning the gold medal in 1960 at Squaw Valley in California. However, the defining moment for United States hockey came in 1980 at the Olympic Games in Lake Placid, New York. Called the “Miracle on Ice,” the U.S. victory over the Soviet Union was an important identity marker for boys who looked to the Olympic Games for inspiration and patriotic pride. Generally, for boys in the United States, ice hockey had always been less popular than baseball, football, and basketball. When news of the hockey victory over the Soviets emerged, 40 million Americans tuned in to watch the game on tape delay. The influence of professional hockey and international championships has undoubtedly had a direct impact on how boys in Canada and the United States understand the game, participate in it, or
consume it as spectators. Over the decades of the second half of the twentieth century, in weekend and evening road hockey games, thousands of boys and young men played out dreams of star players and favorite teams. Road hockey, a spontaneous urban form of dryland shinny that did not require skates, mimicked the passing, shooting, and stick handling of the on-ice game and could be played on any paved street or parking lot with old hockey sticks and tennis balls. In the last two decades of the twentieth century, the advent of roller blades, or inline skates, made road hockey look more like the pond shinny of an earlier era. More formal inline hockey leagues created opportunities in all regions, north and south, for boys to play hockey without ice. Yet the professional roller hockey teams were staffed by off-season ice hockey players, and ice hockey remained the iconographic model for the summer or southern versions of the sport. Ice hockey is still promoted through the star player system and the glamour of the Stanley Cup playoffs, and the lure of the NHL continues to play upon the hopes and dreams of American and Canadian boys and, more recently, girls. Collector hockey cards, autographed sticks, and sweaters still trade furiously in stores and even more fervently on the Internet. Hockey paraphernalia has monumentalized the names of Gordie Howe, Bobby Orr, Bobby Hull, Maurice (Rocket) Richard, and more recently Wayne Gretzky and Mario Lemieux. Some of the former star players have become owners, whereas others continue to promote the game through children’s camps and by donating equipment to underprivileged children. Equipment, ice-making technology, increasing player size and strength, new
coaching techniques, and exercise physiology have changed the sport in the past half-century. Chronic and catastrophic injuries, concussions, spinal cord trauma, and lacerations are still prevalent at all levels of the sport in spite of increased safety awareness programs, coaching education, and an increased judicial interest in the sport. Even after a surge in girls’ and women’s participation, the sport continues to be viewed as a rite of passage for young boys and a valid, systematic technique of promoting character development and fair play through ritualized aggression. Offshoots of the sport have been ringette, played with a bladeless stick and ring on skates, roller hockey, and the recreational activity of roller blading, or inline skating, using boots with four inline wheels. Road or street hockey, also an offshoot, played with varying rules, hockey sticks, and a ball, continues to be inexpensive and popular. Even at this level, among boys and girls and fathers and mothers, NHL team identifications are a prominent feature.

Kevin B. Wamsley
References and further reading
Burstyn, Varda. 1999. The Rites of Men: Manhood, Politics, and the Culture of Sport. Toronto: University of Toronto Press.
Cruise, David, and Alison Griffiths. 1992. Net Worth: Exploding the Myths of Pro Hockey. Toronto: Penguin Books.
Gruneau, Richard, and David Whitson. 1993. Hockey Night in Canada: Sport, Identities, and Cultural Politics. Toronto: Garamond Press.
Kidd, Bruce. 1996. The Struggle for Canadian Sport. Toronto: University of Toronto Press.
Kidd, Bruce, and John Macfarlane. 1972. The Death of Hockey. Toronto: New Press.
Mason, Daniel, and Barbara Schrodt. 1996. “Hockey’s First Professional Team: The Portage Lakes Hockey Club of Houghton, Michigan.” Sport History Review 27: 49–71.
Metcalfe, Alan. 1987. Canada Learns to Play: The Emergence of Organized Sport 1807–1904. Toronto: McClelland and Stewart.
Nickerson, Craig. 1995. “Red Dawn in Lake Placid: The Semi-Final Hockey Game at the 1980 Winter Olympics as Cold War Battleground.” Canadian Journal of History of Sport 26: 73–85.
Simpson, Wayne. 1987. “Hockey.” Pp. 169–229 in A Concise History of Sport in Canada. Edited by Don Morrow, Mary Keyes, Wayne Simpson, Frank Cosentino, and Ron Lappage. Toronto: Oxford University Press.
Illegal Substances

Although cardiovascular disease and cancer are the leading causes of death in adult men aged twenty-five and older (accounting for a combined 66 percent of deaths in 1999), death among adolescent boys and children more often results from preventable causes such as motor vehicle crashes (31 percent), homicide (18 percent), suicide (12 percent), and other injuries (11 percent) (CDC 1999). The link between youth mortality, risky behaviors, and substance abuse is well documented. Keeping boys (and the people around them) off drugs saves lives. Boys experiment with numerous illegal substances, but the two most common drugs of choice are marijuana and cocaine. The main psychoactive ingredient in marijuana is tetrahydrocannabinol (THC), a “mind-altering” drug. The most commonly used illegal drug in the United States, marijuana is commonly referred to as “pot,” “grass,” or “weed.” Cocaine is a powerful stimulant that affects the central nervous system.
A teenager rolls a joint. (Skjold Photographs)
Its users experience a sense of euphoria (or high), increased energy, and mental alertness. It may be inhaled, injected, or smoked (in the form of “crack”). People have experimented with many drugs to find relief from physical pain, even before recorded history began. Marijuana has been around for more than 5,000 years. First grown in Asia, it quickly found markets worldwide. It was first used in America for medicinal purposes, but recreational drug use overshadowed any health uses. Marijuana had been grown in America since the early 1600s, and its use was legal. By the mid-1800s, it was associated with “undesirables,” including poor Americans, African Americans, and immigrants. The negative associations made marijuana the ideal recreational drug for boys and men who wanted to rebel
against the system. Although Euro-Americans regularly used the drug, marijuana was considered an urban problem, a racial problem, and a criminal problem. Once drug use became more common among middle-class Euro-Americans, it also became a “mental health” problem. Drug use was widespread during Prohibition. After its repeal, many states proscribed marijuana; by 1937 marijuana use was illegal in all states. Because marijuana has some medicinal value and is widely used, numerous attempts have been made since 1980 to legalize or reclassify the drug. Both the Drug Enforcement Administration (DEA) and Food and Drug Administration (FDA) have rejected every attempt to date. Although marijuana use is not as common as that of legal “gateway drugs” (tobacco
and alcohol), 47 percent of boys currently attending high school have at least tried marijuana. According to the Centers for Disease Control and Prevention (CDC), boys are significantly more likely than girls to experiment with and habitually use marijuana. Past the experimentation stage, more than 25 percent of boys use marijuana more than once a month. Marijuana is readily available on school property. In 1999, about 35 percent of high school boys were offered, sold, or given illegal drugs at school (CDC 1999). Experimentation and habitual use do not significantly differ across ethnic groups, although Latino boys tend to experiment with marijuana at earlier ages than boys of other races (CDC 1999). Experimentation with marijuana and habitual smoking both increase with age. By the twelfth grade, more than 60 percent of boys have tried marijuana. More than 40 percent, however, have already experimented with the drug by the time they enter high school. Surveys in urban areas indicate that around 10 percent of boys had already tried marijuana by their middle school years, some beginning by the age of nine or ten (Chapin 2000). Early experimentation with illegal substances is the first step on the road to addiction. Both genetic and environmental factors contribute to addiction, which leads to health, psychological, and social problems and possibly death from a drug overdose. When marijuana is smoked, the THC is quickly absorbed into the brain and other body tissues from the lungs, so effects occur within minutes. Marijuana decreases short-term memory, impairs the ability to think quickly, and interferes with mind-body coordination. Continued use produces feelings of panic, paranoia, stress, and sensory distortion. Serious
health threats, including respiratory and immune deficiencies, come with long-term use. Even small doses affect motor skills long after the high has subsided, resulting in minor to fatal accidents. As for cocaine, in the late nineteenth century, it was legal to buy medicines containing the drug in stores or through the mail. The popularity of early soft drink giant Coca-Cola is not surprising, considering that Coca-Cola contained cocaine until 1903 (when it was replaced with another drug, caffeine). Bayer sold heroin around the same time as a powerful cough suppressant. The Pure Food and Drug Act of 1906 resulted in drastic market changes and the development of nonnarcotic pain relievers (like the Bayer aspirin). As with marijuana, the use of cocaine (and other narcotics) came to be associated with social undesirables. The advent of the “crack house” in the 1980s made a form of cocaine readily available to the poor and the young. (Crack house is a common term for places where people could meet, obtain an inexpensive form of cocaine called “crack,” and use the drugs there with the safety of a lookout; run-down or condemned buildings in poor sections of town are typically used for crack houses.) The widespread use of cocaine and its various forms in the 1970s and 1980s, cutting across class, age, and racial lines, resulted in a series of educational campaigns, stricter enforcement (the “war on drugs”), and long-term penalties for users and dealers. Nearly 10 percent of boys currently in high school have already experimented with cocaine. Boys are significantly more likely than girls to experiment with or regularly use cocaine, and nearly 5 percent of boys use the drug more than one time per month (CDC 1999). Unlike marijuana
use, there are ethnic differences in cocaine use. Latino students (18.3 percent) and Euro-American students (11 percent) are more likely to try cocaine than their African American peers (2.8 percent). Not surprisingly, the same pattern emerges in habitual cocaine use. Latino (8 percent) and Euro-American (5.3 percent) boys are more likely than African American boys (1 percent) to use cocaine on a regular basis (CDC 1999). Consistent with all substance abuse patterns, cocaine use increases with age. It also varies greatly from state to state and region to region. Cocaine use causes damage to the blood vessels, heart, respiratory system, and brain. The first use creates an intense craving for more of the drug. Even occasional use may result in heart attack, seizure, respiratory failure, or stroke. Frequent users experience mood swings, paranoia, weight loss, and loss of sex drive. Tobacco and alcohol are commonly referred to as “gateway drugs.” Their use frequently predicts experimentation with illegal drugs and later substance abuse. In a recent survey of middle school students, 30.9 percent of boys smoked, 54.5 percent drank, and 9.1 percent had already experimented with illegal drugs. Of the drug users, all reported also smoking and drinking. Of the students who never smoked cigarettes or consumed alcohol, none had experimented with illegal drugs (Chapin 2000). Many boys use inhalants as an additional gateway to illegal substances. According to the CDC, nearly 15 percent of boys have sniffed glue, breathed the contents of aerosol spray cans, or inhaled paints or sprays with the intention of getting high. About one-third of the boys who experiment with inhalants become regular users. Use of inhalants is difficult to track statistically because surveys rely on self-reported data.
Unless immediate health problems or accidents occur, inhalant use often goes unnoticed and unreported (CDC 1999). Euro-American (16.4 percent) and Latino (16.1 percent) boys are more likely than their African American peers (4.5 percent) to experiment with inhalants. Similarly, Latino (4.9 percent) and Euro-American boys (4.1 percent) are more likely to report using inhalants more than one time per month than their African American peers (2.3 percent). Inhalant use also varies greatly from state to state and region to region (CDC 1999). Unlike patterns for most other substances, inhalant use tends to peak in the late middle school and early high school years and then decrease by the end of the high school years. Unfortunately, inhalants are frequently replaced with marijuana or cocaine or both (CDC 1999). Drug addiction is considered a brain disease. Virtually all American boys are faced with decisions about whether to smoke, drink, or take drugs. Some may have a genetic vulnerability to drug use. Other factors such as personality, family, peer pressure, economics, and environment increase or decrease the risks of a person developing an addictive disorder. Although more questions than answers remain about the causes of illegal substance abuse by adolescent boys, some common elements have emerged from the research: Cigarettes addict adolescents and lead to the use of other drugs. Adolescents learn to abuse alcohol from their parents. They also underestimate the risk of addiction. The combination of peer, parental, and environmental influences encourages early experimentation with legal and then illegal substances, especially among adolescent boys. Most people involved in drug recovery (more than 80 percent) receive outpatient
services, including individual or group counseling. About 10 percent receive twenty-four-hour treatment in a hospital, residential facility, or correctional institution. Detoxification (or “detox”) is frequently a first step but is not considered a treatment per se. Substance abusers who begin and end treatment with detox have little chance of beating their addiction. National studies of drug treatment facilities have shown major reductions in substance abuse and criminal activity, but many patients quickly return to their addictions following treatment (Mooney 1999).

John Chapin

See also Smoking and Drinking

References and further reading
Babbit, Nicki. 2000. Adolescent Drug and Alcohol Abuse: How to Spot It, Stop It, and Get Help for Your Family. Sebastopol, CA: O’Reilly.
Burnham, John. 1993. Bad Habits: Drinking, Smoking, Taking Drugs, Gambling, Sexual Misbehavior, and Swearing in American History. New York: New York University Press.
CDC (Centers for Disease Control). 1999. “Division of Adolescent and School Health’s Information Service Report.” Silver Springs, MD: Government Printing Office.
Chapin, John. 2000. “Third-Person Perception and Optimistic Bias among Urban-Minority ‘At-Risk’ Youth.” Communication Research 27, no. 1: 51–81.
Gall, Timothy, and Daniel Lucas, eds. 1996. Statistics on Alcohol, Drug and Tobacco Use. Detroit: Thompson.
Hanson, Glen, and Peter Venturelli. 1995. Drugs and Society. 4th ed. Boston: Jones and Bartlett.
Klier, Barbara, Mark Siegel, and Jacquelyn Quiram, eds. 1999. Illegal Drugs: America’s Anguish. Wylie, TX: Information Plus.
Mooney, Cynthia, ed. 1999. Drugs, Alcohol and Tobacco: Macmillan Health Encyclopedia. New York: Macmillan.
Pacula, Rosalie. 1998. Adolescent Alcohol and Marijuana Consumption: Is There Really a Gateway Effect? Cambridge, MA: National Bureau of Economic Research.
Rudgley, Richard. 1994. Essential Substances: A Cultural History of Intoxicants in Society. New York: Kodansha International.
Siegel, Mark, Alison Landes, and Nancy Jacobs. 1995. Illegal Drugs and Alcohol: America’s Anguish. Wylie, TX: Information Plus.
Winters, Paul, ed. 1997. Teen Addiction. San Diego, CA: Greenhaven Press.
Immigrants

Because historians of immigration have not paid much attention to boys—or girls, for that matter—much of this entry is impressionistic. Since the late nineteenth century, contemporary reformers and subsequent scholars have studied immigrants and their children without much differentiation as to place of birth or gender. Although more boys than girls have been immigrants to America, the percentage of child immigrants has always been quite low. There are no good data for the colonial and early national eras, but all authorities agree, as a 1920 census monograph put it, that “very few children under 14 or 16” immigrated to the United States. For the decade 1910–1919, for example, immigrants under fourteen years of age represented just 13.4 percent of the total, yet among the “native whites of white parentage” living in the United States in the same decade, young persons under fourteen years of age constituted 35.6 percent of the population. This pattern is not surprising; immigration, except for that of refugees, has been primarily an activity of young adults. Generally the ratio of immigrant boys to immigrant girls has been closely balanced, with a slight preponderance of boys.
An immigrant family looks at the Statue of Liberty from Ellis Island. (Library of Congress)
For the 1910–1919 decade, the ratio was 101.7 to 100; in 1997, a representative recent year, it was 101.2 to 100. In that year 79,006 boys under fourteen were recorded as entering immigrants, or 9.9 percent of the total (Hutchinson 1956; U.S. Immigration and Naturalization Service 1999, Table 12, 52). There is only scattered information about the numbers of boy immigrants in the colonial era. In one group of early voyages to New England, there were 176 boys under eighteen among 996 passengers. In the colonies south of New England, where there were higher numbers of indentured servants and slaves and fewer families immigrating, the percentage of immigrant boys would have been smaller. The vast majority of indentured servants were young adults and worked for their masters for four to seven years. However, indentured boys could be held for longer terms, often until they were eighteen or twenty-one years of age. Among poor German immigrants to Pennsylvania and Maryland in the late colonial and early national periods, many families arranged to have some of their sons indentured to pay for their passage to America. Once they were established, these immigrant Germans often intended to buy out or redeem part of their sons’ terms of service. Because more than 90 percent of the population was rural, most immigrant boys, like other American boys, worked on farms or in home-based handicrafts. A smaller number were apprenticed in a variety of urban and maritime trades. Because of the shortage of adult labor in the colonies, the labor of boys was highly valued. As Adam Smith observed in Wealth of Nations (1776), in America “the labor of each child, before it can leave [its parents’]
house is computed to be worth a hundred pounds pure gain” (1937, 70–71). In the nineteenth century, more and more Americans—and an even greater proportion of immigrants—lived in cities and worked at urban occupations. Much is made in American legend of immigrant boys who succeeded. There were some spectacular successes, of whom Andrew Carnegie (1835–1919) is the most celebrated example. His life was and is often used as the real-life exemplar of the “rags to riches” Horatio Alger story, although most of Alger’s formula heroes were native-born and gained only a respectable middle-class competence. Carnegie’s all but incredible rise from “bobbin boy” to “the richest man in the world” is at the opposite pole from the usual immigrant boy experience. Brought to the United States at age twelve, Carnegie had some formal education and good handwriting, and he grew up in an intellectually inclined Scottish family of hand-loom weavers who had fallen upon hard times. Most nineteenth-century immigrant boys did not come to America speaking English, much less writing it, and typically received less education than their younger, American-born brothers. Later in the century, when some states began to enact compulsory school attendance laws, the lack of birth records for immigrant boys made it easier for parents who wished to do so to evade those laws, which were not strictly enforced in any event. A recurrent motif of ethnic American autobiographies describes older male siblings whose opportunities for education were sacrificed as they were sent to work at an early age to help support their parents and enable their younger brothers to attend school. Most immigrant boys in the nineteenth century came with families or at
least an older relative, but thousands of particularly vulnerable boys, largely from Italy and Greece, were brought to the United States in the charge of a master usually described with the Italian term padrone. Although some boys were apparently kidnapped, others had been indentured to the padrone by needy parents or guardians who remained in Europe. Such indentures were legal in Europe but not enforceable in American courts. Under the padrone system, boys went out daily to peddle, beg, or perform on the streets; indentured boy workers also shined shoes, sold peanuts and candy in theaters, and occasionally worked in factories. The most detailed description of these boys is The Little Slaves of the Harp by John Zucci (1992), which is a good antidote to Alger’s mawkish novel Phil, the Fiddler (1872). Boys in the padrone system were usually poorly clothed, poorly fed, and sometimes subjected to physical abuse. The often sensational press coverage of their deplorable living conditions was an impetus to the various child-saving crusades of the late nineteenth century and the long fight against child labor of all kinds up to and beyond the New Deal era. Immigrant boys are still exploited in American agriculture, although now they usually work in family units. Most twentieth-century immigrant boys came with their families, and it is difficult at this stage of scholarship to write meaningfully about them. The remainder of this entry will focus on certain types of unaccompanied boy immigrants and two celebrated boys whose immigration and status became front-page news. In the 1930s and 1940s, Adolf Hitler’s rise to power and the systematic persecution of Jews and others created large numbers
of refugees, most of whom could not find adequate asylum in the Western democracies. A special effort was made in the months before the outbreak of war in September 1939 to pass the Wagner-Rogers Bill, which would have brought 20,000 Jewish refugee children to the United States. It never even came to a vote in Congress, despite support from such notables as Eleanor Roosevelt and Secretary of Labor Frances Perkins, because congressional majorities were determined to keep barriers to immigration in place, especially against Jews. Perhaps 250,000 Jewish refugees, all told, came to the United States before, during, and after the war. No reliable data exist about the number of children among them, but they included two boys who later became cabinet officers: Henry Kissinger, secretary of state in the Nixon and Ford administrations; and W. Michael Blumenthal, secretary of the treasury in the Carter administration. In addition, several thousand unaccompanied English children, mostly middle-class, found temporary refuge from German bombs in individual homes all across the United States. Since it was assumed, correctly, that these English boys and girls would return to Britain, there was no significant opposition to their temporary asylum in the United States. In the immediate postwar years Americans devised a variety of refugee programs, including those for displaced persons, Hungarians, and Cubans. These Cold War programs were not specifically designed for children, but the programs did ultimately aid a number of boys and girls. Children have made up a higher proportion of refugees than of immigrants generally. A special Cold War refugee program, little publicized at the time, was dubbed “Operation Pedro
Pan.” Between 1960 and 1962, this program brought some 11,000 unaccompanied Cuban boys and perhaps 3,000 Cuban girls to the United States and Costa Rica. Although a few of the children were toddlers, about 60 percent were teenage boys. Most came via still-existing air links. Cuban parents apparently believed rumors, perhaps originated by the Central Intelligence Agency, that the Castro government would seize all children, separate them from their families, and keep them in special camps to turn them into dedicated communists. Bryan Walsh, an Irish-born priest who was in charge of Catholic charities in Miami, told a New York Times reporter in April 2000 that the U.S. government agreed to waive visas for the children as long as the program was administered by a nongovernmental agency, in this case the Catholic welfare apparatus. Most of the children came from Catholic schools and went to live in Catholic family homes, but about 500 were either Protestant or Jewish and were placed by the United Way and the Hebrew Immigrant Aid Society in families of appropriate religious backgrounds. Many if not most of the parents expected to be separated from their children for a relatively short time because they assumed that Castro would soon fall. Many parents eventually followed their children to the United States, but perhaps half were never reunited with their children. Charities placed most boys and girls in middle-class American homes in which Spanish was not spoken. Two immigrant boys were the focus of highly publicized court cases with very different results, which established certain parameters for the legal status of immigrant children and the rights of their parents. In 1980, when he was twelve
years old, Walter Polovchak and his two siblings came to Chicago with their parents, who were citizens of the Soviet Union. Five months later, when his parents decided to return to the Soviet Union, Walter and an older sister tried to defect and moved in with a cousin who lived in Chicago. When the parents complained, the police took Walter into custody, but on the advice of the Immigration and Naturalization Service (INS) and the State Department, he was not returned to his parents’ custody. He filed for asylum and was given resident alien status, and an Illinois trial court made Walter a ward of the court. An Illinois appellate court reversed this ruling, whereupon the State Department issued a “departure control order” that prevented Walter from being taken out of the country. Walter’s parents then filed a suit in federal court. Not until five years later, on July 17, 1985, as Walter approached his eighteenth birthday (October 3, 1985), which would end his childhood under both American and Soviet law and make the matter moot, did the district court reverse the previous rulings and find for the parents. The federal government appealed, freezing the district court’s order, and got the case argued before the Seventh Circuit Court of Appeals just fifty-four days later. The very next day, September 10, 1985, the appeals court reversed the lower court’s action, although it found that the government’s actions had violated the elder Polovchaks’ rights to due process. It remanded the matter to the district court for further action, which, in view of Walter’s approaching legal maturity, meant no action at all. Communist parents could not expect fair treatment during the Cold War. Fifteen years later the saga of six-year-old Elián Gonzales was played out more
quickly—between November 1999 and June 2000—but with even more publicity and high melodrama. Elián was the sole and seemingly miraculous survivor of a disastrous attempt to flee Cuba by his mother and others. After Elián was saved by a fisherman, officials placed him in the custody of a cousin who was a member of the Cuban exile community in Miami. Elián’s father, who had remained in Cuba, wanted the boy returned. The INS allowed the boy to remain with the cousin, but it supported the father’s claim and argued that a six-year-old did not have the capacity to apply for asylum, a view upheld by both a federal district court and a circuit court of appeals. In the end, Elián and his father, who came to the United States to press his claim, returned to Cuba amid national rejoicing there. However, there were bitter regrets in the Miami Cuban community, which had threatened to keep the boy in defiance of the courts until armed government agents seized him. In an earlier era, the Elián saga might have been played out differently. Interestingly, when reporters interviewed some Pedro Pan immigrants in 2000, they said that they thought it would be better for Elián to return to Cuba. Although those interviewed may not have been representative of the whole group, clearly some of them found permanent separation from their homeland and their families not a happy experience. One other type of unaccompanied boy immigrant merits attention: those adopted by American parents who, in most instances, were of a different ethnicity and/or race than their new children. Adoption of foreign children, which was statistically insignificant until after the Korean War of 1950–1953, boomed in the 1990s. In the first year of the decade, 7,093
adopted children entered; for 1999 the number was 16,369. Of the 12,596 adopted immigrants in 1997, 97 percent were under ten years of age, and nearly half—46.6 percent—were less than one year old. Only a little more than one-third—36.5 percent—of these adopted children were boys. The imbalance was due to the great preponderance of girls in Asian adoptions: 77.3 percent of the nearly 6,000 adoptions from Asia involved girls. For the rest of the world, the numbers were nearly equal. Table 1 shows the numbers of boys adopted from global regions and selected nations in 1997.

TABLE 1: Adoptions of Boys in 1997 by Region/Nation
Place               Number of Boys   Percent of Adoptions from Region/Nation
ASIA                         1,341   22.7
  Korea (ROK)                  800   53.1
  Vietnam                      134   36.3
  China (PRC)                   70    2.1
EUROPE                       2,406   48.9
  Russia                     1,783   49.2
  Romania                      263   47.1
NORTH AMERICA                  530   46.5
  Guatemala                    349   48.1
  Mexico                        67   47.2
  Haiti                         53   40.8
SOUTH AMERICA                  259   51.7
  Colombia                     119   56.1
AFRICA                          57   41.9
  Ethiopia                      20   39.2
OCEANIA                          1   33.3
Total                        4,594
Source: U.S. Immigration and Naturalization Service 1999, Table 15, p. 57.

Roger Daniels

See also Asian American Boys; Chinese American Boys; Mexican American Boys
References and further reading
Alger, Horatio. 1872. Phil, the Fiddler: or, the Story of a Young Street Musician. New York: Federal Book Company.
Bailey, Anthony. 1980. America, Lost and Found. New York: Random House.
Berrol, Selma C. 1995. Growing Up American: Immigrant Children in America Then and Now. New York: Twayne.
Conde, Yvonne M. 1999. Operation Pedro Pan: The Untold Exodus of 14,048 Cuban Children. New York: Routledge.
Hahamovitch, Cindy. 1997. The Fruits of Their Labor: Atlantic Coast Farmworkers and the Making of Migrant Poverty, 1870–1945. Chapel Hill: University of North Carolina Press.
Hutchinson, Edward P. 1956. Immigrants and Their Children, 1850–1950. New York: Wiley.
Saloutos, Theodore. 1964. The Greeks in the United States. Cambridge, MA: Harvard University Press.
Smith, Adam. 1937. Wealth of Nations. 1776. Reprint, New York: Modern Library.
Triay, Victor Andres. 1998. Fleeing Castro: Operation Pedro Pan and the Cuban Children’s Program. Gainesville: University Press of Florida.
U.S. Congress. House of Representatives. Committee on Immigration and Naturalization. 1939. Admission of German Refugee Children. Hearings before the Committee on Immigration and Naturalization, House of Representatives, 76th Congress, 1st Session on H.J. Res. 165 and H.J. Res. 168, Joint Resolutions to Authorize the Admission to the United States of a Limited Number of German Refugee Children. May 24–June 1, 1939. Washington, DC: Government Printing Office.
U.S. Immigration and Naturalization Service. 1999. 1997 Statistical Yearbook. Table 12, p. 52. Washington, DC: Government Printing Office.
U.S. Immigration Commission. 1911. Abstracts of Reports of the Immigration Commission. Washington, DC: Government Printing Office.
Zucci, John E. 1992. The Little Slaves of the Harp: Italian Street Musicians in Nineteenth-Century Paris, London, and
New York. Montreal: McGill-Queen’s University Press.
Indentured Servants

During the colonial period, indentured servants were a major part of European migration to British America, constituting nearly half of all white immigrants prior to the nineteenth century. In exchange for the cost of their passage, these servants had agreed to labor contracts, called “indentures” after the indented cut marks that identified the contract itself. Under these contracts they promised a fixed term of labor for a named master or his assigns. These young emigrants were primarily boys in their late teens and early twenties, and they comprised a significant part of the labor force that established tobacco as a profitable staple crop in the Chesapeake colonies. A shortage of these bound laborers in the Caribbean and the Chesapeake in the late seventeenth century led planters to substitute slaves for them, setting slavery on a firm economic footing in British America. When the flow of emigration resumed in the early eighteenth century, English indentured servants were joined by new streams of poor immigrants from Ireland and Germany, many of whom paid their passage fees as “redemptioners,” a new form of indentured labor. The stream of bound immigrant labor was now directed to new locations, including Pennsylvania and North and South Carolina. By the time of the American Revolution, however, indentured servitude had declined in significance because of a decrease in overall immigration to the British American colonies and increased reliance upon slave labor in the South and free wage labor in the northern colonies.
Indentured servitude was probably derived from early modern English farm servitude, a customary arrangement in which boys and girls above ten years of age would leave home to work for wages and room and board on a short-term contractual basis. After the founding of Virginia, a shortage of agricultural labor led the joint-stock company that founded the colony to experiment with various means of encouraging emigration to America. The high cost of transportation to the colony ultimately led to an adaptation of the farm servitude institution, in which a longer duration of service compensated the master for the high initial investment in the servant’s passage, and the transferability of that contract enabled colonial planters to rely upon agents abroad to secure their labor force. Contemporaries generally viewed indentured servants unfavorably. William Bullock, for example, described them as “idle, lazie, simple people,” while others perceived them as “convicts, paupers, and dissolute persons of every type” (Horn 1979, 56–57). Convicted criminals were, in fact, sent to the colonies as indentured servants, particularly after the Transportation Act of 1717 regularized the felonies for which the punishment was applicable and standardized the terms of servitude for these offenses. Although as many as 30,000 convicts were transported to the colonies in the eighteenth century, they were only a small portion of the total flow of indentured servant emigration. Indentured servants came from across the social spectrum of English society but for the most part tended to be ordinary, propertyless young men who were representative of the majority of the English population. As a result of population growth, agricultural modernization that reduced agricultural
labor demand, declining overseas trade, and a lack of work in cities and towns, these young unskilled workers, agricultural laborers, and craftspeople were forced to move throughout England in search of employment. The decision to journey across the Atlantic as an indentured servant was an extension of this pattern of migration in search of work. Very little is known about how indentured servants were recruited. Although seventeenth-century commentators feared the “Inveigling, purloining, carrying and Stealing away [of] Boys, Maides and other persons,” probably very few servants were actually kidnapped and taken to America (Horn 1979, 93). More likely, potential servants learned of the opportunity for work from recruiters, billboards, or word of mouth. Nonetheless, fears to the contrary led to the passage of laws requiring the registration of indentured servants and helped regularize and standardize contracts, protecting the emigrants. After servants were recruited, they signed the indenture contract, promising to serve their master in whatever employments he might assign, often in a specific colony, for a stated length of time—usually four to seven years. In return, the master promised to transport, clothe, and feed the servant and frequently to reward him or her at the completion of his or her term with “freedom dues”: a sum of money, clothing, land, or tools. After signing, servants made the nightmarish eight- to fourteen-week voyage across the Atlantic. Suffering from overcrowded conditions, inadequate or poor-quality provisions, and poor sanitation, many servants fell ill, and many died. As many as one-quarter of those transported to Pennsylvania, for example, did not survive the trip. Upon arrival, servants who survived the journey were subject “like a parcel of sheep” to the
servant market, in which prospective buyers inspected the human cargo and, if satisfied, bought the servants and took them home. After purchase, indentured servants became, essentially, the property of their masters. They could be bought and sold in almost every colony, were subject to corporal punishment, and could not engage in trade, vote, or marry without permission. Indentured servants did have certain rights that protected them against abuse and cruelty according to the custom of the particular colony, including limitations on whipping, the right to hold property, and the right to complain in court. These rights rarely conflicted with the master’s property rights, however; even when a master was charged with some abuse, he was frequently just admonished to refrain from the behavior. In the rare case when a cruel master did lose his servant, he recovered his costs by selling him elsewhere. Indentured servants did resist the harsh conditions of their servitude, but only very rarely as part of any organized rebellion. Much more frequently, disgruntled indentured servants simply ran away. The consequences could be harsh; if caught, servants would have the length of their term of service extended to compensate the master for their lost labor. If servants survived their terms of indenture, they began a new life in colonial society with very little other than their freedom dues. During the seventeenth century, simply surviving was no easy task; as many as 40 percent of Maryland’s indentured servants died of disease prior to completing their terms (Carr and Menard 1979). Despite their meager resources, however, freedmen in the first decades after settlement managed to attain a good degree of economic mobility. Nearly half
of Pennsylvania’s seventeenth-century freedmen acquired property, and their peers in Maryland prior to the 1660s managed to acquire considerable property and establish themselves as small planters, achieving status as masters of their own families and as officeholders in local government. Some even managed to become wealthy and achieve acceptance as part of the Chesapeake gentry. Although indentured servants emigrated to nearly every colony in the initial years of settlement, during these early years servants traveled primarily to the West Indies and then the Chesapeake, where demand for unskilled agricultural labor to clear forests and cultivate staple crops was highest. In the Chesapeake by the 1680s and even earlier in the Caribbean colonies, however, opportunities for freedmen declined as growth in the region’s staple crop ceased. As earlier emigrants’ own children became a significant part of the population, moreover, the status gap between freedmen and the rest of society widened even further, given the native-born children’s advantages in inheriting their parents’ property. Declining economic opportunity, in combination with a reduction in unemployment and rising wages in England, led to a decline in the number of people willing to emigrate as indentured servants to the Chesapeake. Planters desperate for agricultural labor turned instead to black slaves, following the example of the West Indian colonies. By the turn of the century, the institution of slavery had been established in the Chesapeake on a firm footing. By the early eighteenth century, the nature of the indentured servant population had changed. An even larger proportion of English indentured servants were male, and these youths on average
became both more likely to have occupational skills and less likely to have come from agricultural backgrounds. For the most part, those servants who traveled to the Chesapeake did not work in the fields, as they had during the previous century, but rather performed skilled labor on plantations where the unskilled fieldwork was done by slaves. By the 1720s an increasing proportion of indentured servants were attracted not to the Chesapeake but to the newer mid-Atlantic colonies and to North and South Carolina, where open land promised economic opportunity. In Pennsylvania in particular, a large proportion of the indentured servants served in an urban environment, especially after midcentury when Philadelphia’s growth and prosperity as a port town created labor demand among the city’s merchants and craftspeople. The English indentured servants were joined in the first quarter of the eighteenth century by a rising tide of Irish and German indentured emigrants, driven by depressed trade, famine, and, in the case of the Germans, the hardships of war. Those German servants emigrated under a slightly different labor arrangement that was necessitated by the different conditions of their immigration. Unlike the predominantly single, young, male British migrants, most Germans emigrated as families who were likely to have at least some small financial resources. Rather than binding themselves for a term of years prior to the journey, these emigrants partially paid for their passage, contracted to pay the balance in America, and were given a short period of time after arrival to find the balance. If the necessary funds could not be found, the shipper could then sell the emigrants,
usually for a period of time commensurate with the remaining debt. By the time of the American Revolution, indentured servitude was in decline. Recurring war throughout the eighteenth century periodically disrupted the flow of shipping to the colonies and with it the volume of potential emigrants. In the Chesapeake, planters began to substitute skilled slave labor for indentured white servants, whereas in the mid-Atlantic colonies farmers increasingly relied upon free wage laborers, who were better suited to the erratic seasonal demands of wheat cultivation. When depression struck in the wake of the Seven Years’ War (1756–1763), moreover, a rising number of impoverished Philadelphians provided a ready supply of cheap urban wage labor. Indenture of European immigrants persisted into the early nineteenth century, but it became numerically insignificant, both because of a decline in the total volume of immigration after the revolution and because of the increased reliance in the Chesapeake on black slave labor and in the North on free wage labor. Indentured servitude of native-born boys, however, persisted in the nineteenth century. As northern states gradually emancipated their slaves, an owner’s labor investment was protected by allowing him or her to retain slaves born before such laws went into effect and to keep in bondage until a specified age all children born to slave mothers thereafter. The result was a blurring of slave and indentured labor and a brisk market in black children not yet free who would serve for a large portion of their productive lives. Not only did indenture of free or manumitted children provide a source of cheap labor for white
masters, but the practice also was used by black parents emerging from slavery as they struggled to gain an economic foothold. In addition, indenture continued to be used, as it had been throughout the colonial period, to place orphaned or destitute children. Welfare agencies, public and private, indentured older boys placed under their care to work for farmers and sometimes for craftspeople throughout the nineteenth century.

Sharon Braslaw Sundue

See also Jobs in the Seventeenth and Eighteenth Centuries; Plantations; Slavery

References and further reading
Carr, Lois Green, and Russell R. Menard. 1979. “Immigration and Opportunity: The Freedman in Early Colonial Maryland.” Pp. 206–242 in The Chesapeake in the Seventeenth Century. Edited by Thad W. Tate and David L. Ammerman. New York: W. W. Norton.
Galenson, David. 1981. White Servitude in Colonial America: An Economic Analysis. Cambridge: Cambridge University Press.
Horn, James. 1979. “Servant Emigration to the Chesapeake in the Seventeenth Century.” Pp. 51–95 in The Chesapeake in the Seventeenth Century. Edited by Thad W. Tate and David L. Ammerman. New York: W. W. Norton.
Klepp, Susan, and Billy Smith, eds. 1992. The Infortunate: The Voyage and Adventures of William Moraley, an Indentured Servant. University Park: Pennsylvania State University Press.
Morgan, Edmund S. 1975. American Slavery, American Freedom. New York: W. W. Norton.
Salinger, Sharon. 1987. “To Serve Well and Faithfully”: Labor and Indentured Servants in Pennsylvania 1682–1800. Cambridge: Cambridge University Press.
Smith, Abbott Emerson. 1947. Colonists in Bondage: White Servitude and Convict Labor in America 1607–1776. New York: W. W. Norton.
Indians
See Native American Boys
Intelligence Testing

Intelligence is a social construction that can be defined as a social consensus arrived at either through a system of values or a formal test that indicates that an individual possesses certain qualities of value to the group. Intelligence may include the ability of an individual to learn particular tasks or concepts recognized as important on an informal or formal basis. Social determinations of intelligence change over time and take place in multiple contexts such as the family, school, and workplace. Conflicting concepts of intellectual ability can vary widely between different cultures and between different groups within any one society. Political, social, and economic institutions generally favor the perception of intelligence that is held by the dominant group, and it is this version of intelligence that is reinforced in institutions (Lemann 2000). In male-dominated societies, males have certain advantages over females in the process of being identified as intelligent. Males also face certain hazards because of expectations for their performance. This is also highly complicated by the politics of social class and ethnicity or cultural background. An individual can be designated as intelligent in one context, such as the home or workplace, and as unintelligent in another, such as the school. In the twentieth century, the ascription of intelligence was institutionalized as a demonstration of ability in school and often determined by a psychometric test. Formal intelligence tests are less than 100 years old, but they are the most influential of standardized testing procedures that try to create universal measures of human ability.

In England, the work of Charles Darwin (1809–1882) on evolution, based on the idea of the survival of the fittest (first published in 1859), offered a theoretical framework to investigate ability as innate. Sir Francis Galton (1822–1911), Darwin's cousin, constructed tests of moral traits and statistical measures of sensory perception in order to investigate differences between social classes. In 1890, at the University of Pennsylvania, James McKeen Cattell (1860–1944) introduced the term mental test and developed a battery of fifty instruments to measure intellectual performance. The varied performance of schoolchildren in a newly formed system of public education in France attracted the attention of the minister of public education. He commissioned Alfred Binet (1857–1911) and his colleague Théodore Simon to develop an objective means to identify children who were not likely to profit from normal instruction. In 1905 Binet completed a scaled set of tasks that were introduced as a test. A 1908 revision grouped Binet's scales by difficulty, and a 1911 version included the suggestion of German psychologist William Stern to divide the score by age. Charles Spearman introduced the idea that the tests reflected a general "g" factor of innate intelligence. In 1916 Lewis Terman (1877–1956), a psychologist at Stanford University, revised Binet's scales, including Spearman's notion of the "g" factor and the statistical device of dividing the score by age, which resulted in the familiar intelligence quotient score known as IQ. During World War I, under the direction of Robert Yerkes in the United States, group tests using the Binet scales were devised to test men drafted into the army. Terman subsequently adapted these tests, known as the Army Alpha and Beta Tests, for use in schools and renamed them the Stanford-Binet Intelligence Scales. In 1920, 400,000 of these new tests were administered to schoolchildren. Professional educators found the tests an efficient means to standardize instruction by identifying children according to their scores. By 1937 Terman and Maud Merrill's revision of the Stanford-Binet was the "golden" standard for IQ measurement (Chapman 1988).
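The ratio that Stern proposed and Terman built into the Stanford-Binet can be written out explicitly. The following is a minimal worked illustration of the quotient described above; the multiplication by 100, which converts the ratio into the familiar whole-number score, is the standard convention in the psychometric literature rather than a detail spelled out in this entry:

\[
\mathrm{IQ} = \frac{\text{mental age (scale score)}}{\text{chronological age}} \times 100
\]

By this formula, an eight-year-old who passes the tasks typical of a ten-year-old scores (10/8) × 100 = 125, while a child performing exactly at age level scores 100.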
The use and number of standardized tests increased exponentially after 1920. By 1945 more than 5,000 tests were available to educators. The Stanford-Binet, which focuses on general intelligence identified with verbal ability, was followed by the Wechsler Intelligence Scale for Children (WISC), which looks at both verbal and nonverbal aspects of intelligence. The idea that intellectual ability can take multiple forms created a surge of test development as well as controversy in the 1990s (Flanagan and Genshaft 1997, 146–149). Today, the most popular intelligence test, the third version of the Wechsler Intelligence Scale for Children (WISC III), uses multiple subtests under the theory that there are multiple intelligences. More than 100 million tests are given annually in the United States, of which 44 percent are intelligence tests. Testing continues to contribute to "tracking," or the sorting of students into different curricula, such as college preparatory or vocational education. Testing has been especially important in the United States, and it continues unabated in spite of criticism that the tests are unreliable because they are inconsistent in what they measure. Critics argue that even if the tests are internally reliable, the
scores do not reflect an individual's innate capacity to learn. A strong case can be made that test scores reflect cultural differences in ability or language proficiency that vary through socialization and social class, not heredity.

The cultural component of intelligence as measured by both the original and new tests is also an important factor if intelligence testing is examined in relationship to gender. Male socialization in patriarchal, class-based societies both trains boys for dominance and puts them at risk. The perception of intelligence is a central aspect of middle- and upper-class male privilege. In modern industrial and postindustrial societies, formal socialization takes place in school. Although girls do better in school than boys in the early grades, they fall behind in the upper grades. Even though there are few differences between males and females on intelligence tests and no reason to believe that gender is a factor in intelligence, females perform lower on all types of standardized achievement tests in the upper grades. Test results reflect differences in the classroom climate, which is female-centered in the elementary grades but male-centered in middle school and high school. Boys in the upper grades enjoy more access to educational opportunities and fields of knowledge such as math and science. This disparity reflects a male-oriented knowledge system in school that is based on formality and abstraction to the exclusion of narrative and contextual thinking. It also shows the effect of age as well as social class and gender, as boys and girls in the upper grades begin to conform to social expectations of smart, capable males and not-so-smart, dependent females. In fact, female achievement drops at age thirteen and reaches a plateau in secondary-level math, reading, social studies, and science. Challenging new fields such as computer science are 63 percent male-dominated. Male high school graduates are twice as likely as females to be reading above grade level. Reading level is not associated with dropping out of school for girls, but it is a significant factor for boys: only 8 percent of poor readers graduate, and 50 percent drop out (Fine 1991, 20, 244).

Boys who are able to conform to middle-class gender expectations increase their advantage over time, whereas those who cannot or do not conform lose ground. When males fail to live up to the criteria for access to privilege, including being considered able and intelligent, they are placed at risk. Boys are more likely to test into special education, and they outnumber girls in classes for the mentally retarded, learning disabled, and emotionally disturbed by ratios as high as 30 to 70 percent. African American students, or ones with cultural or language differences, are more likely to be in classes for the mentally retarded or emotionally disturbed than Anglo-European students. Working-class and poor children are more likely to score lower than middle-class and upper-class children on intelligence tests and other standardized tests in general. A longitudinal study of adolescence in Oakland, California, in the 1930s found that middle-class girls scored lower on the Stanford-Binet than middle-class boys (Elder 1974). This gender difference was even greater for working-class boys and girls. Overall, working-class children scored as many as ten points lower than middle-class children. Such indications of class, gender, and racial bias in the tests prompted many critics to question the use of intelligence testing by the 1970s. Some states, such as California, banned the use of the tests in the special education placement of certain populations. For example,
African American students consistently score one standard deviation behind Anglo-European students and are persistently overrepresented in special education classes. These scores reflect learned behavior and cultural expectations as well as class and gender biases. Throughout the history of Western civilization, boys and girls have been socialized differently. Children still are assigned tasks by gender in work and play environments. They are judged as intelligent or capable by family members, peers, and other adults according to their ability to solve problems, accomplish tasks, and behave with minimal supervision or disruption. But school performance or performance on psychometric tests may not coincide with these family or peer interpretations. School-identified intelligence can differ widely from the social judgments of peers if the individual's social class and experiences vary from the mainstream culture. In fact, youth cultures, subcultures, and countercultures often operate in opposition to mainstream society. In Fugitive Cultures: Race, Violence and Youth (1996), Henry Giroux argues that popular culture directed at youth manipulates youth culture and exploits violence, sexism, and racism. Youthful resistance to dominant ideologies is encouraged by the popular media and puts lower-class and working-class adolescents, and especially minority males, at risk. The rise in school violence by boys provides recent evidence that the middle and upper classes excuse what might otherwise be judged as aberrant behaviors—competitiveness, aggressiveness, emotional dissonance, recklessness, toughness, and physical prowess—to the point of self-endangerment. Males are indulged in extroverted behavior and encouraged under circumstances in which
females are punished or ignored. Youthful males learn that they are supposed to fend off competition, control their expression of fear, and override their shortcomings with bravado. Although it was proven early that criminal behavior is not correlated with a low IQ, this mythology continues in popular thought. Nature-nurture controversies on whether behavior is inborn or learned, first set off by the increase in intelligence testing in the 1920s, also continue. Boys who do not conform to middle-class ideals risk being identified as dangerous, even mentally defective or unstable. It does not matter if nonconformity is due to social class background rather than innate ability. Nowhere is this more clearly demonstrated than in the United States in some inner-city environments, where an African American boy is more likely to end up in jail than to graduate from high school. Other groups such as Mexican American migrant workers and American Indians on reservations are similarly disadvantaged. The cost to boys who do not match Anglo-European middle-class standards can be a socially (not genetically) hereditary form of guaranteed failure. These outcomes are legitimated by low IQ and other standardized test scores. Boys must match the test scores as well as the ideals of the dominant culture if they are to take their place in the world of male privilege.

Theresa Richardson
Erwin V. Johanningmeier
References and further reading
Beal, C. R. 1994. Boys and Girls: The Development of Gender Roles. New York: McGraw-Hill.
Canada, G. 1998. Reaching Up for Manhood: Transforming the Lives of Boys in America. Boston: Beacon Press.
Chapman, P. D. 1988. Schools as Sorters: Lewis M. Terman, Applied Psychology and the Intelligence Testing Movement, 1890–1930. New York: New York University Press.
Elder, G., Jr. 1974. Children of the Great Depression: Social Change in Life Experience. Chicago: University of Chicago Press.
Fine, M. 1991. Framing Dropouts: Notes on the Politics of an Urban Public High School. Albany: State University of New York Press.
Flanagan, D. P., and J. L. Genshaft, eds. 1997. "Issues in the Use and Interpretation of Intelligence Testing in Schools." School Psychology Review 26: 2.
Flanagan, D. P., J. Genshaft, and P. L. Harrison, eds. 1997. Contemporary Intellectual Assessment: Theories, Tests and Issues. New York: Guilford.
Gilmore, D. 1990. Manhood in the Making: Cultural Concepts of Masculinity. New Haven: Yale University Press.
Giroux, Henry A. 1996. Fugitive Cultures: Race, Violence and Youth. New York: Routledge.
Kincheloe, J. L., S. R. Steinberg, and A. D. Gresson III. 1997. Measured Lies: The Bell Curve Examined. New York: St. Martin's Press.
Lemann, N. 2000. The Big Test: The Secret History of the American Meritocracy. New York: Farrar, Straus and Giroux.
Internet
See Computers
J

Jefferson, Thomas
Revolutionary leader and writer of the Declaration of Independence, third president of the United States (1801–1809), and one of the founders of its first political parties, Thomas Jefferson (1743–1826) has been revered internationally for his intellectual contributions to the ideals of liberty and has been condemned by recent scholars for his failure as a southern plantation slaveholder to practice those ideals. These and other aspects of his complex personality and contributions to American identity were shaped in important ways by his boyhood experiences in colonial Virginia, which include his frontier birth, his childhood daily life within large households, the influence of his parents, and the impact of his earliest teachers, William Douglas and James Maury.

Bust of Thomas Jefferson by Jean-Antoine Houdon, 1783 (Library of Congress)

Born April 13, 1743, the third child and first son of Peter and Jane Randolph Jefferson, the infant Thomas entered the scattered community at the western edge of Virginia settlement where 106 whites lived in Albemarle County in 1745. Peter had patented there in 1736 a tract of 1,000 acres on the Rivanna River, 60 miles west of the land he inherited from his father on Fine Creek in Goochland County. While living at Fine Creek, he had befriended William Randolph, the heir to nearby Tuckahoe plantation. Peter's connection with the powerful Randolph family was solidified with his courtship and marriage in 1739 to William's cousin Jane. He named his Albemarle frontier plantation "Shadwell," after the parish of his wife's East London birthplace, and moved his family to a home built on the north bank of the Rivanna River in the fall of 1742. His son Thomas later built Monticello on a small mountain on the south side of the river, within view of his boyhood home and about 5 miles southeast of the present city of Charlottesville.

In his brief autobiography written when he was seventy-seven, Thomas Jefferson reported that his father was "the 3rd or 4th settler of the part of the county in which I live" and described him in admiration as a man whose "education had been quite neglected; but being of a strong mind, sound judgment and eager after information, he read much and improved himself insomuch that he was chosen with Joshua Fry professor of Mathem[atics]. in W[illiam] & M[ary] college to continue the boundary line between Virginia & N. Caroline . . . and was afterwards employed with the same Mr. Fry to make the 1st map of Virginia which had ever been made." Of his mother, Jane, he commented only that when she married Peter at the age of nineteen, she was the "daur [daughter] of Isahm Randolph one of the seven sons of that name & family settled at Dungeoness in Goochl[an]d. They trace
their pedigree far back in England & Scotland, to which let every one ascribe the faith & merit he chooses” (Peterson 1984, 3). In August 1745, when Thomas was two years old, Peter Jefferson moved his family, now including a newborn third daughter, to the Tuckahoe plantation. William Randolph had died, leaving his three orphaned children and his plantations to the guardianship of Peter, his friend and cousin by marriage. Thomas Jefferson recalled to his grandchildren that his earliest memory was of being carried to Tuckahoe on horseback by a slave. The Jefferson family remained at Tuckahoe until 1752. While there, Jane bore another daughter, Martha, and two
sons who both died within a month of their birth. After her return to Shadwell, she bore another daughter and twins (a boy and a girl). Thus in fifteen years she bore ten children, eight of whom survived infancy and childhood, a factor that surely influenced the boyhood experiences of her eldest son. At Tuckahoe, the extended family in which Thomas spent his earliest childhood included two boys and six girls who in 1752 ranged in age from his sixteen- and fourteen-year-old Randolph cousins to six-year-old Martha. Thomas Mann Randolph, the heir to Tuckahoe and the only other boy, was two years older than his cousin Thomas Jefferson. In his adulthood, Thomas Jefferson spoke or wrote little of his Tuckahoe boyhood years. His experiences there can be partially reconstructed from the survival of its household plantation buildings into the present. A row of slave cabins directly to the west and a small schoolhouse just east of the main house, immediately outside the door to the central salon where the family gathered, probably formed the immediate boundaries for the roaming of two small boys, watched over at first by slaves and perhaps later by their older sisters. Jefferson's oldest sister Jane remained a favorite and friend until her death in 1765. He remembered learning to sing the psalms from her, and perhaps from her came his lifelong love of music. Whether he learned, perhaps from slaves, to play the violin at Tuckahoe or later at Shadwell is unknown. The schoolhouse was built to accommodate the stipulation in William Randolph's will that his son be taught at home rather than be sent away to England or to the College of William and Mary at Williamsburg for his education. Boys and girls alike learned their letters
there from the tutor John Staples, about whom little is known. He almost certainly used as a textbook for his pupils The Royal Primer: Or, an Easy and Pleasant Guide to the Art of Reading, Authoriz'd by his Majesty King George II. To Be Used throughout His Majesty's Dominions, rather than the presently better-known New England Primer. Lessons were not all literary, however: Peter Jefferson carefully reported to the other Randolph trustees the frequent employment of a dancing master. The only surviving reminiscence of his earliest education in this "English school" is a tale told by Jefferson's great-granddaughter that "when five years old, he one day became impatient for his school to be out, and, going out, knelt behind the house, and there repeated the Lord's Prayer in an effort to hasten the noonday dinner" (Randolph 1978, 23).

Peter Jefferson could have spent relatively little time at Tuckahoe, for he balanced his guardianship responsibilities and oversight of the Fine Creek plantation in Goochland County with increased official duties as surveyor, justice of the peace, and lieutenant colonel of the militia in Albemarle County and with extended surveying trips with Joshua Fry into the wilderness farther west. When he was in residence, it was frequently with a party of gentlemen; indeed, Thomas Jefferson reported his father's later habits at Shadwell in this way: "My father, had a devoted friend, to whose house he would go, dine, spend the night, dine with him again on the second day, and return to Shadwell in the evening. His friend, in the course of a day or two, returned the visit, and spent the same length of time at his house. This occurred once every week and thus you see, they were together four days out of
the seven” (Randolph 1978, 24). The children’s adult contacts at Tuckahoe consisted primarily of his mother’s extended Randolph family, who lived within relatively easy traveling distance. Her father Isham Randolph had died in 1742 just before Thomas’s birth, but her mother Jane lived until 1760. Although there is no mention of his grandmother in any surviving records, as an adult Thomas Jefferson remained in contact with and became a frequent source of assistance to his many Randolph first and second cousins. When the Jefferson family returned to Shadwell in 1752, Thomas remained behind at Tuckahoe, to be tutored with his cousin Thomas Mann Randolph by William Douglas, the “Latin school” of Jefferson’s autobiography. Although Peter Jefferson’s account book records payment to Douglas of £16 for tuition and purchase of books for his son, it is not clear whether the two boys boarded with Douglas or rode the dozen miles daily from Tuckahoe. Born and educated in Scotland, Douglas became the energetic rector of St. James Northam parish in 1750. He preached in two churches, for which he initiated and kept careful records, and carried on a correspondence with other Virginia Anglican clergy. Between 1752 and 1757 he tutored Thomas Jefferson and Thomas Mann Randolph in Greek, Latin, and French. The Tory convictions that deprived him of his Goochland parish in 1777 during the revolution may have colored his pupil’s dismissal of him as “a superficial Latinist, less instructed in Greek” (Peterson 1984, 4). In summer Thomas Jefferson returned to his family in the mountains of Albemarle County. Shadwell was a busy frontier economic center, for Peter had built a mill on the banks of the Rivanna not far
from his house. The weekly visits of Joshua Fry, the frequent connections with nearby neighbor Thomas Walker, the occasional encampments of Indians traveling to Williamsburg to treat with the colonial government—all made indelible impressions on the growing boy. Jefferson described in an 1812 letter to John Adams his youthful presence at the farewell speech of “the great Outassete, the warrior and orator of the Cherokees,” the evening before the chief’s departure for England. “The moon was in full splendor. . . . His sounding voice, distinct articulation, animated actions, and the solemn silence of his people at their several fires, filled me with awe and veneration, altho’ I did not understand a word he uttered” (Peterson 1984, 1263). Jefferson also remembered from those years roaming the woods of the Shadwell plantation, hunting deer, fox, partridge, and hare. This remembered idyllic time ended abruptly with the sudden death of Peter Jefferson in the summer of 1757. He left an estate of nearly 7,500 acres, primarily in Albemarle County, and eight children to the guardianship of his wife and several trustees: John Harvie, Walker, and his wife’s cousins Peter Randolph and Thomas Turpin. Between them, they determined in January 1758 to send fourteen-year-old Thomas to school with the Reverend James Maury, whose Fredericksville parish was a dozen miles from Shadwell. With eight children of his own, Maury supplemented his income by constructing a log schoolhouse and taking into his household as boarders young Jefferson, Dabney Carr, the James Madison who later became the Anglican bishop of Virginia and president of the College of William and Mary, and John Walker. These boys and Maury’s two elder sons became Jefferson’s lifelong friends. Once
again he was part of a large household, but this one was dominated by boys rather than girls. The elder Maury had been born in Dublin in an émigré French Huguenot family. His parents migrated to North America, where he was educated at William and Mary College. The parish he had served since 1752 was an extensive one, requiring travel between three churches and a chapel that were miles distant from each other. As a teacher, Maury valued the Greek and Latin to be learned from classical authors but insisted that it was far more practical for Virginia young men to master their own language and to learn to think rather than simply to memorize. Maury held passionately expressed convictions about the necessity to convert and baptize slave as well as white children, the threat to the Church of England from the growing influence of itinerant Baptist and Presbyterian dissenting clergy in western Virginia, and the dangers to England of French incursions in the interior of North America. While Jefferson was a pupil in his household, Maury became embroiled in the controversy over passage of the “Two-Penny” act limiting payment to clergy, which resulted, after Jefferson had left Maury’s household, in the celebrated “Parson’s Cause” (in which Maury was the plaintiff), a case that catapulted Patrick Henry to notoriety for his oratory in 1763. In short, during two key years of his early adolescence, Thomas Jefferson was challenged to think about a broad variety of issues: slavery, the value of lands west of the mountains, the importance of reason in education, and the relationship between church and state. Maury’s library of 400 volumes, the largest Jefferson had seen, became a model for his own collection of books; quite probably it was Maury who started him on his early
habit of "commonplacing," or copying memorable passages from his reading. A few months before his seventeenth birthday, Thomas Jefferson wrote in January 1760 to his principal trustee John Harvie, giving well-reasoned arguments for going to the College of William and Mary in Williamsburg. Shortly after his matriculation at the college, he began the adult momentum of his life from student to lawyer, revolutionary, politician, and statesman. In his autobiography, Jefferson attributed to the teachers he met there, William Small and George Wythe, the most important influences on his youth, which "probably fixed the destinies of my life" (Peterson 1984, 4). Important as those teachers were, they succeeded in part because they were able to build on lessons learned and experiences gained in the years of Jefferson's earlier boyhood in the hills, plantations, and households of western Virginia that forever remained in his heart and mind as home.

Constance B. Schulz

References and further reading
Kimball, Marie. 1943. Jefferson: The Road to Glory, 1743 to 1776. New York: Coward-McCann.
Malone, Dumas. 1948. Jefferson the Virginian. Boston: Little, Brown.
Maury, Ann. 1853. Memoirs of a Huguenot Family. New York: G. P. Putnam.
Moore, John Hammond. 1976. Albemarle: Jefferson's County, 1727–1976. Charlottesville: University Press of Virginia.
Peterson, Merrill D. 1970. Thomas Jefferson and the New Nation. New York: Oxford University Press.
———, ed. 1984. Thomas Jefferson: Writings. New York: Library of America.
Randall, Henry S. 1858. The Life of Thomas Jefferson. New York: Derby and Jackson.
Randall, Willard Sterne. 1993. Thomas Jefferson: A Life. New York: Henry Holt.
Randolph, Sarah N. 1978. The Domestic Life of Thomas Jefferson, Compiled from Family Letters and Reminiscences, by His Great-Granddaughter. 1871. Reprint, Charlottesville: University Press of Virginia.
Sowerby, Millicent E., comp. 1952–1959. Catalogue of the Library of Thomas Jefferson. Washington, DC: Library of Congress.
Wilson, Douglas L., ed. 1989. Jefferson's Literary Commonplace Book. Princeton: Princeton University Press. A volume in the Second Series of Julian P. Boyd et al., eds., 1950–, The Papers of Thomas Jefferson. Princeton: Princeton University Press.
Jobs in the Seventeenth and Eighteenth Centuries

In colonial America, all but the wealthiest boys and girls spent much of their time at labor in their parents' households or as apprentices, servants, or slaves for the benefit of their masters. Work was one of the most salient aspects of a young person's life. In a world in which children under sixteen years of age comprised half the total population, their labor contributions were necessary to the successful functioning of the economy. Boys worked in nearly every part of the economy: as agricultural laborers, in craft production, and in manufacturing. By the late eighteenth century, as opportunities for extensive formal schooling increased, more and more middle-class parents in urban settings began sending their sons to school, keeping them out of the labor force for longer periods. These parents had begun to perceive schooling as the route to advancement in status and income, foreshadowing the nineteenth-century belief that childhood should be spent
not in work but in the classroom. Nonetheless, in the late eighteenth century, children at the margins of society—the poor and African Americans—continued to work at a young age.

An American farm scene: reaping with oxen, Columbia Magazine, September 1788 (Library of Congress)

In a society in which extensive, full-time formal schooling was available only to children from the most elite backgrounds, work was thought to provide children and youth with the vocational training that would be necessary in their adult lives. The educational function of work was bolstered by a moral purpose: work would keep young people from the sin of idleness, thought to be the root cause of crime and poverty. Isaac Watts, a well-known contemporary hymn writer, made the point clearly through a child's words: "In works of Labour or of Skill, I would be busy too: For Satan finds some Mischief still, For idle Hands to do" (Cunningham 1991, 23). Such beliefs lay behind poor relief laws passed throughout the British American colonies in the seventeenth century and enforced throughout the period. These acts stipulated that children whose parents could not afford to maintain them, or, in Massachusetts and Virginia, whose parents brought them up in idleness, could be removed from their homes and placed until adulthood as apprentices or servants in more proper environments. Not only would these poor children, through labor, be given training perceived as suitable to their social station, but their placement relieved the community of the financial burden of supporting them while they earned their own keep in their masters' households.

Boys' work was especially important in the process of creating and sustaining farms in America. In New England during the seventeenth century, farmers were particularly dependent on the labor of their sons. Given a shortage of indentured servants after the first decades of settlement and the high cost of hired hands, a family's economic success relied upon cooperation between the generations. A little boy began to help his father
when he was very young: about age five or six, under the watchful eye of parents or siblings he might fetch tools, pick fruit, or help drive animals home from the fields. By eleven or twelve he would begin to do heavy chores, including helping with the plowing and cutting and carrying grain and wood. He would also begin to take on independent responsibility for the care of livestock. During his early teens, a boy could practically do the work of a grown man; the value of his labor more than offset the cost of his keep. By his midteens, a boy could do most farm tasks by himself. Unlike his counterpart in England, however, he did not strike off on his own to earn wages as a servant in husbandry. Rather, he remained committed to the family farm, investing his labor in developing family property that he would eventually inherit. At this point, he would likely take responsibility for supervising and teaching his younger brothers, acting as a manager for the family labor force under the general orders of his father.

The family labor system did not eliminate the need for additional supplies of workers. During certain periods of the agricultural season, including planting and the harvest, a family's labor supply might be temporarily insufficient. A young couple with few, if any, sons of working age would likewise find themselves in need of additional hands. Families with more than enough sons of laboring age helped alleviate their relatives' or neighbors' labor shortage by occasionally hiring out their teenage sons to work. During the seventeenth century, these boys did not receive their own wages; rather the income was tallied on their fathers' accounts in the interests of the family farm. By the second quarter of the eighteenth century, however, as the economy matured and farmlands were increasingly subdivided, many farmers began to have insufficient work at home to fully occupy their older sons. Recognizing that the likelihood of inheriting a sufficiently large portion of the family farm had diminished, more boys in their late teens began hiring themselves out for wages or working as farm laborers on the expanding frontier. As Boston began to experience a growing poverty problem during the eighteenth century, these boys' labor was supplemented by the work of a growing number of poor young servants and apprentices, bound out for their support by either town officials or their own parents. Together, these young workers formed a nascent regional farm labor market.

To the South, in both the mid-Atlantic colonies and the Chesapeake, sons' labor was likewise critical in the initial process of farm formation. In Maryland in the mid-seventeenth century, for example, Robert Cole's stepson Francis began working in the tobacco fields at twelve years of age, where he could contribute half the labor of an adult. In the absence of their deceased mother, seven-year-old William and four-year-old Edward, too young for fieldwork, did most of the housework, eliminating the need for a housekeeper (Carr, Menard, and Walsh 1991, 44–45). Even more important was the work done by young indentured servants, particularly in the Chesapeake during the seventeenth century. These servants, who had contracted to provide approximately four to seven years of labor in exchange for the cost of their passage to America, were, for the most part, young males between fifteen and twenty-five years old. In the absence of a supply of free laborers, their work was necessary for growing the labor-intensive
tobacco crop upon which the colonies’ success was dependent. Indentured servants became an increasingly important part of the agricultural labor force in the mid-Atlantic colonies during the eighteenth century. The supply of these young workers in the Chesapeake had diminished by the 1680s, however, leading planters to substitute African slave labor in the fields. By the early eighteenth century, colonists in both the Chesapeake and farther South in the rice-growing Carolinas had become dependent upon the institution of slavery. Slave children worked from a very young age in domestic service, waiting on the white family or carrying messages for their masters when they were as young as six years old, too young to do productive agricultural labor. Slave boys typically began working in the fields at nine or ten, beginning with small jobs such as picking weeds, scaring birds away from crops, feeding chickens, and picking vermin off plants. By age sixteen, a slave would be counted as a “full share” laborer, capable of an adult load of fieldwork. As plantations grew in size during the mid-eighteenth century, skilled occupational opportunities for slaves increased, leading some planters to place nine- or ten-year-old slave boys in craft apprenticeships with coopers, carpenters, blacksmiths, tanners, bakers, and shoemakers. By training in these jobs, slave boys might hope to have a relative degree of privilege and independence in their work as adults. Nonetheless, all these young slaves ultimately worked for the benefit and profit of their white masters. Both white and African American boys’ work was equally important in the development of American craft manufacturing. Like farmers’ sons, boys in artisans’ families began to work at a young
age, beginning to help in their fathers’ workshops at about six or seven. At this young age, they performed necessary, unskilled tasks, including sweeping the floor, running errands, and tending the shop. By age thirteen or fourteen, boys would enter craft apprenticeships, promising to serve a master craftsman until their early twenties in order to learn the craft by laboring in it and forgoing wages in exchange for their upkeep. These labor agreements might be made informally between a craftsman and his son or formally via an indenture contract. By accepting apprentices, craftsmen obtained a stable, bound labor force while helping to guarantee the future quality of workmanship in their trade. In America’s cities, expansion in the commercial sector during the eighteenth century led to a proliferation of trades open to apprentice labor. In Philadelphia, for example, apprentices began training in more than twice as many different kinds of crafts in 1771 as they had in 1745. Children’s labor was also crucial to the development of larger-scale manufacturing enterprises. In the textile industry, for example, young labor was considered critical from the early years of British settlement. Beginning in the seventeenth century, Americans noted the high cost of importing cloth from abroad and by the mid-eighteenth century were experimenting with plans in both Philadelphia and Boston to establish manufactories where wool or linen might be spun and woven. Drawing on the example of British “working schools,” the young sons and daughters of the poor were identified as the best source of the necessary labor. It was hoped that these children would thus earn their keep while learning habits of industry that would keep them from becoming a public charge in
adulthood. Although these first experimental manufactories ultimately failed, the use of children's labor in textile manufacturing was repeated by Almy, Brown, and Slater in the first mechanized cotton yarn factory in the United States. At the mill in Pawtucket, Rhode Island, Samuel Slater relied exclusively upon children seven to twelve years of age in the first week of production in 1790, and thereafter, youth would remain a significant part of the unskilled wage labor force used in the industry for generations.

During the eighteenth century, a growing number of Americans began following the advice of John Locke, who emphasized the importance of an extensive, liberal education that would inculcate both virtue and reason in children prior to their assimilation into the adult world. By the late eighteenth century, the number of schools offering formal instruction had increased dramatically, particularly in urban areas, offering a growing range of subjects. More and more parents began to see this academic training as the key to social and economic advancement for their sons. Schooling could teach them subjects such as bookkeeping and advanced mathematics with practical vocational application in the increasingly complex economy or train them in accomplishments necessary for social acceptance among the elite. In American cities, parents in both the upper and middle classes began sending their sons to school for longer periods of time, withdrawing them from labor until much later in their teen years. By doing so, they were on the cusp of a movement that would ultimately lead in the nineteenth century to a childhood dominated by schooling for most white Americans. During the eighteenth century, however, slave children were consciously excluded
from these schooling opportunities, and poor white children's parents could simply not afford either tuition or the loss of their offspring's labor. These children continued to work when they were very young, as children of the poor would for more than a century.

Sharon Braslaw Sundue

See also Apprenticeship; Farm Boys; Indentured Servants; Slavery

References and further reading
Bagnall, William. 1893. The Textile Industries of the United States. Cambridge: Riverside Press.
Carr, Lois Green, Russell R. Menard, and Lorena S. Walsh. 1991. Robert Cole's World: Agriculture and Society in Early Maryland. Chapel Hill: University of North Carolina Press.
Cunningham, Hugh. 1991. The Children of the Poor: Representations of Childhood since the Seventeenth Century. Oxford: Basil Blackwell.
Morgan, Philip D. 1998. Slave Counterpoint: Black Culture in the Eighteenth-Century Chesapeake and Lowcountry. Chapel Hill: University of North Carolina Press.
Ulrich, Laurel Thatcher. 1999. "Sheep in the Parlor, Wheels on the Common: Pastoralism and Poverty in Eighteenth Century Boston." Pp. 182–200 in Inequality in Early America. Edited by Carla Gardina Pestana and Sharon V. Salinger. Hanover: University Press of New England.
Vickers, Daniel. 1994. Farmers and Fishermen: Two Centuries of Work in Essex County, Massachusetts, 1630–1850. Chapel Hill: University of North Carolina Press.
Walett, Francis G., ed. 1974. The Diary of Ebenezer Parkman, 1703–1782. Worcester, MA: American Antiquarian Society.
Jobs in the Nineteenth Century

When the nineteenth century began, virtually all boys, except the sons of the very wealthy, went to work as soon as they
were physically able, usually between the ages of seven and fourteen. Since most Americans made their living in agriculture, most boys worked on farms.

Breaker boys, Woodward Coal Mines, Kingston, Pennsylvania, nineteenth century (Library of Congress)

Early in the century, some boys also prepared to become self-supporting through apprenticeship to craftsmen who taught them a skill. However, as industrialization and wage labor spread, especially after 1830, apprenticeship declined. Industrialization privileged white, middle-class men who earned good salaries as accountants, lawyers, and managers for burgeoning manufacturing enterprises. Their sons became exempt from labor outside the home and customarily remained in school until their late teens. Conversely,
industrialization meant low wages for ordinary, unskilled workers. Their sons typically had to go to work at a young age to supplement family income. Not only class differences but also differences in race and ethnicity separated the work experiences of nineteenth-century boys. African American boys in the South, both in slavery and in freedom, began working very young, worked very long hours, and rarely obtained much schooling. The sons of immigrants were also more likely to cut short their education and work outside the home than were native-born youths. Among whites, on family farms throughout the country, parents expected their sons to help out in the home and in the fields. Typically, both farm boys and girls began their work lives in early childhood by helping their mothers around the home. They milked cows, churned butter, made cheese, collected eggs, hauled in wood for fires, built fires, scrubbed clothes, and weeded gardens. On the frontier, boys and girls also helped feed their families by collecting edible plants and hunting wild animals. As they grew bigger, boys did more labor in the fields under the direction of their fathers, and by puberty, most boys worked there exclusively planting seeds, plowing fields, helping with the harvest, and tending and feeding livestock. Rarely did they help their mothers around the house anymore. Early in the century, New England farmers earned extra income by manufacturing items at home. Both boys and girls helped make such items as brooms, tools, and shoes, usually in the winter when there was less to do on farms. However, just as in farming, there was a gender division of labor in items manufactured at home. In shoemaking, women and girls
bound the shoes, and men and boys made the lasts (wooden models of the human foot) to which the leather was molded. Typically, boys' labor on northern and western farms kept them fully occupied except during the three to four months of winter. Only then, if they were not engaged in home manufacture, were farm boys able to attend school. Because they could enroll in school for such a short time each year, it was not until they were seventeen or eighteen that northern and western farm boys achieved the equivalent of a sixth-grade education. Typically, farm boys labored with family members, and sometimes, in the South before the Civil War, with a few slaves, and although they might find the work they performed to be dull and repetitive, at least they performed it alongside brothers, sisters, fathers, and mothers. However, when farm families were very large and farmers were either tenants or not very well off, they often had to send their teenage sons out to work as hired hands for other, more prosperous farmers. In 1858, Frank G. O'Brien, age fourteen, left his family home to work for another farmer in Minnesota. He walked to the new farm, washed up in the rain barrel with homemade soap, received dinner, and then went to sleep in an attic room that he shared with several mice. He rose about four a.m. and worked in the fields until dusk. He probably did about the same labor as other farm boys his age, but Frank and other young hired hands had shorter childhoods and lived apart from their families, unlike the sons of more well-off farmers (Schob 1975, 189). African American slave boys also worked on farms, often performing about the same jobs as free white farm boys but under harsher conditions. Slave boys went to work year-round between ages
six and ten, when they cleaned yards, carried water to field hands, shelled corn, and helped their parents out with gardening. When they were physically able, slave boys and girls performed tougher jobs such as raking, hoeing, pulling weeds, and picking cotton. Slave boys and girls did about the same things, regardless of age, but boys more than girls worked at blacksmithing and tending livestock. And although they began by laboring under the direction of their parents, slave boys soon worked under the direction of whites who might treat them harshly. Slave parents had no way to protect their sons from exploitation by whites, unlike parents of free white boys who could always withdraw them from hired labor for a harsh boss. Unlike white farm boys, African American slave boys also had no time out for schooling because by the 1830s, southern states prohibited slaves from gaining an education. After the Civil War ended slavery, African American boys gained the privilege white farm boys had long enjoyed: they could work for their parents alongside other family members. And work they did, for freedpersons usually ended up as sharecroppers or tenants who could not make ends meet unless all family members began work by the age of five or six. Boys and girls tended younger siblings, carried water to the house and fields, planted seeds, or wormed tobacco (killed worms on the underside of leaves). Nonetheless, even in freedom, black farm boys enjoyed fewer privileges than did their white contemporaries. Their families were so poor that African American boys were often hired out to work for whites, where they were customarily treated with less consideration than were white hired hands in the Midwest and Far West. In addition, during slack months
on the farm, instead of going to school as white farm boys typically did, black youths and their fathers were so desperately poor that they had to seek out day-labor jobs wherever they could find them.

Boys and men with spinning machines in a textile factory (Archive Photos)

Although most nineteenth-century boys who lived on farms worked very hard, some did not. Before the Civil War, the sons of white plantation owners customarily labored very little. Slave labor replaced their own, privileging planters'
sons to play and attend school. After 1850 in the Northeast and Midwest, the sons of farmers who could afford to buy horse-powered machinery and employ hired hands worked much less than did other farm boys. Many of them attended school for much of the year and developed entrepreneurial skills by farming their own gardens for profit. The work experience of rural youths was rather different from that of boys raised in small towns or cities. Early in
the century, many urban-born youths were apprenticed to skilled craftsmen such as carpenters, bricklayers, furniture makers, barrel makers, and others. Parents signed a contract with a craftsman who obtained the boy's labor, usually from his early teens to age twenty-one, in return for teaching the boy a craft and providing him with schooling, room, and board. When a boy completed an apprenticeship, he was presumably ready to earn his living at the next step up the occupational ladder: as a journeyman. However, also in the first years of the nineteenth century, the shift to free wage labor began. Many craftsmen found it cheaper to hire boys or young men on a daily basis rather than supply them year-round with room, board, and schooling. Apprenticeship did not disappear all at once, but by the 1840s in cities, only shipbuilders and butchers still took on many apprentices. By the late nineteenth century, apprenticeship increasingly became the province of boys sixteen and older who were trained by skilled craftsmen in trade unions and employed by businesses that required a steady supply of skilled laborers.

The Industrial Revolution, which made apprenticeship all but obsolete, divided the experience of nineteenth-century urban boys. As production moved from small craftsmen's shops into factories, men who managed these larger enterprises were well paid and able to support a wife and children comfortably in large urban and suburban homes. The sons of businessmen did not have to work outside the home and spent most of their youth in school preparing themselves for middle-class employment. However, men who worked in factories, usually at unskilled jobs, were paid very little and were frequently unemployed.
Unskilled industrial workers usually could not support a wife and family without help. Typically, that help came in the form of wages earned by their children. White working-class mothers in the nineteenth century rarely worked outside the home because they were fully occupied caring for youngsters, cooking from scratch, and cleaning and washing without benefit of electricity. Very young boys and girls did not usually work for wages outside the home because they could earn little. Instead they helped out their families by collecting stray pieces of coal and wood around railroad yards, scavenging for overripe fruit and vegetables at open-air markets, and searching for junk in their neighborhoods to sell to junkmen, ragpickers, and pawnbrokers. Most working-class boys between the ages of seven and fourteen combined odd jobs with school attendance. However, once they reached their teens, white working-class boys usually sought paying jobs. Parents were quicker to send their teenage sons than their daughters out to work because boys earned more than girls. In Philadelphia in 1880, boys over the age of fifteen earned 30 percent more than did girls the same age. Foreign-born boys were more likely to work outside the home than were native-born youths, probably because immigrant families were especially likely to be impoverished. However, ethnic groups diverged somewhat in their attitude toward child labor. In the 1880s and 1890s, eastern European Jews, although very poor, were more likely than other immigrant groups to keep their sons in school through their teen years. Jews so valued learning that they made huge sacrifices to keep their boys in school. Conversely, Irish and Italian immigrants, regardless of their income, were inclined to send
their sons out to work. Both groups apparently believed that children should help repay their parents for the cost of their upbringing as soon as possible. Neither group believed that extended education was very useful for children.

The experience of African American boys in the workforce in the nineteenth century was unique. In southern cities in the census years of 1870 and 1880, African American youths were more likely to work outside the home than were white youths. The greater poverty of black families and the low wages paid both African American men and women, as well as the ready availability of low-paying jobs for black boys and girls, probably account for this difference. However, in 1880 in northern cities like Philadelphia and Providence, Rhode Island, African American boys and girls were less likely to work outside the home than were immigrant children, even though both groups were quite poor. Although black mothers and fathers were equally likely to work and to receive low wages for their labor in the North and in the South, child labor was much more common among black children in the South than in the North. The best explanation for this disparity is that textile mills and other factories in the North rarely hired black children, so few jobs were open to them, and it made more sense for them to remain in school. In addition, many black parents probably moved to the North in part to find better educational opportunities for their sons and daughters than existed in the South.

In the nineteenth century, not all jobs were open even to white boys, but a few industries were especially receptive to child laborers. Cotton textile manufacturing was one of them. In 1801, Slater's Mills in Pawtucket, Rhode Island, employed 100 children under the age of twelve. As textile mills spread in New England, the mid-Atlantic states, and later the South, so did the employment of children in them. By 1890 there were more children working in textile mills than in any other industry. Virtually all of them were white. Usually, mill owners recruited whole families to work and paid each family member based on gender, age, and experience. Older men and boys earned the most; very young girls earned the least. Young boys and girls helped their parents and their older siblings in the mill and thereby learned the trade informally. When youngsters were old enough to work full-time, usually in their teens, they worked twelve hours a day, six days a week. Boys usually performed different jobs in textile mills than did girls. By their late teens, boys earned more money than girls because boys had gained experience doing the most highly paid tasks of tending machines and working in the carding rooms. Boys also might become supervisors; girls could not. Work in the textile mills had both advantages and disadvantages. Boys and girls often liked it because they worked alongside family members and with other children their own age. Youngsters hung on belts moving up to the ceiling and played catch with balls of yarn. Of course, too much playfulness led to punishment by foremen, and too little attention to textile machinery often resulted in accidents. Textile mill labor also prevented children from attending school regularly, and it did not prepare boys for future high-paying employment. Another industry that employed many boys in the nineteenth century was anthracite coal mining. Coal miners, many of whom were Welsh and accustomed to this type of labor in their homeland, were
so poorly paid that they had a strong incentive to put their sons to work. In western Pennsylvania, boys aged eight to thirteen worked in coal-processing buildings called "breakers." Breaker boys sat on planks along coal chutes. Machinery broke the coal into pieces and then it passed along the chutes, where the boys picked the slate out from among the coal. Breakers were noisy, dusty, hot in summer, and cold in winter. Breaker boys often emerged from work with cut and bleeding hands and fingers. Once they were in their late teens, boys went underground to work in the coal mines, which were thick with dust and gas and where explosions and rockfalls often led to injury or death.

Besides the textile and mining industries, there were also countless smaller enterprises that employed many boys. In cities, clothing and cigar manufacturers provided materials for home manufacture to families living in apartments. There, mothers, fathers, and children labored to make ready-made clothing or roll cigars. Other urban boys and girls worked in street trades, usually selling small objects such as gum, peanuts, and crackers. Boys could also find ready employment selling newspapers. Papers counted on boys to get their product to the reading public. For most of the century, selling newspapers was a full-time job for boys, but after 1890, when most daily papers were sold in the afternoon, newsboys could attend school as well as sell. Newsboys were independent entrepreneurs who decided how many papers to buy on any given day and then sold the papers at a spot where there were lots of potential customers—usually commuters. Papers sold for a penny each, and customers often gave boys a nickel. A newsboy's profit was in the tip.
In the last years of the nineteenth century, reformers made some efforts to limit the labor of boys and girls in the most dangerous occupations, including mining and some types of manufacturing. Twenty-eight states in the North, Midwest, and Far West passed laws setting a minimum working age of ten or twelve years and a maximum workday of ten hours. However, working-class parents who required the labor of their sons and daughters objected to the laws, and they were not regularly enforced. Moreover, in the South, where so many children were employed in cotton mills, no laws restricting child labor were passed. Priscilla Ferguson Clement See also African American Boys; Apprenticeship; Farm Boys; Frontier Boyhood; Newsboys; Plantations References and further reading Clement, Priscilla Ferguson. 1997. Growing Pains: Children in the Industrial Age, 1850–1890. New York: Twayne Publishers. King, Wilma. 1995. Stolen Childhood: Slave Youth in Nineteenth-Century America. Bloomington: Indiana University Press. Nasaw, David. 1985. Children of the City at Work and at Play. Garden City, NY: Anchor Press/Doubleday. Reinier, Jacqueline. 1996. From Virtue to Character: American Childhood, 1775–1850. New York: Twayne Publishers. Schob, David E. 1975. Hired Hands and Plowboys: Farm Labor in the Midwest, 1815–1860. Urbana: University of Illinois Press. Trattner, Walter I. 1970. Crusade for the Children: A History of the National Child Labor Committee and Child Labor Reform in America. Chicago: Quadrangle Books. West, Elliott. 1989. Growing Up with the Country: Childhood on the Far Western Frontier. Albuquerque: University of New Mexico Press.
Jobs in the Twentieth Century
Significant class, racial, and regional differences in youth labor involvement make the story of twentieth-century work by youth in the United States one of great complexity indeed. Without a doubt, during the twentieth century there has been a profound change in the way in which Americans view work by young people, especially the recruitment of young boys into the labor force. Previously, most Americans believed child labor was a necessity for keeping households financially solvent and for teaching young boys practical skills that indelibly shaped and ultimately benefited their character. The United States, after all, was built on the backs of industrious parents and their equally hardworking children, who worked on farms and, by the late nineteenth century, in industry as well. By 1900, the Census Bureau reported that nearly one out of every five children age ten to fifteen was “gainfully employed.” As Elliott West (1996) writes in his excellent history of children’s experiences in the twentieth-century United States, the 1900 census approximation of young workers’ involvement in the workforce was no doubt a severe underestimate of children’s actual economic contributions. For example, the census figure did not take into account the work of those under ten years of age, the efforts of many Mexican migrant child laborers, or the daily chores of farmers’ sons and daughters.

In the late 1800s, reformers began seeking to limit or to abolish entirely the labor of boys in the most dangerous work environments. After 1900, complaints of reformers grew even louder. Many of the more outspoken of these individuals were members of an emerging
middle class that saw children as “economically ‘worthless’ but emotionally ‘priceless’” (Zelizer 1985, 3). Reformers were also, by and large, financially successful professionals, middle managers, and politicians who were earning enough money to support their families without the need for secondary wage earnings. Opposition to child labor reformers came from the socioeconomically disadvantaged immigrants who relocated from eastern Europe and Russia into East Coast cities and required the labor of boys and girls to survive, from textile manufacturers in the South who valued child labor, and from western farmers. Revisions to the Homestead Law in 1908 further opened up the West to farming, and working the new farmlands of the West without the aid of one’s children was simply unthinkable—for more than a century sons had toiled alongside their fathers on ranches and plantations in the United States. Historical evidence shows that even some of the most staunch child labor reformers considered farmwork normative and even beneficial for American children (see West 1996; Zelizer 1985). This belief, of course, did not guarantee that the kind of work children engaged in on family farms was completely benign (indeed, much of it was positively backbreaking); it simply meant that a long tradition of this kind of child labor existed in the United States. In the North and South some working conditions for children were more obviously dangerous and, as such, were the principal targets of reformers. For example, in the early twentieth century in northern and midwestern states, young boys put their health at significant risk working in coal mines where they sifted through anthracite for impurities. At the same time in New Jersey, Pennsylvania,
A boy in a sweatshop, early twentieth century (Archive Photos)
Indiana, and West Virginia, other boys were exposed to the stifling heat of furnaces in glassmaking plants. Similar conditions existed in the South in the textile factories that were opening up in record numbers. There young children were prized for their “nimble fingers” and worked as “spinners,” “doffers,” and “threaders.” In these occupations that required young children and adolescents to retie, replace, and thread spindles, injuries were numerous, including deep puncture wounds and the amputation of fingers (see West 1996).

At the highest reaches of government, the battle over child labor was hard fought. Two early congressional bills designed to regulate child labor were eventually struck down by the Supreme Court as unconstitutional. In 1928, a constitutional amendment authorizing congressional control of youth labor also failed to receive support from enough states. Not until the Depression were the National Industrial Recovery Act and later the Fair Labor Standards Act of 1938 passed and child labor officially federally regulated (see Zelizer 1985 for details).

Ultimately, the reformers prevailed, at least in terms of the letter of the law. By 1930, official census tallies reported that numbers of child workers had plummeted to only 667,118 (representing about 5 percent of all children age ten to fifteen). Careful consideration of the evidence, however, reveals that although children
An African American boy shining shoes (Archive Photos)
now indeed represented a smaller portion of the labor force, they were increasingly employed in jobs that were not counted by officials and were therefore unregulated (West 1996). To be sure, by the 1930s children were less likely to be employed in extremely dangerous occupations such as coal mining. However, in the North, many children still worked in poorly regulated tenement labor, producing crafts in their homes. Many enterprising boys of the North also hawked newspapers and other wares on the streets of U.S. cities with little governmental interference or regulation. Furthermore, economic pressures on many boys became especially acute in 1929 when the stock market collapsed and the Great Depression began. Glen Elder’s (1974) research on Oakland youth
growing up in the Depression years reveals that in economically hard-hit families, boys were expected to contribute to the household economy by working in the streets or factories, whereas girls were expected to help more with household chores.

In these years, huge regional differences in child labor existed. In the South through the early 1940s, for example, official census figures actually revealed increases in child labor in factories of all kinds (e.g., pencil factories, textile mills, etc.; see West 1996). Similarly, in the West during the same period, children continued to be heavily involved with work on family farms, often to the detriment of their school attendance. Thus, although official records point to huge declines in child labor force participation across the first half of the twentieth century, considerable variability existed. Businesspeople found children to be attractive workers because they could be paid significantly less than adults.

The relative economic affluence of the United States in the post–World War II period helped preserve the general trend away from child labor, and by 1950, the census reported that only about 1 percent of the nation’s children were employed. In the North, the development of new factory technologies required skilled labor that young boys simply could not provide. On the farms of the West, mechanization, social proscriptions against work by children, and strict labor laws also limited the number of children in the labor force. Moreover, for the first time, children actually represented a significant expense to these farmers, whose sons no longer produced a positive return on parents’ financial investments.

One important exception to this midcentury trend away from child labor involved the children of sharecroppers living in the South. In the decades after the Civil War, numerous African American and white families remained in a state of quasi–indentured servitude as land tenants and earned measly incomes. In the first half of the twentieth century, to survive, all members of sharecropping families, including children, had to work. However, after the widespread migration of African Americans northward to northeastern and midwestern cities in the 1940s and 1950s, sharecropping virtually disappeared.

The 1950s and 1960s also witnessed another significant trend with regard to boys’ work—now “child” labor was primarily the domain of the adolescent. Many of these young men moved out of the “old workplaces” in the factories and farms and into the “new workplaces” in the service and retail fields, flipping burgers, pumping gas, and selling merchandise in department stores. For the first time, these adolescents were spending the money they earned themselves, since they were far less obligated than previous child workers to help support their families. Paradoxically, adolescents from middle-class families were now much more likely to be working than young men from poorer neighborhoods. This change occurred because jobs in industry that poor boys had once taken were fast disappearing. Jobs in service and retail remained available to adolescents but did not teach them skills that were readily transferable to full-time adult work. During this time, sociologists and others detailed an alarming trend among poorer youth toward involvement in the drug trade and other illegal “work” activities. Feeling the strain of tremendous economic pressures, youth were once again turning toward the city streets to sell
their wares as the “newsies” had up to 100 years before (West 1996).

In reaction to the growing social dangers associated with unemployment among disadvantaged youth, officials in the 1970s sought ways to integrate young people into the adult workforce. Federal commissions and critics of U.S. high schools recommended that schools participate in a national movement geared toward teaching young people the necessary skills to succeed in the world of adult work. In the 1970s and early 1980s, career education programs, work experience gained through youth employment and training programs, and work experience gained through part-time employment were methods developed to improve employment opportunities for impoverished youth (Steinberg 1982). In the early 1980s it was widely assumed that adolescent work fostered socialization, enhanced educational outcomes, and generally functioned to smooth the transition into the adult workplace (see Steinberg 1982). Clear evidence of these benefits, however, was generally lacking.

In the late 1980s, there was a significant backlash against the rather grand and unsubstantiated promises of the youth work integration movement. In 1986 Ellen Greenberger and Lawrence Steinberg published When Teenagers Work, a book that roundly criticized proponents of adolescent work and, in doing so, touched off a sometimes heated debate among academic child psychologists and sociologists on the importance, influence, and potential dangers of work in adolescence. Amazingly, the issues that emerged (and continue to emerge) in the adolescent work debate paralleled in virtually every way those broached at the turn of the century regarding the labor of young children.
A widely cited series of studies by Steinberg and his colleagues presented a view of adolescent work that called into question the assumption that such labor involvement should be considered a stepping-stone to adult roles and responsibilities. For instance, Steinberg, Suzanne Fegley, and Sanford Dornbusch provided evidence that although adolescent workers tend to fare worse along a number of dimensions before entering the workforce (adolescent work attracts individuals who are already experiencing academic and social difficulties), working more than twenty hours per week further “disengages youngsters from school, increases delinquency and drug use, furthers autonomy from parents, and diminishes self-reliance” across time (1993, 171). Often in direct contrast to Steinberg’s research, however, were the results from Jeylan Mortimer’s Youth Development Study based in St. Paul, Minnesota (see Mortimer et al. 1996). These researchers were unable to replicate key elements of Steinberg’s research and found no consistent effect of adolescent work intensity on educational adjustment outcomes. One interesting exception was that, in Mortimer’s sample, adolescents working less than twenty hours per week earned higher grades on average than both adolescents working more than twenty hours per week and unemployed adolescents. Cumulatively, what these studies and a host of others have revealed is a very complex picture indeed. Most agree that working too many hours in adolescence is associated with increased substance use and greater independence from family. However, work can also have positive consequences: it appears to have a positive impact on the school grades of young workers who are saving money for college (Marsh 1991, as cited in Mortimer
and Finch 1996). What is indisputable is that youth are still very involved in the labor force. Current estimates suggest that more than half of all young men attending high school work during the school year, many over twenty hours per week. Remarkably, after more than 100 years of societal preoccupation with the topic, controversy still surrounds the issue of work by youth, with little indication that key issues will soon be completely resolved. Glenn Ian Roisman See also Accidents; African American Boys; Farm Boys; Great Depression; Illegal Substances; Immigrants; Newsboys References and further reading Elder, Glen H., Jr. 1974. Children of the Great Depression. Chicago: University of Chicago Press. Greenberger, Ellen, and Lawrence Steinberg. 1986. When Teenagers Work: The Psychological and Social Costs of Adolescent Employment. New York: Basic Books. Marsh, Herbert W. 1991. “Employment during High School: Character Building or a Subversion of Academic Goals?” Sociology of Education 64: 172–189. Mortimer, Jeylan T., and Michael D. Finch. 1996. “Work, Family, and Adolescent Development.” Pp. 1–24 in Adolescents, Work, and Family: An Intergenerational Developmental Analysis. Edited by Mortimer and Finch. Thousand Oaks, CA: Sage. Mortimer, Jeylan T., Michael D. Finch, Ryu Seongryeol, Michael J. Shanahan, and Kathleen Thiede Call. 1996. “The Effects of Work Intensity on Adolescent Mental Health, Achievement, and Behavioral Adjustment: New Evidence from a Prospective Study.” Child Development 67: 1243–1261. Steinberg, Lawrence. 1982. “Jumping Off the Work Experience Bandwagon.” Journal of Youth and Adolescence 11, no. 3: 183–205. Steinberg, Lawrence, Suzanne Fegley, and Sanford M. Dornbusch. 1993. “Negative
Impact of Part-time Work on Adolescent Adjustment: Evidence from a Longitudinal Study.” Developmental Psychology 29, no. 2: 171–180. West, Elliott. 1996. Growing Up in Twentieth-Century America: A History and Reference Guide. Westport, CT: Greenwood Press. Zelizer, Viviana A. 1985. Pricing the Priceless Child: The Changing Social Value of Children. New York: Basic Books.
Jokes
Boys use jokes—one of a number of genres of conversational folklore—to manage their interpersonal relations with other children and with adults. Sometimes jokes both reflect and contribute to the cognitive development of the boy, as he learns to understand wordplay and, eventually, to play with words and concepts himself. At other times, the content of the jokes reveals themes (e.g., boys’ bodies, sex, race, power) connected to the anxieties experienced by boys. Jokes also both reflect and contribute to the social development of the boy, as he learns to understand and use jokes to bond with friends, to defend himself in threatening situations, and to assault another person with verbal aggression. Like stand-up comics, boys acquire a repertoire of jokes over the years, and again like stand-up comics, some boys are very good at telling jokes and others can barely get by. It is best to understand boys’ joking behavior (as opposed to the jokes as texts apart from their performances) in terms of the circumstances for boys’ joking and the likely functions of the joking. This entry will provide a set of perspectives for understanding any joking performances by boys. Underlying this approach is the goal of being able to answer the folklorist’s typical question:
Boys delighting in a shared joke (Robert Maas/Corbis)
who performed what piece of traditional folklore (i.e., jokes), how, in what context, for what audience, and toward what end? As a folklore genre in everyday conversation, jokes usually rely on what anthropologist Gregory Bateson (1972) calls a “frame,” that is, a mutually understood set of rules for interpreting the meanings of what people are saying. In this case, the participants in a conversation must understand when they have entered a “joking frame.” Things said within a joking frame mean something different than they would in other frames, such as an everyday conversation or a verbal fight. The participants in the frame give each other an unspoken license to joke. Jokes become aggressive in those cases where there is not a license to joke, or when
people are included in a frame against their will. In some cases, a person will begin a joke without a clear signal that the teller has entered the joking frame; the joke depends upon the listener’s thinking that the communication is in some other frame and realizing only with the punchline that the communication was a joke, after all. Learning to joke and learning how to get jokes are important cognitive and social skills. This whole process by which jokes seem to stand apart from everyday conversation gives them an important role in cognitive development and in the acquisition of language and other communication skills. The joke frame makes them a very useful folklore genre for managing the psychological and social tensions that arise in a boy’s everyday life. Children acquire their thought and language skills through an interaction between thoughts and emotions and social action. Learning to play with words and ideas—to enter the “play frame” where things do not mean what they would mean outside that frame—is a crucial skill. Folklorists and developmental psychologists notice a developmental sequence in the ability of children to understand riddles and jokes. For example, the adult can see this clearly with “knock-knock” jokes (e.g., A: “Knock-knock.” B: “Who’s there?” A: “Boo.” B: “‘Boo’ who?” A: “Oh, don’t cry, little boy.”), as the boy slowly acquires the ability to understand and then to generate “competent” knock-knock jokes. The boy learns the question-answer formula of the joke long before he understands the wordplay. Thus, one five-year-old boy could pose to parents the joke—A: “Knock-knock.” B: “Who’s there?” A: “Mickey.” B: “Mickey who?” A: “Mickey’s underwear”—and find it hilarious because of the forbidden content.
Jokes The boy realized that he could use the knock-knock joke formula as a frame for uttering a slightly forbidden topic (more on this later), but he had yet to understand the role of wordplay in the genre. His two-year-old brother could not quite master even the formula, so he offered this attenuated version of the joke: A: “Knock-knock.” B: “Who’s there?” A: “Mickey’s underwear.” This sort of joking provides examples for the first of the two major theories of joking—“appropriate incongruity” (Oring 1992, 2)—which holds that the humor in some jokes rests on the sudden perception that the listener has framed the subject matter incorrectly. Anthropologist Mary Douglas, in fact, goes so far as to say that there can be no joke between people in a society unless there is a joke—that is, an incongruity—somewhere in the social structure (Douglas 1975, 98). Incongruity theory applies well to some genres of children’s jokes, such as riddle jokes: Q: “When is a door not a door?” A: “When it’s ajar” (a play on “ajar” and “a jar”). Sigmund Freud’s extended theory and analysis of joking and the relation between jokes and the unconscious is the second major theory of humor. One value of folklore, including jokes, in everyday living is that folklore is formulaic and traditional, making it less personal than an individual expression. Folklore provides impersonal genres for making indirect comments on the causes of the individual’s or the group’s uneasiness, especially in those close folk groups (e.g., families, friendship groups) where the direct, personal expression of criticism or anxiety would harm the group’s necessary cohesiveness. Jokes serve these purposes well. Freud’s approach would investigate what sorts of psychological and social anxieties or tensions the boy might be
experiencing, leading him to make jokes as a way to ameliorate his anxiety or to manage the interpersonal relations he is having with parents, siblings, friends, bullies, teachers, and others in his everyday life. For boys, these tensions tend to cluster around the boy’s body and around issues of power and hierarchy (Mechling 1986, 2001). First, the boy’s body is the source of much of the anxiety that comes out in jokes. The boy’s body is a constant source of interest and potential embarrassment to him. Adults work at socializing the child’s major body systems—oral, anal, and genital—which makes psychoanalytic theory especially useful in understanding so much of the body humor found in children’s jokes. Martha Wolfenstein’s classic study, Children’s Humor: A Psychological Analysis (1954), employs psychoanalytic concepts of projection, displacement, and so on to explain a great deal of the content of children’s jokes, from boys’ jokes about penis size to jokes using bad names to the mother’s pregnancy to sex itself. For example, Wolfenstein finds deeply coded, highly condensed symbolic references to sex in the “moron” riddle jokes that are so popular in the period beginning at about age six, when “the earlier sexual preoccupation undergoes repression” (Wolfenstein 1954, 95). Another type of body jokes, cruel or sick joke cycles, addresses anxieties about parental cruelty, parents’ deaths, or conflicted feelings about a sibling—“But Mommy, I don’t like little brother—Shut up and eat what’s put before you” (Bronner 1988, 124). In the same fashion, elephant jokes might comment on race (Oring 1992, 16–28), dead baby jokes might comment on abortion (Dundes 1987), and Helen Keller or related joke cycles about deformed people (e.g., “What’s the name of a man with no arms
and legs hanging on a wall?—Art”) might comment on the rising visibility of disabled people in everyday life (Bronner 1988, 123–133). The second great cluster of themes in boys’ joking has to do with power (a theme not unrelated to the body, of course). Freud was very interested in the socialization of dependence and independence, and recent work on the social construction of masculinity suggests a powerful source of anxiety that might motivate the occasions and content of boys’ jokes. As “object relations” theorists in psychoanalytic theory explain, the boy’s “proper” psychodynamic development demands that he separate from his mother, who embodies (literally and figuratively) the feminine. The boy must separate from his mother and begin identifying with his father. This means, however, that masculinity is constructed largely as a negative, as not female rather than as some positive list of traits understood as masculine. In this view, masculinity is a far more fragile construction than previously thought, and the growing boy finds that “proving” his masculinity is an ongoing, lifelong project (Beneke 1997). Jokes become a valuable resource in his proving his manhood, which accounts for the large number of antifemale (misogynist) and antihomosexual (homophobic, heterosexist) jokes in boys’ joke repertoires. Both sorts of jokes separate and distance the boy from the feminine— in one case by joking at the expense of girls and in the other by joking at the expense of feminine boys. Psychologists and others who work with boys have identified a “boy code” (summarized by Pollack 1998, 23–25) that emphasizes stoic independence (including, sometimes, a “cool pose”), a seemingly courageous bravado, a tendency to want to
be in control (always), and an avoidance of anything thought to be “feminine,” including dependence and the free expression of emotions. Boys tend to form larger friendship groups than girls do (girls prefer dyads), and boys experience these groups as extremely hierarchical yet safe. Every boy knows his place in the group. A powerful socializing force within the male friendship group is what Pollack calls its “shame culture,” by which he means the tendency of boys to use shame as a way of enforcing and reinforcing the boy code. Not surprisingly, boys enforce the code using a range of folklore genres, including jokes, teases, taunts, boasts, pranks, and stories. Status within the boy’s friendship group hierarchy often reflects the boy’s ability to perform the folklore repertoire of the group, including jokes. Boys also use jokes as a defense mechanism against the range of people outside the boy’s friendship group. Some boys learn that joking is an effective way to take power in a situation, especially if the boy finds himself in a power-inferior position to a parent, a teacher, or a schoolyard bully. Boys often use dirty jokes—jokes filled with taboo topics, like bodily functions, sex, or profanity—to shock the audience, including what folklorists call the “bystander” audience listening in on the joke performance (parents and teachers often are bystander audiences for these performances). The feeling of power in telling the forbidden joke might be fleeting because censorship or punishment is likely to follow, but the boy is learning to use language and especially wordplay to take power in situations in which he does not feel especially powerful. One suspects that there would be gender differences and perhaps ethnic differences in the ways children learn to joke in multicultural settings, but researchers
(even folklorists) have not done the fieldwork in natural settings (where kids lead their lives, rather than in the laboratory) to establish such differences. Even so good an ethnographic account as Barrie Thorne’s Gender Play (1993) does not address gender differences in joking on the playground she studied, and fieldwork-based studies of children of color have not studied joking behavior. One must consider, as well, the possible homogenizing effects of joking behavior seen on television and in films (though the humor in these media rarely takes the traditional joke form). The entire issue of popular, mass-mediated culture intersects a bit with the topic of boys’ joking behavior, but only minimally. Thus, there are joke books published for boys; the Boy Scout magazine, Boys’ Life, has featured a joke section for decades; and other assorted magazines aimed at children and adolescents might have joke sections. But the folklorist knows that jokes are a living, oral tradition. Most boys get their jokes from other boys, but even when a boy performs a joke he read somewhere, the observer must ask that folklorist’s questions: why did that boy choose to tell that joke in that way to that audience, and what effect did he want the joke to have? Jay Mechling See also Bodies References and further reading Bateson, Gregory. 1972. “A Theory of Play and Fantasy.” Pp. 177–193 in Steps to an Ecology of Mind. New York: Ballantine. Beneke, Timothy. 1997. Proving Manhood: Reflections on Men and Sexism. Berkeley: University of California Press. Bronner, Simon J. 1988. American Children’s Folklore. Little Rock, AR: August House.
Douglas, Mary. 1975. “Jokes.” Pp. 90–114 in Implicit Meanings: Essays in Anthropology by Mary Douglas. London: Routledge and Kegan Paul. Dundes, Alan. 1987. “The Dead Baby Joke Cycle.” Pp. 3–14 in Cracking Jokes. Berkeley, CA: Ten Speed Press. Fine, Gary Alan. 1987. With the Boys: Little League Baseball and Preadolescent Culture. Chicago: University of Chicago Press. Mechling, Jay. 1986. “Children’s Folklore.” Pp. 91–120 in Folk Groups and Folklore Genres. Edited by Elliott Oring. Logan: Utah State University Press. ———. 2001. On My Honor: The Boy Scouts and American Culture. Chicago: University of Chicago Press. Oring, Elliott. 1992. Jokes and Their Relations. Lexington: University Press of Kentucky. Pollack, William. 1998. Real Boys: Rescuing Our Sons from the Myths of Boyhood. New York: Henry Holt. Sutton-Smith, Brian, Jay Mechling, Thomas W. Johnson, and Felicia R. McMahon, eds. 1999. Children’s Folklore: A Source Book. Logan: Utah State University Press. Thorne, Barrie. 1993. Gender Play: Girls and Boys in School. New Brunswick, NJ: Rutgers University Press. Wolfenstein, Martha. 1954. Children’s Humor: A Psychological Analysis. Bloomington: Indiana University Press.
Juvenile Courts
Since the opening of the world’s first juvenile court in Chicago, Illinois, on July 3, 1899, these specialized courts have handled millions of cases of child abuse, neglect, and juvenile delinquency. Juvenile courts were initially designed to treat and control children and youth, especially wayward boys growing up in congested urban areas, and from the beginning their calendars have been dominated by delinquency cases, with boys’ cases three times as numerous as girls’. The vast majority of all delinquency cases have involved nonviolent offenses,
Three boys sit in juvenile court chambers, Denver, Colorado. (Bettmann/Corbis)
and probation has been the most common disposition in adjudicated cases involving boys (Rosenheim et al. 2001). With regard to juvenile offending, the foundational principle of the juvenile court is that children and adolescents are qualitatively different from adults. Accordingly, since boys are still developing, their cases should be diverted from the adult criminal justice system and instead should be handled by a less punitive and more flexible system that emphasizes rehabilitation over punishment and allows “room to reform” (Zimring 1982). The founders of the juvenile court relied upon the legal doctrine of parens patriae (the state as parent) to proclaim that
the court would promote the best interests of boys, and most American juvenile courts have operated as courts of equity rather than criminal courts. Courts of equity generally hear cases that require flexible remedies, such as divorces. Such courts became a distinctive part of the American legal system by the late nineteenth century. This designation of juvenile courts as noncriminal courts allowed judges to conduct informal proceedings in hearings closed to the public that rarely included lawyers, juries, or the rules of evidence required in criminal courts. Jane Addams, the cofounder of the famous social settlement Hull House and a leader in the juvenile
court movement, captured the spirit of this new legal philosophy of individualized justice when she observed: “There was almost a change in mores when the Juvenile Court was established. The child was brought before the judge with no one to prosecute him and with no one to defend him—the judge and all concerned were merely trying to find out what could be done on his behalf. The element of conflict was absolutely eliminated and with it, all notion of punishment as such with its curiously belated connotation” (Fagan and Zimring 2000, 18). By the 1920s, almost every state had adopted a juvenile or family court, and most major Western nations had created juvenile courts inspired by the American example.

During the second half of the twentieth century, there were three major developments in American juvenile justice. First, the U.S. Supreme Court mandated that juvenile courts conduct more formal proceedings in order to protect the rights of children and their families (In re Gault, 1967). This development contributed to juvenile court procedures becoming more like those in criminal courts and “brought lawyers into Juvenile Court with a vengeance” (Ayers 1997, 27). Second, since the passage of the federal Juvenile Justice and Delinquency Prevention Act in 1974, most states have removed the cases of “status offenders” from juvenile court. “Status offenders” are boys who have committed acts that are illegal only because of their status as minors. Such offenses include consuming alcohol, breaking curfews, and disobeying their parents. Third, as a response to a rapid rise in the serious and violent youth crime rate in the late 1980s and concerns that many boys were becoming remorseless “superpredators,” forty states between 1990 and 1996 passed laws making it easier to transfer boys’ cases out of the juvenile court and into criminal court (Fagan and Zimring 2000, 2). As a result of these three changes since midcentury, the modern American juvenile court resembles a criminal court, hears fewer cases of “status offenders,” and has lost some jurisdiction over the cases of serious and violent offenders.

The juvenile court movement of the late nineteenth and early twentieth centuries was part of a larger child-saving crusade that aimed to prolong and to protect the dependent status of children. Through truancy, compulsory education, and child labor laws that aimed to keep children off the city streets, in schools, and out of the labor market, these reformers used the power of the state to give boys more time to make the transition to adulthood. The founders of the juvenile court had in fact envisioned that the court, by removing boys from the adult criminal justice system, would be part of this larger effort to allow boys the time necessary to grow up. Child savers were concerned that the criminal justice system, especially its jails and prisons, transformed many innocent boys into hardened criminals. The penal reformer John P. Altgeld likened Chicago’s criminal justice system to “a great mill which, in one way or another, supplies its own grist, a maelstrom which draws from the outside, and then keeps its victims moving in a circle until swallowed in the vortex” (Tanenhaus 1998–1999, 7). Women’s organizations, such as the National Congress of Mothers and the General Federation of Women’s Clubs, and renowned juvenile court judges, such as Benjamin Lindsey of Denver and Julian Mack of Chicago, publicized how the criminal justice system harmed boys and, in this process, popularized the idea of separate
justice systems for juveniles. They believed that it was possible to save these maturing boys from a life of crime, and that the state had a moral responsibility, in the words of Julian Mack, “not so much to punish as to reform, not to degrade but to uplift, not to crush but to develop, not to make [the boy] an offender but a worthy citizen” (Fagan and Zimring 2000, 4). Once a city opened a juvenile court, many residents considered it a child welfare center that could provide essential social services, including medical clinics, recreational programs, and employment bureaus for boys of working age. In order to help boys develop into productive citizens, juvenile courts in urban areas were often equipped with probation departments. With probation as an option, judges now had the legal authority to allow boys to remain at home under supervision, instead of committing them to juvenile correctional institutions. This option not only gave the boy another chance to fly right, but also made it possible for probation officers to monitor the behavior of both the boy and his family. Through the use of probation, the power of state officials to police the social lives of predominantly working-class families increased dramatically in the early twentieth century, but understaffed probation departments and staggering caseloads diluted this potentially powerful instrument for social control. Juvenile courts committed many delinquent boys, especially older recidivists, to a variety of juvenile institutions such as reformatories and detention centers. In the early twentieth century, juvenile courts committed about 15 percent of all cases to reformatories, but the rate of incarceration varied widely within states and across the nation (Rothman 1980).
The rate of incarceration of boys fluctuated over the course of the twentieth century but remained well below that of probation. Juvenile courts also relied upon pretrial detention much more heavily than did the adult criminal justice system, and this disparity has led criminologists to describe the juvenile justice system as uniquely front-loaded, as many boys spent time in a secure custodial facility or detention center before their cases were heard. Thus, the incarceration of boys, whether in reformatories or detention centers, has been a central feature of American juvenile justice in the twentieth century. Critics have questioned the procedural fairness of juvenile courts since their creation but failed to alter the practice of juvenile justice until after World War II. In the postwar era, a new generation of reformers, including lawyers, focused attention on the procedural arbitrariness of American juvenile courts, setting the stage for the U.S. Supreme Court to deliver its landmark decision, In re Gault. The case involved the informal process by which an Arizona juvenile court committed fifteen-year-old Gerald Gault to a juvenile reformatory for an indeterminate sentence. Without the intervention of the Supreme Court, Gerald, who had been accused by a neighbor of making an obscene phone call, could have remained incarcerated until he turned twenty-one. An adult charged with the same offense in Arizona, however, would have been jailed for only two months or ordered to pay a $50 fine. The Supreme Court declared that the paternalistic rhetoric of juvenile justice no longer matched the reality of how these systems were working and that “neither the Fourteenth Amendment nor the Bill of Rights is for adults
alone” (Gault, 3). In order to protect the rights of children and their parents, the Supreme Court extended limited due process protections to minors in juvenile court, including the right to notice, counsel, confrontation, cross-examination of witnesses, and the privilege against self-incrimination. This victory for children’s rights, however, had the unintended consequence of changing the nature of juvenile justice. By altering “delinquency proceedings from an inquiry about social welfare into a criminal prosecution,” the Supreme Court inadvertently paved the way for states to make juvenile courts more like adult criminal courts (Feld 1999, 107). The removal of status offenders from juvenile court in the 1970s and 1980s also contributed to this process of criminalizing American juvenile justice because it emphasized that juvenile courts would primarily hear the cases of boys who were accused of committing criminal offenses.

The juvenile court has, however, retained at least some of its paternalistic ideals. The Supreme Court, for example, stopped short of requiring juvenile courts to provide all the same procedural protections as criminal courts. It declared that jury trials were not constitutionally required in juvenile court, for if they were required, they would “remake the juvenile proceeding into a fully adversary process and [would] put an effective end to what has been the idealistic prospect of an intimate, informal protective proceeding” (McKeiver v. Pennsylvania, 1971). Thus, according to the Supreme Court, the juvenile court had to strive to retain the ideals of parens patriae, while incorporating most of the due process protections accorded adults in criminal court. Critics have argued that this combination has “transformed the juvenile court from its original Progressive model as a social service agency into a deficient second-rate criminal court that provides young people with neither therapy nor justice” (Feld 1999, 15). Proponents of juvenile justice, however, contend that these courts should be preserved because they still offer boys more services and chances to succeed in life than the criminal justice system.

In response to a soaring rate of juvenile arrests for serious and violent offenses in the late 1980s and early 1990s, including the doubling of the juvenile murder rate, some criminologists argued that growing numbers of American boys were turning into “super-predators—radically impulsive, brutally remorseless youngsters, including ever more pre-teenage boys, who murder, assault, rape, rob, burglarize, deal deadly drugs, join gun-toting gangs, and create serious communal disorders” (Bennett, DiIulio, and Wilson 1996, 27). As a result of these fears and concerns about the juvenile justice system’s capacity to handle these cases of serious and violent offenders in a manner that protected public safety, in the 1990s almost every state passed legislation making it easier to transfer cases out of juvenile court and into criminal court (Zimring 1998, 1). These laws, which made it possible to try boys as adults, were also enacted in part to combat a predicted wave of juvenile crime in the early twenty-first century by this so-called growing breed of superpredators. As these new transfer laws were being enacted, however, the nation’s rate of youth violence had already begun to decline rapidly. This drop in the youth crime rate called into question the characterization of contemporary boys as superpredators as well as the
dire predictions about a coming storm of juvenile crime in the twenty-first century. In the 1990s, the juvenile court faced its greatest legitimacy crisis to date, including some calls for its abolition, but it survived as a unique and vital part of modern governance in every state of the Union and in most of the developed nations of the world. David S. Tanenhaus See also Juvenile Delinquency; Reformatories, Nineteenth-Century; Reformatories, Twentieth-Century References and further reading Ayers, William. 1997. A Kind and Just Parent: The Children of Juvenile Court. Boston: Beacon Press. Bennett, William, John DiIulio, and John Wilson. 1996. Body Count: Moral Poverty—and How to Win America’s War against Crime and Drugs. New York: Simon and Schuster. Fagan, Jeffrey, and Franklin E. Zimring, eds. 2000. The Changing Borders of Juvenile Justice: Transfer of Adolescents to the Criminal Court. Chicago: University of Chicago Press. Feld, Barry C. 1999. Bad Kids: Race and the Transformation of the Juvenile Court. New York: Oxford University Press. Manfredi, Christopher P. 1998. The Supreme Court and Juvenile Justice. Lawrence: University Press of Kansas. Rosenheim, Margaret K., Franklin E. Zimring, David S. Tanenhaus, and Bernardine Dohrn, eds. 2001. A Century of Juvenile Justice. Chicago: University of Chicago Press. Rothman, David J. 1980. Conscience and Convenience: The Asylum and Its Alternatives in Progressive America. Boston: Little, Brown. Tanenhaus, David S. 1998–1999. “Justice for the Child: The Beginning of the Juvenile Court in Chicago.” Chicago History 27: 4–19. Zimring, Franklin E. 1982. The Changing Legal World of Adolescence. New York: Free Press. ———. 1998. American Youth Violence. New York: Oxford University Press.
Juvenile Delinquency
Juvenile delinquency usually refers to any violation of local, state, or federal law by persons in the juvenile age group. The term was not used until the late nineteenth century. Before the introduction of juvenile court jurisdictions, it was not possible to charge a child under the age of seven (ten in Illinois) with a crime. Girls and boys up to the age of fourteen were criminally liable only if the court was of the opinion that they were responsible for their actions. If convicted, juvenile offenders prior to 1890 usually went into the same place of incarceration as adult criminals. In 1899, Illinois passed the first law defining juvenile delinquency. By 1933 forty-six states had passed juvenile court laws. Today all states have juvenile courts.

The major innovations of the 1899 law were introducing the juvenile age group as a legal category and applying to that group the principle of parens patriae. This principle gives the state supreme guardianship over all subjects unable to decide their fate for themselves and the right to intervene in family relations whenever a minor’s well-being is in danger. Parens patriae penalizes the whole juvenile age group. It defines any young person as being in need of supervision, care, and treatment. In effect, the intent to protect has led to reverse consequences. A common example is the court’s decision to institutionalize a lower-class youth in order to shield him from family conditions caused by alcoholic, irresponsible, or criminal parents. The new juvenile legislation removed the individual boy acting out from the purview of criminal justice, but made any boy’s family condition as well as his behavior a matter of concern and business for the juvenile court.
A sixteen-year-old after being apprehended following a holdup, 1943 (Bettmann/Corbis)
Juvenile delinquency was introduced at a time when the rapidly growing cities became centers of manufacturing and urban America attracted masses of immigrants. States began to pass anti–child labor laws and to require school attendance. Compulsory schooling and anti–child labor legislation created new sets of offenses. Playing truant and selling newspapers without official permission (obtained with a proper school attendance record) are examples. Boys in the cities who encountered more and more difficulties in finding work after school enjoyed large amounts of free time. As one official put it, boys were under supervision only six hours out of twenty-four. Children and youths
who were no longer allowed to work on a regular basis swarmed over the streets in the congested quarters of cities and appropriated the streets for play. Restricted to street space and their own bodies as place and means of play, the boys could not escape conflicting with either property rights in general or business use of urban space in particular. Playing baseball in city streets, committing petty thefts, vandalizing property, and generally being a nuisance were the usual delinquencies that brought urban boys before the juvenile court. So too was street fighting caused both by lack of play facilities and by traditional ideas of masculinity that made defending one’s territory against the incursions of youths
from other neighborhoods a requisite for male juveniles. According to contemporaries, in the 1920s in Manhattan and other parts of New York, there were at least three boy gangs or neighborhood groups between two street intersections in continuous warfare with their age group on the adjoining blocks. In these circumstances, the new juvenile court became the social mechanism for controlling juvenile conduct. From January 1920 to April 1933, prohibition was the law in the United States. It allowed for the creation of the first shadow economy in the cities. Shadow economies exploit the demand for illegal goods such as alcohol, drugs, or unlicensed firearms or illegal services such as gambling or prostitution. They offer high returns in money and power at low risk for most participants and at high risk only for some individuals. Prevalent in inner sections of large metropolitan areas lacking any effective social and cultural structuring of life, shadow economies have separated male juvenile delinquency into two subsections, the essential difference being participation or nonparticipation in a shadow economy. Thus, petty drug dealing by male juveniles has usually led to their familiarity with or involvement in serious crime and violence. The sociologist Frederic M. Thrasher first noticed that male adolescent gang members (under adult leadership) were involved in alcohol running in the 1920s and that gang membership as well as alcohol dealings provided positive social and financial rewards (Thrasher 1927). In more recent years, neighborhoods with high levels of transience, a high concentration of poverty, ethnic and racial segregation, a large percentage of unmarried mothers, single-parent families, and unemployed young male adults have great difficulties in resisting
shadow economies that offer some hold for youth and business for the community. The existence of shadow economies has resulted in locally higher levels of juvenile delinquency.

Since World War II, a larger number of boys than ever before remain in school through high school and enjoy substantial free time after classes end. Many of them work after school in order to afford an automobile. Cars have become the symbol of adolescent freedom and offer the means to date and have sex outside parental purview. The wide availability of cars has introduced new forms of juvenile delinquency such as car theft and “joy riding.” Automobiles have also set up new opportunities for boys to appropriate public space in ways that conflict with local law, such as by cruising down streets or congregating in parking lots. Similarly, the automobile has led to the creation of shopping centers and malls where youths increasingly spend their spare time meeting in groups unsupervised by adults.

In scope, juvenile delinquency ranges from antisocial conduct often associated with teenage boys and other low-level forms of misbehavior to more serious criminal offenses such as burglary and aggravated assault. A youth can be alleged to be delinquent for any illegal action for which an adult can be prosecuted, as well as for actions that are violations of law only because they are carried on by a young person whose status is that of a minor. Youths who are sexually active with others, who run away, who are truant, who disrupt or cut class, who drink alcohol or are disobedient (the juvenile court laws speak of boys as well as girls being ungovernable or incorrigible), and who violate curfews are examples of status offenders (so called because of the actor’s legal status as underage). Any of
these offenses would be perfectly legal for an adult. Status offenses have been less commonly prosecuted in recent decades. In 1990, there were some 87,000 cases of status offenses among the cases referred to the juvenile courts (Snyder et al. 1993). Most states set the upper limit of the juvenile age group at seventeen years of age, but exceptions are numerous. In some states the juvenile court will not consider youths younger than age ten or twelve as delinquents. A juvenile court also can retain jurisdiction until a young person reaches age twenty-one. The rate of delinquency is not easy to establish because many instances do not come to the knowledge of police and courts. The 1.35 million cases that were brought to the attention of the juvenile courts in 1990 mean that one of every twenty juveniles in the country has been alleged delinquent. Most laws allow juvenile courts to waive jurisdiction and refer a youth’s case to the criminal courts, where it will be tried as if committed by an adult. In 1990, 16,900 cases in the country were transferred to an adult court (or 1.3 percent of all referrals to juvenile courts). The very existence of such an option calls into question the basis for the juvenile justice system: that a juvenile’s age indicates some measure of nonliability and that immaturity disqualifies young persons from being punished in the same manner as adults. Making a youth as fully responsible as an adult for the most serious offenses is usually explained by reference to the boy in question (cannot be rehabilitated because of his character) or to the deed (its very seriousness makes it self-evident that it is criminal). In the past, most juvenile court laws provided for informal court procedures and generally ignored the necessity of legal representation or due process. Because the
legal prosecution of juvenile delinquents rests upon the belief that the court will act in the best interest of a youth, the framers of juvenile court laws did not see any necessity of protecting a young defendant’s constitutional rights. In the 1960s several U.S. Supreme Court rulings (especially In re Gault, 1967) extended some due process protection to juveniles. The rulings attempted to make delinquency proceedings as adversarial as criminal trials. Instead of acting to prevent future crime, the juvenile court now has to establish “legal guilt” beyond a reasonable doubt through the use of legal evidence and proof. The new procedural requirements also provide lower-class minority juveniles the same defense possibilities that members of their age group from better social backgrounds have always had. Those with prosperous parents possessed good access to lawyers who could negotiate with the police concerning youths’ antisocial behavior before it came to the juvenile court. With the Gault decision, lower-class youth gained the right of access to legal counsel.

Social class has figured in juvenile delinquency in other ways as well. Most juvenile court laws call simply for adjudicating the youngster in question as delinquent. The outcome of a juvenile court proceeding is not immediate but a kind of suspended punishment. Most cases are put on probation, which allows the youth in question a chance to adapt himself to society’s needs and requirements. For all its jurisdiction, the juvenile court cannot change the circumstances in which a boy grows up and that form his responses to destructive social forces. Consequently, although juvenile justice makes a young person less liable for his conduct than an adult, his ultimate treatment by the court depends on his resistance to destructive family patterns, peer relations, and community structures. Poorer boys face greater obstacles to changing their lives and behavior than do more affluent boys. Therefore, juvenile delinquency has always been strongly related to social class, with the majority of juvenile delinquents coming from lower-income families, especially those living in the inner cities. Surveillance by the police or citizens has been greater in impoverished, high-crime areas, which in practice discriminates against lower-status youth.

From their very beginnings, juvenile courts discriminated on the basis of gender because they defined what was called sexual delinquency very narrowly as premarital sex by an underage woman. Courts have usually ignored adolescent boys’ sexual activity unless it involved violence. Boys’ sexual actions have been treated as a normal variant of male youths’ propensity for risk taking and experimentation with adult behavior. As a rule, girls’ sexual activity was punished by the juvenile courts, usually by institutionalizing the female adolescents. Girls were singled out because the courts believed that law-obeying behavior for girls included not risking pregnancy or the spread of venereal disease. For both risks, boys were seen as not responsible, and their risk taking (unsafe sex) was considered only if it were an additional instance of being “incorrigible.”

In recent years, the greatest change in juvenile delinquency has been the rise in serious offenses committed by young people. The rate of violent crime has risen faster than that of delinquency in general. Those most likely to be charged with violent crimes are young males between the ages of fifteen and nineteen, a disproportionate number of whom are black. Because the number of young males in the fifteen- to nineteen-year-old age group will continue to expand for
some time, in all probability so too will violent crime. The most disturbing growth of juvenile violence has been in the number of male children and adolescents murdered and the number of teenage males arrested for murder. The main reason for this growth is the spread of shadow economies (such as the explosive growth of drug dealing since the 1980s) in conjunction with the sale of firearms. Today many gangs of young males, united more by ethnic than neighborhood ties, engage in drug sales. The growth in juvenile violence is also due to the dissolution of moral norms or other means of inner control and the failure of parents and other responsible adults to monitor adolescent male behavior. The rate of reported delinquencies remains higher among boys than girls. For serious offenses such as burglary or assault, boys exceed girls in juvenile arrests by a ratio of fifteen to one. Boys also tend to engage in delinquent behavior at an earlier age than girls do and are more likely than girls to take part in serious problem behavior for prolonged periods of time. Nonetheless, in recent decades there has been a significant narrowing of the gap between male and female rates of juvenile delinquency. For nonviolent offenses such as shoplifting or theft, the arrest ratio is now less than three boys for every girl. Most adolescent boys participate in problem behavior or juvenile conduct that risks being classified as delinquent without realizing that their conduct is dangerous to themselves or others. The difference between engaging in problem behavior and becoming delinquent is usually the difference between experimentation and habit. A thirteen-year-old boy who is caught shoplifting once will probably avoid getting into trouble again.
Nine- or ten-year-old boys who engage in similar delinquent acts are more likely to be headed for trouble than those with a later onset of problem behavior. Positive peer group relations and good social standing in school also make it easier to resist continued risk taking. Relations between parent and child as well as the neighborhood in which one grows up are closely connected with the development of delinquent behavior. Although this is true of both girls and boys, the greater incidence of male juvenile delinquency makes boys' relations with their parents and the community in which they live of special importance. The extent of violent behavior such as physical punishment in a family and the extent and adequacy of parental supervision help explain why problem behavior develops into delinquency. Exposure to violence within the home is an especially strong predictor of delinquency. Delinquent boys frequently have parents who deal with family crises poorly. Such parents neither maintain predictable family routines nor monitor their boys well. These mothers and fathers usually respond inconsistently to their sons' conduct and have only minimal aspirations for their children. Such families also are quite likely to live in impoverished and stigmatized communities where there is strong peer pressure for boys to join gangs. Nonetheless, teenage gangs exist today in affluent white suburbs as well, which indicates that peer pressure on boys to conform to antisocial behavior can occur in neighborhoods of widely different income levels.

Karl Tilman Winkler

See also Gangs; Juvenile Courts; Reformatories, Nineteenth-Century; Reformatories, Twentieth-Century; Runaway Boys
References and further reading
Dryfoos, Joy G. 1990. Adolescents at Risk: Prevalence and Prevention. New York: Oxford University Press.
Empey, LaMar T., and M. C. Stafford. 1991. American Delinquency: Its Meaning and Construction. 3d ed. Belmont, CA: Wadsworth.
Goldstein, Arnold P. 1990. Delinquents on Delinquency. Champaign, IL: Research Press.
Gullotta, Thomas P., Gerald R. Adams, and Raymond Montemayor, eds. 1998. Delinquent Violent Youth: Theory and Interventions. Vol. 9, Advances in Adolescent Development. Thousand Oaks, CA: Sage.
Lawrence, Richard. 1998. School Crime and Juvenile Justice. New York: Oxford University Press.
Musick, David. 1995. An Introduction to the Sociology of Juvenile Delinquency. Albany: State University of New York Press.
Platt, Anthony. 1977. The Child Savers: The Invention of Delinquency. 2d ed. Chicago: University of Chicago Press.
Quay, H. C., ed. 1987. Handbook of Juvenile Delinquency. New York: Wiley.
Ryerson, Ellen. 1978. The Best-Laid Plans: America's Juvenile Court Experiment. New York: Hill and Wang.
Schwartz, Ira M. 1989. (In)Justice for Juveniles: Rethinking the Best Interest of the Child. Lexington, MA: Lexington Books.
———, ed. 1992. Juvenile Justice and Public Policy. Lexington, MA: Lexington Books.
Snyder, H. N., et al. 1993. Juvenile Court Statistics 1990. Washington, DC: U.S. Department of Justice.
Thrasher, Frederic Milton. 1927. The Gang: A Study of 1,313 Gangs in Chicago. Chicago: University of Chicago Press.
Winkler, Karl Tilman. 1996. "Reformers United: The American and the German Juvenile Court, 1882–1923." Pp. 235–274 in Institutions of Confinement: Hospitals, Asylums, and Prisons in Western Europe and North America 1550–1900. Edited by Norbert Finzsch and Robert Juette. Publications of the German Historical Institute, Washington, DC. Cambridge: Cambridge University Press.
BOYHOOD IN AMERICA An Encyclopedia

Volume 2: L–Z
Contents

A-to-Z List of Entries
Volume 1: Entries A to K
Volume 2: Entries L to Z
Bibliography
Index
About the Editors
A-to-Z List of Entries

VOLUME 1, A–K

A
Abuse
Accidents
Adams, John
Adolescence
Adoption
African American Boys
Alger, Horatio
Allowances
Amusement and Theme Parks
Apprenticeship
Artists
Asian American Boys

B
Bar Mitzvah
Baseball
Baseball Cards
Basketball
Bicycles
Big Brothers
Bodies
Books and Reading, 1600s and 1700s
Books and Reading, 1800s
Books and Reading, 1900–1960
Books since 1960
Boxing
Boy Scouts
Boys' Choruses
Boys Town
Bullying

C
California Missions
Camping
Cars
Chinese American Boys
Circumcision
Civil War
Clothing
Clubs
Comic Books
Competition
Computers
Cowboys

D
Discipline
Disease and Death
Divorce
Douglass, Frederick
Drag Racing

E
Early Republic
Emerson, Ralph Waldo
Emotions

F
Farm Boys
Fathers
Fathers, Adolescent
Films
Fire Companies
Fishing
Football
Foster Care
4-H in the Midwest
Franklin, Benjamin
Fraternities
Frontier Boyhood

G
Gambling
Games
Gangs
Gold Rush
Graffiti
Grandparents
Great Depression
Guns

H
Holidays
Horror Films
Hunting

I
Ice Hockey
Illegal Substances
Immigrants
Indentured Servants
Intelligence Testing

J
Jefferson, Thomas
Jobs in the Seventeenth and Eighteenth Centuries
Jobs in the Nineteenth Century
Jobs in the Twentieth Century
Jokes
Juvenile Courts
Juvenile Delinquency

VOLUME 2, L–Z

L
Learning Disabilities
Left-Wing Education

M
Manners and Gentility
Masculinities
Masturbation
Melodrama
Mexican American Boys
Military Schools
Mothers
Muscular Christianity
Music

N
Nationalism and Boyhood: The "Young America" Movement
Native American Boys
Newsboys

O
Orphanages
Orthodontics

P
Parachurch Ministry
Performers and Actors
Pets
Photographs by Lewis Hine
Placing Out
Plantations
Poliomyelitis
Pornography
Portraiture
Poverty
Preachers in the Early Republic
Prostitution

R
Reformatories, Nineteenth-Century
Reformatories, Twentieth-Century
Revolutionary War
Rock Bands
Roosevelt, Theodore
Runaway Boys

S
Same-Sex Relationships
Schoolbooks
Schools for Boys
Schools, Public
Scientific Reasoning
Sexuality
Sexually Transmitted Diseases
Siblings
Skateboarding
Skiing
Slave Trade
Slavery
Smoking and Drinking
Sports, Colonial Era to 1920
Sports, 1921 to the Present
Suicide
Sunday Schools
Superheroes

T
Teams
Television: Cartoons
Television: Domestic Comedy and Family Drama
Television: Race and Ethnicity
Television: Westerns
Tennis
Theatre
Toys
Transitions (through Adolescence)

V
Vaudeville
Video Games
Violence, History of
Violence, Theories of
Vocational Education

W
Washington, Booker T., and W. E. B. Du Bois
World War II

Y
Young Men's Christian Association
Young Men's Hebrew Association
L

Learning Disabilities

Learning disabilities (LD) currently commands the largest share of research in the field of special education. The very vagueness of the term learning disabilities and the imprecision with which the label has been applied have led to concerns about etiology, diagnosis, and remediation. According to federal Public Law 94-142, a learning disability is "a disability in one or more of the following basic psychological processes involved in using language, spoken or written, which may manifest itself in an imperfect ability to listen, think, speak, read, write, spell, or to do mathematical calculations." LD is also a field in which three different professions—medical, educational, and psychological—are attempting to stake their claim to identifying and solving the problems associated with this spectrum of disorders. Definitional imprecision has led to a wide interpretation of the very meaning of learning disabilities. Because of the lack of a definitive labeling procedure, gender becomes an important variable in the process of identification of those labeled as having LD. Increasingly identified with the labels of attention deficit disorder (ADD) and attention deficit hyperactivity disorder (ADHD), LD is often viewed as a problem of squirmy little boys who can't learn to read because they can't sit still. The prevalence of boys labeled as having LD is both a manifestation and a reflection of larger gender trends in twentieth-century America.

The label learning disabilities is of relatively recent origin. Its initial usage came in 1963, when Samuel Kirk of the University of Illinois used it to identify children of normal intelligence who had trouble learning, particularly learning to read. However, this was not the first attempt at labeling this diverse group of problem learners. In the late 1930s, Alfred Strauss and Heinz Werner, German refugees then working at the Wayne County Training School in Northville, Michigan, identified a group of students who had trouble learning. They categorized these children as brain-injured. By 1947, Strauss and a new collaborator, Laura Lehtinen, published their seminal Psychopathology and Education of the Brain-Injured Child, which not only identified these individuals but also recommended strategies for remediation. The research emanating from Northville, led by such followers of Strauss as William Cruickshank and Newell Kephart, continued to categorize these children as brain-injured. Simultaneously, Samuel Orton, a University of Pennsylvania–trained neurologist, was examining children with reading difficulties in a variety of settings. Funded by grants from the Rockefeller Foundation, Orton hypothesized that these students suffered from developmental alexia, or dyslexia—an inability to read words due to
problems in the different hemispheres of the brain. This medical model of the disorder held prominence in the field until the 1960s, when Kirk "de-medicalized" the problem. Although Strauss and Orton set about to medically define a group of individuals with specific learning problems, they did not do so without regard to social factors. The social issues impinging on midcentury America helped shape the emerging category of learning disabilities and tie it to gender. The general development of special education and the specific development of learning disabilities are tied to trends within the larger field of general education. By the late 1930s, educators had developed a system of tracking, whereby students were placed in a rank-ordered curriculum designed to put them in appropriate vocational or educational settings upon high school completion. Usually, tracking reinforced existing race and class structures within American society. Twenty years later, education achieved paramount importance as the Cold War and the "space race" pointed to the need to train a highly literate population in order to fight off the encroachments of communism. With both tracking and the reforms initiated in the face of Soviet scientific achievement, those students who had problems in reading remained problematic, especially if they showed no overt signs of intellectual disability. Unable to classify them as retarded—the nomenclature was slowly changing from the earlier term feebleminded—psychologists and educators searched for both a label and a rationale for failure. Latching on to Strauss's medical model, they embraced the idea of "exogenous brain injury" and shaped it into the category of "learning disabilities." The variables of class and race were important factors in
the emerging definition of learning disabilities. Though not overtly established to provide remediation for white middle-class children, the category of learning disabilities did just that. By 1973, according to published journal article samples, over 96 percent of students with LD were white and over 90 percent were middle-class or above. Poor and minority students with similar educational problems were labeled as culturally deprived or educably mentally retarded, with the etiology assumed to be cultural or environmental. Presently, the medical model of LD has seen a revival with the conflation of ADD, ADHD, and the term learning disabilities itself. Increasingly, medical intervention—usually pharmacological in nature—has been seen as a major part of the answer to learning problems. The very conception of "learning disabilities" is therefore related to this struggle between proponents of organic etiology and those emphasizing its societal components. Intertwined with this debate is the issue of gender in LD—more specifically, why significantly larger numbers of boys are labeled, categorized, and remediated as "learning disabled." This has been assumed as a given since the initiation of the label, yet relatively little substantive research has been done on the subject, especially compared to that compiled on race and class and their relationship to learning disabilities. Two closely related questions can be generated from this analysis. First, why is there a gender discrepancy? Second, why is this gap considered unimportant or—to use a more interesting word—natural? The answers to these questions go far beyond a simple analysis of learning disabilities to shed light on broad issues of the meaning of education in twenty-first-century America.
The number of males enrolled in programs designed for those labeled as having LD has consistently ranged from 65 percent to 90 percent. These numbers have changed relatively little since Kirk applied the name to this range of educational disorders. There have been many reasons for the gender discrepancy, and they relate to both the imprecision of the term's definition and the social context of education in America. Most of the public discourse on this issue—especially from right-wing pundits such as John Rosemond—points to the referral process in which students are labeled and categorized as having LD as the cause for discrepancies. Boys are often referred because their inappropriate classroom conduct and hyperactive out-of-seat behavior inhibit their ability to learn, especially their ability to learn to read. Teachers, the primary agents of the referral process, tend to refer students with overt behaviors that not only inhibit learning but also disrupt the class environment itself. Boys make up the preponderant majority of this population. Behavior problems and learning problems often go hand in hand in referrals of male students. Conversely, girls labeled as having LD usually exhibit only learning difficulties, with few concomitant behavioral problems. Because they do not act out in a disruptive manner, girls are often overlooked in the referral process. This conflation of learning problems with behavioral disorders has been exacerbated by the recent trend to equate LD with ADD and ADHD. Learning disorders are manifested in inappropriate classroom behaviors—for example, the classic "hyperactive boy." This explanation for the large numbers of males labeled as having LD speaks to the broader issues of gender roles in
American education and society. But is this preponderance of males simply a social phenomenon? Is the skewing caused by teacher referrals based on classroom behavior problems? Many researchers have reached this conclusion, based particularly on the relationship among LD, ADD, and ADHD. But others, particularly Robert Nass, who take a more physiological approach, have come to different conclusions. They hypothesize that hormonal sex differences, differing maturation rates, and genetic factors place males at a higher risk for learning disorders. Nass also believes that girls may have a sex-based advantage in the acquisition of language. Still more tentative research points to differences in brain physiology based on sexual difference as a causal factor in male reading difficulty. Finally, researchers are examining the possibility that there is a gene that puts individuals at risk for reading difficulties and that this characteristic may be sex-linked. Are males therefore more "at risk" for learning disabilities? Or is the large number of males classified as having LD a function of referral bias and the conflation of behavioral disorders and learning problems? Answers are not easily available, especially to questions about such a vague and ill-defined disorder as learning disabilities. What is readily apparent, however, is that males as well as females suffer from an education "gender gap." When tied to the increasing medicalization of educational difficulties, the gap causes serious educational consequences for male students. The social implications of male LD have yet to be determined, but they are bound to affect educational policy in twenty-first-century America.

Steven Noll
References and further reading
Anderson, Kristen. 1997. "Gender Bias and Special Education Referrals." Annals of Dyslexia 47: 151–162.
Carrier, James. 1986. Learning Disability: Social Class and the Construction of Inequality in American Education. Westport, CT: Greenwood Press.
Franklin, Barry. 1987. Learning Disabilities: Dissenting Essays. New York: Falmer Press.
Nass, Robert. 1993. "Sex Differences in Learning Abilities and Disabilities." Annals of Dyslexia 43: 61–77.
Riordan, Cornelius. 1999. "The Silent Gender Gap: Reading, Writing and Other Problems for Boys." Education Week 19 (November): 46–49.
Sleeter, Christine. 1986. "Learning Disabilities: The Social Construction of a Special Education Category." Exceptional Children 53: 46–54.
Left-Wing Education

By the early twentieth century, boys were not only experiencing mainstream education, either public or private, but also were enrolled in schools and summer camps and read publications sponsored by left-wing political groups and individuals. First the Socialist Party and then the Communist Party and other radical groups established various forms of weekday as well as Sunday schools in order to educate boys as well as girls to oppose capitalism, absorb their particular political, social, and cultural views, or simply avoid the authoritarian structures of public schools. Such left-wing schools and publications began to disappear following World War II with the coming of the anticommunist red scare, although the summer camps continued in some form. While in existence, however, they offered an alternative educational experience, with varying degrees of success, for thousands of boys. Left-wing educational endeavors were not specifically organized for boys, since girls were equally welcome, but boys were considered to be somewhat more politically oriented and active, particularly in leadership positions.
specifically organized for boys, since girls were equally welcomed, but boys were considered to be somewhat more politically oriented and active, particularly in leadership positions. The Socialist Party of America was founded in 1901 and within the following two decades established about 100 English-speaking Sunday schools throughout the country, designed to supplement the public schooling of working-class boys and girls. Socialists had organized various schools before 1900, but it took the Socialist Party to launch more organized Sunday schools, starting in San Jose, California, in 1902, quickly followed by Chicago, Cincinnati, Newark, Los Angeles, New York City, Cleveland, and scattered smaller towns. The schools were written about in the Young Socialists’ Magazine, published by the Young People’s Socialist League (YPSL), the party’s youth wing, particularly in 1918, when schools reached their peak. But splits within the party, combined with government repression, soon led to their rapid decline, although some survived through the 1920s. The most active schools, in Rochester, New York, Milwaukee, and New York City, concentrated on criticizing capitalism, establishing a sense of community, and spreading an understanding of socialist principles and positions. Ranging in age from five to fourteen, the “little comrades” and “kiddie socialists” who attended were exposed to organized lessons dealing with economic subjects and socialist tenets as well as games, songs, plays, and festivals. There were texts specifically written for the schools, for example, Nicholas Klein’s The Socialist Primer and John Spargo’s Socialist Readings for Children. They also had songbooks, such as Josephine Cole’s Socialist Songs, Dialogues, and
Recitations, with tunes like "My Money Lies over the Ocean" and "Kid Comrade." Paralleling the socialist Sunday schools were Workmen's Circle Sunday schools, which began in 1906. Schools started by the Workmen's Circle, a Jewish fraternal society, initially used English and emphasized the teaching of socialism but within ten years had basically switched their focus to Yiddish-language instruction and Jewish culture, with socialism still part of the curriculum. Somewhat similar to the socialist Sunday schools in appeal and outreach were the libertarian- and anarchist-influenced modern schools based on the educational principles of Francisco Ferrer y Guardia. Executed by the Spanish government in 1909 for radical activities, Ferrer had promoted rational schools based on self-learning and limited adult authority. The Radical Library of Philadelphia, an anarchist branch of the Workmen's Circle, opened the first Modern Sunday School in 1910. Soon there were Sunday and even weekday schools in New York, Portland, Oregon, and throughout the country, most of them short-lived. The Modern School magazine, founded in 1912, promoted Ferrer's ideas and criticized the tyranny of public schooling. A summer camp opened in 1927 in Lincoln Park, New Jersey; it charged $3 a week and had pictures of Karl Marx and Friedrich Engels hanging in the dining room. In 1915 the weekday Modern School moved from New York City to Stelton, New Jersey, teaching eighty-odd pupils during its first year there. The most famous of the Ferrer full-time schools, it survived until 1953 under the auspices of the Modern School Association of North America. As the socialist and Ferrer schools limped along or began to fade, the newly emerging Communist Party, formed in
1919 out of the chaos of World War I and the destruction of the Socialist Party and spurred into existence by the emergence of the Soviet Union, developed a strategy for educating and influencing the young. Like the socialists and anarchists, communists desired to counter the conservative thrust of public schools while inculcating boys (and girls) with radical political ideas and challenging restrictive bourgeois family values. From early on, communists established sundry children's and youth organizations and camps, rather than formal schools, that served to supplement the public schools. The party founded the Young Workers League in 1922, which became the Young Pioneers of America (1926–1934), mostly comprising the younger children of party members. Affiliated publications, starting with the Young Comrade, helped to spread the word. There was also the Young Worker (1922–1936), written by and aimed at the party's youth wing. Articles covered events around the country and included poems such as "Song of Youth." In the late 1920s and throughout the 1930s, the Young Pioneers helped workers on strike and worked with other radical youth organizations, such as the International Workers Order (IWO) Juniors, the Workers' International Relief Scouts, and the Junior Liberators, all aspects of the Communist Children's Movement. Such organizations were designed to foster a revolutionary identity and mentality, initially separating both boys and girls from conservative family values and any particular ethnic identity. The Young Pioneers organized summer camps during the 1920s, open from two to six weeks a year, including the Workers' Children's Camp near Los Angeles in 1929—also supported by the Non-Partisan Jewish Workers' School, the Miners' Relief Scouts, Friends of Culture, and the
Finnish, Ukrainian, and Czech Labor Schools—and others in Chicago; Grand Rapids, Michigan; Boston; Philadelphia; and around New York state. Starting the day with an early-morning salute to the Red Flag, campers supplemented their outdoor activities by studying the class struggle and listening to stories from Hermynia zur Mühlen's Fairy Tales for Workers' Children and William Montgomery Brown's Science and History for Boys and Girls. Indeed, perhaps forty books of children's literature were published by the party between 1925 and 1950. After the Young Pioneers was dissolved in 1934, its place in boys' lives was taken by the junior section of the IWO, first organized in 1930. The communist-dominated IWO was a federation of ethnic fraternal benefit societies. The IWO organized after-school programs (art classes, music lessons, dance classes), the IWO Juniors, summer camps (such as Camp Robin Hood near Cleveland), and even drum-and-bugle corps and marching bands, emphasizing a radical cultural milieu with a sensitivity to ethnic family identities and values. Moreover, by December 1938 there were fifty-three IWO Jewish schools in New York City, including a kindergarten and three high schools, attended by more than 4,000 students, and others in Detroit, Los Angeles, Chicago, and Philadelphia, with an emphasis on Yiddish-language instruction and discussions of social issues. During World War II the IWO Juniors organized paper drives, raised money for the United Service Organizations (USO), and knitted socks for soldiers before being dissolved in 1944. Within the IWO, the Jewish sections conducted after-school programs and summer camps, and the Russian Mutual Aid Society had a nationwide system of after-school programs.
When communist boys reached sixteen, they were eligible to join the Young Communist League (YCL), organized in 1922 and designed to educate and socialize youth through the semimonthly Young Worker and diverse political activities. Members supported labor strikes and civil rights, sold the Daily Worker and Sunday Worker, became active in antifascist demonstrations in the 1930s, and established clubhouses; the league had perhaps 12,000 members in New York City alone. In 1943 the YCL became American Youth for Democracy (AYD), then the Labor Youth League (1949–1957), and finally the Du Bois Clubs (1965–1971). Membership in the YCL often served as a bridge into the Communist Party, for many members had developed valuable skills as political pamphleteers, speakers, and strike organizers. Left-wing summer camps, which were established early in the twentieth century and were scattered around the country, proved to be the most memorable and influential of radical activities for boys and girls. Assorted left-wing organizations established adult colonies and children's camps, such as the Workmen's Circle Kinder Ring in upstate New York, Highlander Folk School's Junior Union camp in the South, and the Sholom Aleichem Folk Institute's Camp Boiberik. One of the most active was Camp Kinderland, founded by communists in 1925 on Sylvan Lake in Dutchess County, New York, and eventually sponsored by the Jewish People's Fraternal Order of the IWO. Campers engaged in rural sports and political and cultural activities, and Yiddish classes were mandatory through the 1930s. Visiting dignitaries were greeted with dramatic presentations, dances, and songs. Some, such as Camp Unity in upstate New York, originated as adult
colonies but eventually became exclusively children's camps by the late 1930s. Camp Wo-Chi-Ca (Workers' Children's Camp), founded in 1936 in Port Murray, New Jersey, by both communists and others on the left, was less ethnically oriented than Camp Kinderland. Interracial (staff and campers) and initially heavily Jewish, Wo-Chi-Ca also attracted Puerto Rican, Italian, and other boys and girls by the late 1940s. Boys studied black culture and the labor movement while exercising and breathing fresh air. Political pressure closed the camp in the mid-1950s, when it merged with Camp Wyandot, another left-oriented camp. Camp Woodland (1939–1961) in the Catskill Mountains was less politically oriented and more attuned to local rural culture than Wo-Chi-Ca and Kinderland. It was organized by Norman Studer, a teacher at the Little Red School House in Greenwich Village and later director of the Downtown Community School. Having studied with John Dewey, Studer was more influenced by the Progressive education movement than by the Communist Party. The interracial camp attracted the children of the Old Left as well as liberals and for a while featured folk musician Pete Seeger as a music teacher. Boys experienced democratic, rural living and were heavily exposed to local craftspeople, storytellers, and musicians, with folk dancing and singing a vital part of camp life. Left-wing summer camps allowed boys to experience different cultures in a rural setting while being exposed to radical cultural and political influences, at least into the 1950s. Many of the left-wing schools, camps, and organizations had folded by the 1960s, generally due to the demise of the Old Left, which had been battered during the anticommunist movement of the 1950s. Camp Kinderland continued to operate, however, as did the Little Red School House and adjoining Elizabeth Irwin High School in Greenwich Village, keeping alive the flame of left-wing education into a new era. Throughout the century boys who attended left-wing schools and camps encountered a variety of educational experiences that politically and culturally countered mainstream influences. Whether or not such influences left a permanent mark, boys nonetheless learned folk songs and dances, studied radical political and cultural ideas, and in other ways practiced alternatives to the dominant value system.

Ronald D. Cohen

See also Camping

References and further reading
Avrich, Paul. 1980. The Modern School Movement: Anarchism and Education in the United States. Princeton: Princeton University Press.
Buhle, Mari Jo, Paul Buhle, and Dan Georgakas, eds. 1998. Encyclopedia of the American Left. 2d ed. New York: Oxford University Press.
Kaplan, Judy, and Linn Shapiro, eds. 1998. Red Diapers: Growing Up in the Communist Left. Urbana: University of Illinois Press.
Mishler, Paul C. 1999. Raising Reds: The Young Pioneers, Radical Summer Camps, and Communist Political Culture in the United States. New York: Columbia University Press.
Schrank, Robert. 1998. Wasn't That a Time? Growing Up Radical and Red in America. Cambridge: MIT Press.
Teitelbaum, Kenneth. 1993. Schooling for "Good Rebels": Socialist Education for Children in the United States, 1900–1920. Philadelphia: Temple University Press.
Little League
See Baseball
M

Manners and Gentility

[Illustration: John Singleton Copley portrays Daniel Commelin Verplanck in 1771 according to the behavioral ideals of gentility. (Metropolitan Museum)]

Boys learned manners and correct behavior in the eighteenth and nineteenth centuries by consulting time-honored manuals of advice. The behavioral ideals they contained can be traced back to the Italian Renaissance and such manuals as Baldassare Castiglione's Book of the Courtier, delineating personal and social behavior that was emulated in England by the 1620s at the court of Charles I. Aristocratic courtly behavior was repudiated during the English Civil War in the 1640s, and revived with the Restoration of the monarchy in 1660. Yet the court, especially after the Glorious Revolution of 1688, became less central in English culture than it previously had been. In the increasingly commercial nation, cultural activity came to focus on the town—the lively taverns, coffeehouses, clubs, and drawing rooms of London's newly fashionable West End. By the early eighteenth century, this coffeehouse culture of the urban bourgeoisie was idealized and diffused to provincial centers and the colonies through such periodicals as Tatler and Spectator. Joseph Addison, who wrote for both of them, hoped to create a culture of "politeness," convinced that the art of sociable, urbane, and pleasing conversation would generate a new moral authority and taste. As England's commercial economy prospered, this civility of manners was linked to the refinement of architectural style and material culture available to larger numbers of people through their purchase of consumer items. Tea tables, ceramic teapots, silver spoons, and looking glasses, as well as books of advice, inundated the American market by the 1760s, as the demand for items that indicated social merit spread to farmers, artisans, and even laborers and the poor. These behavioral ideals, however, also have been linked to the emergence of the public sphere, as private individuals engaged in rational and critical discourse in this forum of coffeehouses, clubs, and drawing rooms located between the intimacies of family life and the official state. As civil society expanded through commercial consumerism and voluntary association, a public opinion developed that led to political activity and debate, extending the boundaries of political participation.

Manuals advising boys on correct behavior for participation in the world dated to the beginning of printing in the English language in the fifteenth century. William Caxton printed the first Book of Curtsye of Lytyll John in 1477 in beautiful manuscript type. Following continental manuals of courtly manners, he provided maxims and instructions for young boys in the service of the "great." As the genre of these advice manuals developed, it took the form of letters from a father to his son, as in
423
Robert Aitken published the Chesterfield letters in Philadelphia, he took a similar view. In his volume, the racy pleasureoriented instructions are eliminated, and what remains could provide the kind of polish to finish the education of an American boy. The boy is given tips on achieving a genteel carriage, on maintaining cleanliness of person and dress, and on conversing with elegance and wit. He is taught how to do honors at table, how to propose a toast, and how to behave with grace toward those of lower social status. Yet the Aitken volume also includes Chesterfield’s wary and shrewd advice. A boy should learn to observe others, studying their foibles and susceptibility to flattery. Keeping his own impulsive behavior under strict control with good breeding, he could then gently flatter and play on the weaknesses of others to advance his own reputation in the world. No American colonial boy worked harder to train himself in the code of politeness than George Washington. In 1746, when he was fourteen, Washington copied 110 such “Rules of Civility & Decent Behavior in Company and Conversation” in his commonplace book, acquiring selfconsciously the disciplined self-restraint he later exemplified in his revolutionary leadership. The commands Washington copied focused on respect for others and correct behavior in a social hierarchy. “Let thy ceremonies in courtesy be proper to the dignity of his place with who thou converses,” he wrote, “for it is absurd to act the same with a clown and a prince.” Other rules taught bodily restraint; for example, “Shift not yourself in the sight of others nor gnaw your nails,” and “Shake not the head, feet, or legs; roll not the eyes; lift not one eyebrow higher than the other; wry not the mouth; and bedew no man’s face with your spittle by approaching too
424
Manners and Gentility
near him when you speak" (Reinier 1996). Although these particular rules were based on French maxims of courtly behavior, Washington also learned self-restraint and genteel taste from reading Joseph Addison's Spectator. Addison was a monarchist, but Americans tended to identify him with the Roman concept of citizenship expressed in his 1713 tragedy Cato, which the young Washington read in the company of Sally Fairfax, a neighbor he admired. "Turn up thy eyes to Cato!" Addison wrote:

There mayst thou see to what a godlike height
The Roman virtues lift up mortal man,
While good, and just, and anxious for his friends,
He's still severely bent against himself. (Fischer 1989)
Washington was so taken with the stoic virtue in public life exemplified by Cato that he ordered the play performed for his officers at Valley Forge during the American Revolution. Still later he included quotations from Addison's tragedy in his presidential papers. Although the code of politeness, as adapted by Americans, weathered their revolutionary shift in consciousness, by the end of the eighteenth century it was increasingly out of date. Expressed in a proliferation of novels that inundated the American market, British cosmopolitan culture began to shift from the rational self-discipline Washington admired to an emphasis on feeling. Cutting a figure in the town became less important than retreat from artificial convention and achievement of transparency in human relationships. The new model of refinement made human affection the basis of moral life and regarded those of great
sensitivity as morally virtuous. In The Man of Feeling (1771), Henry Mackenzie distinguished between sentiments of the heart and rational discourse of the head, and Samuel Richardson's Sir Charles Grandison (1753) created a new model of male virtue that was greatly admired by female readers. In the American context of republican citizenship and emerging democracy, however, such sentimental fiction seemed dangerously effeminate. Yet the emphasis on feeling would be retained for boys in a proliferation of manuals written by evangelical authors. Religious tracts and evangelical children's literature in the nineteenth century sought to instill in boys self-restrained behavior appropriate for a capitalist economy—cleanliness, industry, record keeping, and self-improvement. Feeling was channeled into spiritual experience and the effort to achieve "a new heart" through religious conversion. Children's literature, such as Anna Reed's Life of George Washington, written for the American Sunday School Union in 1829, admired the first president's regular habits and ability to hold great passion under restraint but attributed his hard-won virtue to the influence of his pious mother. It was she who had tamed the "manly superiority" of the young Washington to a "self-denying tenderness." Such nineteenth-century books on behavior, however, did not abandon the older emphasis on politeness. Gentility blended with evangelical religion would define respectability for the new middle class, forming the basis for Victorian culture.

Jacqueline S. Reinier

References and further reading
Brewer, John. 1997. The Pleasures of the Imagination: English Culture in the Eighteenth Century. New York: Farrar, Straus and Giroux.
Bushman, Richard. 1992. The Refinement of America: Persons, Houses, Cities. New York: Alfred A. Knopf.
Carson, Cary, Ronald Hoffman, and Peter J. Albert, eds. 1994. Of Consuming Interests: The Style of Life in the Eighteenth Century. Charlottesville: University Press of Virginia.
Fischer, David Hackett. 1989. Albion's Seed: Four British Folkways in America. New York: Oxford University Press.
Reinier, Jacqueline. 1996. From Virtue to Character: American Childhood, 1775–1850. New York: Twayne Publishers.
Richards, Jeffrey H. 1995. Mercy Otis Warren. New York: Twayne Publishers.
Masculinities

The term masculinities refers to the social roles, behaviors, and meanings prescribed for men in any given society at any one time. As such, the term emphasizes gender, not biological sex, and the diversity of identities among different groups of men. Although people experience gender as an internal facet of identity, masculinities are produced within the institutions of society and through daily interactions (Kimmel 2000). Much popular discourse assumes that biological sex determines one's gender identity, the experience and expression of masculinity and femininity. Instead of focusing on biological universals, social and behavioral scientists are concerned with the different ways in which biological sex comes to mean different things in different contexts. Sex refers to the biological apparatus, the male and the female—the chromosomal, chemical, anatomical organization. Gender refers to the meanings that are attached to those differences within a culture—to what it means to be a man or a woman. Although biological
sex varies very little, gender varies enormously. Gender takes shape only within specific social and cultural contexts. The use of the plural—masculinities—recognizes the dramatic variation in how different groups define masculinity, even in the same society at the same time, as well as individual differences. Although social forces operate to create systematic differences between men and women, on average on some dimensions, even these differences between women and men are not as great as the differences among men or among women. The meanings of masculinity vary over four different dimensions, and thus four different disciplines are involved in understanding gender. First, masculinity varies across cultures. Anthropologists have documented the ways that gender varies cross-culturally. Some cultures encourage men to be stoic and to prove masculinity, especially by sexual conquest. Other cultures prescribe a more relaxed definition of masculinity, based on civic participation, emotional responsiveness, and collective provision for the community's needs. The different definitions of being a man in France or among aboriginal peoples in the Australian outback are so far apart that they belie any notion that gender identity is determined mostly by biological sex differences. The difference between two cultures' versions of masculinity is often greater than the difference between the two genders. Second, definitions of masculinity vary considerably in any one country over time. Historians have explored how these definitions have shifted in response to changes in levels of industrialization and urbanization, position in the larger worldwide geopolitical and economic context, and development of new technologies. What it meant to be a man in
colonial America is quite different from what it meant in 1900 or what it might mean to be a man in the United States today. Third, definitions of masculinity change over the course of a person's life. Developmental psychologists have examined how a set of developmental milestones leads to differences in experience and expression of gender identity. Both chronological age and life stage require different enactments of gender. In the West, the issues confronting a man about proving himself and feeling successful will change as he ages, as will the social institutions in which he will attempt to enact those experiences. A young single man defines masculinity differently than a middle-aged father and an elderly grandfather. Finally, the meanings of masculinity vary considerably within any given society at any one time. At any given moment, several meanings of masculinity coexist. Simply put, not all American or Brazilian or Senegalese men are the same. Sociologists have explored the ways in which class, race, ethnicity, age, sexuality, and region all shape gender identity. Each of these axes modifies the others. Imagine, for example, two "American" men, one an older, black, gay man in Chicago, the other a young, white, heterosexual farm boy in Iowa. Would they not have different definitions of masculinity? Each is deeply affected by the gender norms and power arrangements of their society. If gender varies so significantly—across cultures, over historical time, among men and women within any one culture, and over the life course—then masculinity cannot be addressed as though it were a constant, universal essence common to all men. Thus, gender must be seen as an ever-changing fluid assemblage of meanings and behaviors, and the term masculinities must be used. Pluralizing the term acknowledges that masculinity means different things to different groups of people at different times. Recognizing diversity ought not obscure the ways in which gender definitions are constructed in a field of power. Simply put, all masculinities are not created equal. In every culture, men contend with a definition that is held up as the model against which all are expected to measure themselves. This "hegemonic" definition of masculinity is "constructed in relation to various subordinated masculinities as well as in relation to women," R. W. Connell writes (1987, 183). Erving Goffman (1963, 128) once described the process this way:

In an important sense there is only one complete unblushing male in America: a young, married, white, urban, northern, heterosexual, Protestant, father, of college education, fully employed, of good complexion, weight, and height, and a recent record in sports. . . . Any male who fails to qualify in any one of these ways is likely to view himself—during moments at least—as unworthy, incomplete, and inferior.

Definitions of masculinity are not simply constructed in relation to the hegemonic ideals of that gender but also in constant reference to each other. Gender is not only plural but also relational. Surveys in Western countries indicate that men construct their ideas of what it means to be men in constant reference to definitions of femininity. What it means to be a man is to be unlike a woman; indeed, social psychologists have emphasized that although different groups of
men may disagree about other traits and their significance in gender definitions, the "antifemininity" component of masculinity is perhaps the single dominant and universal characteristic. Gender difference and gender inequality are both produced through human relationships. Nancy Chodorow (1979) argues that the structural arrangements by which women are primarily responsible for raising children create unconscious, internalized desires in both boys and girls that reproduce male dominance and female mothering. For boys, gender identity requires emotional detachment from mother, a process of individuation through separation. The boy comes to define himself as a boy by rejecting whatever he sees as female, by devaluing the feminine in himself (separation) and in others (male superiority). Girls, by contrast, are bound to a pre-Oedipal experience of connection to the same-sex parent; they develop a sense of themselves through their ability to connect, which leads to a desire to become mothers themselves. This cycle of men defining themselves through their distance from and devaluation of femininity can end, Chodorow argues, only when parents participate equally in childrearing. It is possible to recognize gender diversity and still conceive masculinities as attributes of identity only. For example, gendered individuals bring all the attributes and behavioral characteristics of their gendered identity into gender-neutral institutional arenas. But because gender is plural and relational, it is also situational. What it means to be a man or a woman varies in different institutional contexts. Those different institutional contexts demand and produce different forms of masculinity. "Boys may be boys," Deborah Rhode comments cleverly, "but they express that identity differently in fraternity parties than in job interviews with a female manager" (1997, 142). Gender is thus not only a property of individuals, some "thing" one has, but a specific set of behaviors that are produced in specific social situations. And so gender changes as the situation changes. Institutions are themselves gendered. Institutions create gendered normative standards, express a gendered institutional logic, and are major factors in the reproduction of gender inequality. The gendered identity of individuals shapes those gendered institutions, and the gendered institutions express and reproduce the inequalities that compose gender identity. Institutions themselves express a logic—a dynamic—that reproduces gender relations between women and men and the gender order of hierarchy and power. Not only do gendered individuals negotiate their identities within gendered institutions, but also those institutions produce the very differences assumed to be the properties of individuals. Thus, "the extent to which women and men do different tasks, play widely disparate concrete social roles, strongly influences the extent to which the two sexes develop and/or are expected to manifest widely disparate personal behaviors and characteristics." Different structured experiences produce the gender differences often attributed to people (Chafetz 1980). For example, take the workplace. In her now-classic work, Men and Women of the Corporation (1977), Rosabeth Moss Kanter argued that the differences in men's and women's behaviors in organizations had far less to do with their characteristics as individuals than they had to do with the structure of the organization and the different jobs men and women held. Organizational positions "carry characteristic images of the kinds of people that should
occupy them,” she argued, and those who do occupy them, whether women or men, exhibited those necessary behaviors. Though the criteria for evaluation of job performance, promotion, and effectiveness seem to be gender-neutral, they are, in fact, deeply gendered. “While organizations were being defined as sex-neutral machines,” she writes, “masculine principles were dominating their authority structures.” Once again, masculinity— the norm—was invisible. For example, secretaries seemed to stress personal loyalty to their bosses more than did other workers, which led some observers to attribute this to women’s greater level of personalism. But Kanter pointed out that the best way for a secretary—of either sex—to get promoted was for the boss to decide to take the secretary with him to the higher job. Thus the structure of the women’s jobs, not the gender of the jobholder, dictated their responses. Joan Acker has expanded on Kanter’s early insights and specified the interplay of structure and gender. It is through people’s experiences in the workplace, Acker maintains, that the differences between women and men are reproduced and the inequality between women and men is legitimated. Institutions are like factories, and one of the things that they produce is gender difference. The overall effect is the reproduction of the gender order as a whole (see Acker 1987, 1988, 1989, 1990). Institutions accomplish the creation of gender difference and the reproduction of the gender order through several gendered processes. Thus, “advantage and disadvantage, exploitation and control, action and emotion, meaning and identity, are patterned through and in terms of a distinction between male and female, masculine and feminine” (Acker 1990, 274). It
is erroneous to assume that gendered individuals enter gender-neutral sites, thus maintaining the invisibility of gender-ashierarchy and specifically the invisible masculine organizational logic. However, it is just as incorrect to assume that genderless “people” occupy those genderneutral sites. The problem is that such genderless people are assumed to be able to devote themselves single-mindedly to their jobs, have no children or family responsibilities, and may even have familial supports for such single-minded workplace devotion. Thus, the genderless jobholder turns out to be gendered as a man. Take, for example, the field of education. The differences assumed to be the properties of boys and girls are often subtly—or not so subtly—produced by the educational institutions in which they find themselves. This process takes place in the structure of the institution itself— by having boys and girls form separate lines to enter the school through different entrances, separating boys and girls during recess and encouraging them to play at different activities, and tracking boys into shop and girls into home economics (as if boys would naturally want to repair cars and girls would naturally want to learn how to cook). It also takes place in the informal social interactions with teachers who allow boys to disrupt or interrupt classes more easily than girls or who discourage girls from excelling in science and math classes. And it takes place in the dynamics of the interactions among boys and girls as well, both in the classroom and outside (see Thorne 1983). Embedded in organizational structures that are gendered, subject to gendered organizational processes, and evaluated by gendered criteria, then, the differences between women and men appear to be the differences solely between gendered indi-
When gender boundaries seem permeable, other dynamics and processes can reproduce the gender order. When women do not meet the criteria (or, perhaps more accurately, when the criteria do not meet women's specific needs), we see a gender-segregated workforce and wage, hiring, and promotional disparities as the "natural" outcomes of already-present differences between women and men. In this way, differences are generated, and the inequalities between women and men are legitimated and reproduced. There remains one more element in this exploration of masculinities. Some psychologists and sociologists believe that early childhood gender socialization leads to gender identities that become fixed, permanent, and inherent in our personalities. However, many sociologists disagree with this notion today. As they see it, gender is less a component of identity—fixed, static—that people take with them into their interactions than the product of those interactions. In an important article, Candace West and Don Zimmerman argue that "a person's gender is not simply an aspect of what one is, but, more fundamentally, it is something that one does, and does recurrently, in interaction with others" (1987, 140). People are constantly "doing" gender, performing the activities and exhibiting the traits that are prescribed for them. Doing gender is a lifelong process of performances. As people interact, they are held accountable for displaying behavior that is consistent with gender norms, at least for that situation. Thus consistent gender behavior is less a response to deeply internalized norms or personality characteristics and more a negotiated response to the consistency with which others demand that they act in a recognizable masculine or feminine
way. Gender is less an emanation of identity that bubbles up from below in concrete expression than an emergent property of interactions, coerced from people by those around them. Understanding how people "do" masculinities, then, requires that we make visible the performative elements of identity and also the audience for those performances. It also opens up unimaginable possibilities for social change; as Suzanne Kessler points out in her study of "intersexed people" (hermaphrodites, those born with anatomical characteristics of both sexes, or with ambiguous genitalia):

If authenticity for gender rests not in a discoverable nature but in someone else's proclamation, then the power to proclaim something else is available. If physicians recognized that implicit in their management of gender is the notion that finally, and always, people construct gender as well as the social systems that are grounded in gender-based concepts, the possibilities for real societal transformations would be unlimited. (Kessler 1990, 25)

Kessler's gender utopianism raises an important issue. If people "do" gender, then gender is not only something that is done to them. They create and re-create gendered identities within the contexts of their interactions with others and within the institutions they inhabit.

Michael Kimmel

References and further reading
Acker, Joan. 1987. "Sex Bias in Job Evaluation: A Comparable Worth Issue." In Ingredients for Women's Employment Policy. Edited by C. Bose and G. Spitze. Albany: SUNY Press.
———. 1988. "Class, Gender and the Relations of Distribution." Signs: Journal of Women in Culture and Society 13.
———. 1989. Doing Comparable Worth: Gender, Class and Pay Equity. Philadelphia: Temple University Press.
———. 1990. "Hierarchies, Jobs, Bodies: A Theory of Gendered Organizations." Gender and Society 4, no. 2.
Acker, Joan, and Donald R. Van Houten. 1974. "Differential Recruitment and Control: The Sex Structuring of Organizations." Administrative Science Quarterly 19, no. 2.
Chafetz, Janet. 1980. "Toward a Macro-Level Theory of Sexual Stratification." Current Perspectives in Social Theory 1.
Chodorow, Nancy. 1979. The Reproduction of Mothering. Berkeley: University of California Press.
Connell, R. W. 1987. Gender and Power. Stanford: Stanford University Press.
Goffman, Erving. 1963. Stigma. Englewood Cliffs, NJ: Prentice-Hall.
Kanter, Rosabeth Moss. 1975. "Women and the Structure of Organizations: Explorations in Theory and Behavior." In Another Voice: Feminist Perspectives on Social Life and Social Science. Edited by M. Millman and R. M. Kanter. New York: Anchor Books.
———. 1977. Men and Women of the Corporation. New York: Basic Books.
Kessler, Suzanne J. 1990. "The Medical Construction of Gender: Case Management of Intersexed Infants." Signs 16, no. 1.
Kimmel, Michael. 2000. The Gendered Society. New York: Oxford University Press.
Rhode, Deborah. 1997. Speaking of Sex. Cambridge: Harvard University Press.
Risman, Barbara. 1999. Gender Vertigo. New Haven: Yale University Press.
Thorne, Barrie. 1993. Gender Play. New Brunswick, NJ: Rutgers University Press.
West, Candace, and Don Zimmerman. 1987. "Doing Gender." Gender and Society 1, no. 2.
Masturbation
Masturbation, the erotic stimulation of one's own genitals, is a persistent human
practice, especially among boys and young men. Although common in Western culture, masturbation has historically been discouraged by parents, religious leaders, physicians, and other child care professionals. Recently, some researchers and therapists have asserted that masturbation is part of normal development and may have positive benefits for children, youth, and adults alike (Christensen 1995). Yet masturbation remains a controversial practice that produces anxiety among many parents and generates intense debate among scholars, therapists, and policymakers. Because masturbation is usually a private, solitary act, it is difficult, if not impossible, to be certain about the precise extent of this behavior now or in the past. Surveys of sexual practice, which are widely used today, may not be representative and depend on individuals to be honest about a behavior that is typically ridiculed or censured by others (Okami and Pendleton 1994). Furthermore, there are virtually no survey data about masturbation before the twentieth century, so evidence about the practice in the past is usually anecdotal and inferential. According to Vern Bullough’s (1976) comprehensive review of sexual variance in history, masturbation was noted in many ancient cultures. Contrary to common opinion, however, masturbation is not addressed directly in the Bible. The passage usually associated with masturbation is Genesis 38:7–10, which describes the fate of Onan, who failed to meet his levirate obligations to inseminate his dead brother’s wife, engaging instead in coitus interruptus, thus spilling his semen on the ground. Another passage sometimes used to condemn masturbation is Leviticus 15:16–18, which declares as unclean anything touched by
semen and describes the process for ritual cleansing, but this passage refers to semen produced during intercourse and makes no direct reference to masturbation. Even so, masturbation was proscribed in medieval Jewish law, which forbade males even from holding their penises when they urinated, warning masturbators that their hands were "full of blood" (Phipps 1977, 184). Leaders of the early Christian church roundly condemned masturbation as a sinful, despicable act. Writing in the fifth century, St. Augustine rejected masturbation as an unnatural sexual act because it did not lead to procreation. Thomas Aquinas, preeminent medieval theologian, concluded that masturbation was a mortal sin justifying damnation, although his view was not universally accepted (Phipps 1977; Bullough 1976). Masturbation appeared regularly in early church penitentials, but the penalties assigned for this transgression were usually not as severe as those for bestiality or sodomy. Early Protestant leaders such as Martin Luther and John Calvin found masturbation equally objectionable and warned against it in their sermons and writings. Thus, before the eighteenth century, commentators in the West generally objected to masturbation on religious grounds and rarely if ever claimed that it was physically harmful to those who practiced it. This changed with the publication of an anonymous pamphlet authored by a British physician or cleric. Onania; or the Heinous Sin of Self-Pollution, and All Its Frightful Consequences, in Both Sexes, Considered (1724) defined masturbation as "that unnatural practice, by which persons of either sex may defile their own bodies, without the assistance of others, whilst yielding to filthy imaginations, they endeavour to imitate and
procure to themselves that sensation, which God has ordered to attend the carnal commerce of the two sexes for the continuance of our Species" (Anonymous 1724, 1). This description of masturbation was consistent with earlier ones, but the author of Onania went well beyond earlier statements when he identified an astonishing array of physical maladies that could be caused by this self-polluting act: stunted growth, "consumptions," "fainting fits," "epilepsies," "phymosis," "paraphymosis," "stranguries," "preiapisms," "and other disorders of the penis and testes, but especially gonorrheas," "premature ejaculation," "loss of erection," "infertile seed," and weak, sickly offspring with "meager jaws and pale looks, with feeble hams, and legs without calves" were just some of the many ills males could suffer if they masturbated (Anonymous 1724, 14–17). Readers of Onania who wished to prevent or eliminate this dangerous habit were warned, among other things, to avoid butter and not to sleep on their backs or spend too much time in bed. To avoid nocturnal emissions, males were instructed "to tie a string, when you go to bed, about your neck and the other end about the neck of your penis, which when an erection happens, will timely awaken you, and put an effectual stop to the seminal emission" (Anonymous 1724, 44). Onania proved to be very popular. By 1730, Onania was in its sixteenth edition and had grown from the original 60 pages to 194 pages, plus a 142-page supplement of personal letters and testimony on the topic (MacDonald 1967; Hare 1962). By 1750, Onania had appeared in nineteen editions and sold 38,000 copies (Bennett and Rosario 1995), partly because it had little competition. Cotton Mather, a leading New England minister, published
The Pure Nazarite: Advice to a Young Man, Concerning an Impiety and Impurity (Not Easily to Be Spoken of) Which Many Young Men Are to Their Perpetual Sorrow, Too Easily Drawn Into (1723), the first publication on masturbation in North America. Mather's work, though more moderate and empathetic in tone than Onania, remained relatively unknown outside New England. In 1724 an edition of Onania was published in Boston. Onania continued to dominate the field until 1758, when a second Onania appeared, this one by Samuel Tissot, a highly respected Swiss physician, devoutly Catholic, and adviser to the Vatican on epidemics (Phipps 1977). Perhaps in part because of his excellent reputation, Tissot's Onania sold very well and was translated into several languages. Although Tissot criticized the first Onania as disorganized and unsystematic, he actually expanded its long list of physical maladies attributed to masturbation. Moreover, he stressed masturbation's potential harmful effects on mental health, warning that it could lead to insanity. He cited one case in which a masturbator "dried out his brain so prodigiously that it could be heard rattling in his skull" (Spitz 1952, 495). Tissot's Onania was read well into the nineteenth century, and his insanity hypothesis influenced several writers, including Benjamin Rush, who identified masturbation as one of the causes of madness in his important work, Medical Inquiries and Observations upon the Diseases of the Mind (1812). Several explanations are possible for the growing secular concern about masturbation during the eighteenth century: the earlier onset of puberty and later age of marriage; fear of venereal disease; declining authority of the church and increasing individualism; emergence of print culture, which encouraged private fantasies; and the need to regulate sex "through useful and public discourses," in a rapidly changing social and cultural environment (Foucault 1980, 25). Whatever its sources, the public, secular discussion of masturbation that began in the eighteenth century intensified in the nineteenth and led to a remarkable antimasturbation campaign that persisted into the twentieth century. At first, those who spoke against masturbation in the nineteenth century, like Benjamin Rush, warned against both the physical and mental consequences of masturbation. Gradually, however, concern for the masturbator's mental health became more important in the discussion (MacDonald 1967; Hare 1962). Sylvester Graham (1794–1851), a noted American nutritionist, believed all forms of excessive sexual desire led to insanity, which in turn contributed to sexual excess, clearly a dangerous trap for masturbators who seemed to lack self-control (Bullough 1976, 543). Victorians pursued their campaign against masturbation with an amazing and revealing enthusiasm. The more they exposed this vice, the more threatening it seemed and the more severe the remedies they considered to prevent or eliminate it. By the 1870s, American children who masturbated risked being subjected to a horrific array of "treatments," including infibulation and clitoridectomy for girls, circumcision and even castration for boys, and physical restraints and genital cages for both (Phipps 1977). Even though these extreme measures were not commonly used, the fact that they were discussed indicates the intensity of the antimasturbation campaign and the deep anxiety of
those who supported it. As Rene Spitz notes, "in the eighteenth century physicians endeavored to cure masturbation," and "in the nineteenth century they were trying to suppress it" (Spitz 1952, 499). The campaign against masturbation may have reached its peak between 1880 and 1914 when social reformers, educators, physicians, religious leaders, and psychologists joined in the social purity movement to promote "the idea of salvation through renunciation" (Hunt 1998, 579). This movement drew much of its remarkable energy from an invigorated evangelical Christianity and the powerful antisexual feminism that emerged in the nineteenth century. However, the belief that masturbation caused insanity and serious physical maladies became increasingly difficult to sustain as the results of systematic investigations became more widely known. Havelock Ellis questioned the scientific basis for these claims in his Auto-Eroticism (1900), the first volume in his Studies in the Psychology of Sex. Furthermore, the social purity crusade soon dissipated in the aftermath of World War I, and opposition to masturbation reverted to more informal, less visible forms. Since 1920, attitudes toward masturbation have remained profoundly ambivalent, but they have reflected a growing awareness that masturbation is very common and that it does not, in fact, cause serious physical or mental harm. Although the masturbatory hypothesis continues to have some currency as folk knowledge, scholars and therapists have generally rejected it. After 1920, masturbation was more likely to be seen as an undesirable but normal behavior, especially among boys, and at worst a symptom of illness or weak moral character rather than their cause (Hunt 1998). Surveys of sexual attitudes
and behavior played an important role in shaping these perceptions of masturbation. Beginning with the post–World War II surveys by Leslie Hohman and Bertram Schaffner (1947) and by Alfred Kinsey, Wardell Pomeroy, and Clyde Martin (1948), virtually all (85–93 percent) males surveyed under the age of twenty-one reported they had masturbated, and the rates reported by females ranged from 33 to 64 percent (Leitenberg, Detzer, and Srebnik 1993). One illustration of how much attitudes about masturbation have changed during the twentieth century is the fact that masturbation has been used for some time as a technique in sex therapy to treat a range of sexual problems in both males and females (Christensen 1995). Indeed, some therapists have argued that masturbation is in some respects preferable to intercourse. The legitimate fear of acquired immunodeficiency syndrome (AIDS) and other venereal diseases has no doubt made this argument more convincing. Masturbation is now viewed more positively than it was in the past, and few would wish to return to the antimasturbation hysteria of the nineteenth century. Yet some scholars and commentators object strenuously to using masturbation as a therapy technique or promoting it among youth. They often point to the narcissism inherent in masturbation and question whether it promotes or inhibits the capacity for mature sexual intimacy, which necessarily requires mutuality and sharing. Clark Christensen argues that recommending a program of masturbation for someone in a troubled relationship "may be similar to telling alcoholics to find time by themselves and indulge in drinking." The fact that masturbation "occurs with great frequency in the adult and adolescent population," he concludes,
"does not mean that it need be prescribed carte blanche by professionals as therapeutic" (1995, 95). Robert C. Solomon has raised a related but different objection. He sees masturbation as symptomatic of the modern tendency to focus on sexuality as private enjoyment. He observes that although we "enjoy being sexually satisfied," we "are not satisfied by our enjoyment." Why, he asks, would any human activity so "intensely promoted and obsessively pursued" as sex is today not produce greater "gratification" (Solomon 1974, 341)?

N. Ray Hiner

See also Bodies; Sexuality

References and further reading
Anonymous. 1724. Onania; or the Heinous Sin of Self-Pollution, and All Its Frightful Consequences, in Both Sexes, Considered. 10th ed. Boston: John Phillips.
Bennett, Paula, and Vernon A. Rosario II, eds. 1995. Solitary Pleasures: The Historical, Literary, and Artistic Discourses of Autoeroticism. New York: Routledge.
Bullough, Vern L. 1976. Sexual Variance in Society and History. Chicago: University of Chicago Press.
Christensen, Clark. 1995. "Prescribed Masturbation in Sex Therapy: A Critique." Journal of Sex and Marital Therapy 21 (Summer): 87–99.
Ellis, Havelock. 1900. The Evolution of Modesty; the Phenomena of Sexual Periodicity; Auto-Eroticism. Philadelphia: E. A. Davis.
Foucault, Michel. 1980. The History of Sexuality. Vol. 1: An Introduction. New York: Vintage.
Hare, E. H. 1962. "Masturbatory Insanity: The History of an Idea." The Journal of Mental Science 108 (January): 2–25.
Hohman, Leslie B., and Bertram Schaffner. 1947. "The Sex Lives of Unmarried Men." American Journal of Sociology 52 (May): 501–507.
Hunt, Alan. 1998. "The Great Masturbation Panic and the Discourses of Moral Regulation in Nineteenth- and
Early Twentieth-Century Britain." Journal of the History of Sexuality 8 (April): 575–615.
Kinsey, Alfred C., Wardell B. Pomeroy, and Clyde E. Martin. 1948. Sexual Behavior in the Human Male. Philadelphia: W. B. Saunders.
Leitenberg, Harold, Mark J. Detzer, and Debra Srebnik. 1993. "Gender Differences in Masturbation and the Relation of Masturbation Experience in Preadolescence and/or Early Adolescence to Sexual Behavior and Sexual Adjustment in Young Adulthood." Archives of Sexual Behavior 22 (April): 87–98.
MacDonald, Robert H. 1967. "The Frightful Consequences of Onanism: Notes on the History of a Delusion." Journal of the History of Ideas 28: 423–431.
Mather, Cotton. 1723. The Pure Nazarite: Advice to a Young Man. Boston: T. Fleet for John Phillips.
Okami, Paul, and Laura Pendleton. 1994. "Theorizing Sexuality: Seeds of a Transdisciplinary Paradigm Shift." Current Anthropology 35 (February): 85–91.
Phipps, William E. 1977. "Masturbation: Vice or Virtue?" Journal of Religion and Health 16: 183–195.
Porter, Roy. 1995. "Forbidden Pleasures: Enlightenment Literature of Sexual Advice." Pp. 75–98 in Solitary Pleasures: The Historical, Literary, and Artistic Discourses of Autoeroticism. Edited by Paula Bennett and Vernon A. Rosario II. New York: Routledge.
Solomon, Robert C. 1974. "Sexual Paradigms." The Journal of Philosophy 71 (June): 336–345.
Spitz, Rene A. 1952. "Authority and Masturbation: Some Remarks on a Bibliographical Investigation." The Psychoanalytic Quarterly 21 (October): 490–527.
Melodrama
[Illustration: A poster for the melodrama The Two Orphans, ca. 1879 (Library of Congress)]
In turn-of-the-twentieth-century New York City, the most popular form of theater was melodrama. Boys who worked during the day flocked to the theaters at night to see their favorite hero defeat the
villain and save the heroine. For anywhere between 10 and 30 cents they could escape from their dreary lives and enter a world of excitement, suspense, tears, and laughter. In contrast to today's older theatrical audiences, in those days young men fought to enter the doors of the melodrama houses. In the first half of the nineteenth century, French and English melodramas had been adapted and imported, but beginning in midcentury, American playwrights put their own stamp on the form with local characters and settings. By the end of the century, fed by the enormous waves of immigrants, American audiences were growing by leaps and bounds. Between 1890 and 1910, the population of New York City skyrocketed 126.8 percent (Rosen 1982). As urbanization and industrialization changed the face of society, men and women, young and old, worked outside the home earning enough money to afford inexpensive amusements. Producers, ever poised to exploit a new market, lowered the price of theater tickets considerably in order to attract these working-class audiences. In this way the ten-twenty-thirty movement, known more familiarly as the "ten, twent', thirt'," was born. Originally a reference to ticket prices, the term came to encompass resident stock companies, various types of touring companies, and hundreds of plays, mostly melodramas, written expressly for this type of theater. Producers hired playwrights to churn out melodramas by the dozens, which were often performed in New York before being packaged, promoted, and sent on tour to cities throughout the country. These nationwide theaters formed a large "circuit" or "wheel," yielding fortunes for the producers despite the cheap admission.
New York theaters exhibited a "Broadway-Bowery polarity" separating middle-class entertainments, which often boasted European plays and stars, from the lower-class, homegrown melodrama theaters (Sante 1991). The term "Broadway" referred to the theaters farther uptown, whereas "Bowery" implied downtown or Lower East Side theaters in the immigrant neighborhoods (and several in Brooklyn as well). The higher-class theaters charged up to $2 per ticket, as opposed to the "popular-priced" theaters, some of which included the Grand Street Theater, the Academy of Music and the Fourteenth Street Theater (both on Fourteenth Street), the Star Theater (Thirteenth Street), and the Grand Opera House (Twenty-third Street). These theaters catered to the needs of their public, who demanded cheap, thrilling, escapist entertainment. As Daniel Gerould explains in American Melodrama (1983), this genre is well suited to the American mentality because of its adherence to poetic justice. In melodrama ordinary people become heroes, defeating evil in every form. There are no limits to what the individual can do—nothing is impossible. Despite poverty and all manner of handicaps, the pure and the innocent triumph in the end. For the immigrant population, many of whom lived in squalor and worked under unbearable conditions, the melodrama theater offered the hope and optimism that did not exist in their own lives. Owen Davis, one of the most popular and successful of the ten, twent', thirt' playwrights, explained that since much of his audience did not speak English, his plays had to appeal to the eye rather than the ear. They did so by making use of stock characters and spectacle. The characters were distinct types, clearly recognizable
by every audience member because of their physical appearance, gestures, and behavior. These included the hero and heroine, the comic sidekicks, and the villain (who could be male or female; some plays had two villains—one male and one female). Davis and Theodore Kremer, another prolific playwright of the time, attested to the fact that they always wrote with their audience in mind. Davis, a Harvard graduate, would sit up in the cheapest gallery seats of the melodrama theaters, studying audience reactions. Kremer would read his plays to his barber, butcher, and bootblack to get their responses before submitting the play to his producer. They dared not change the melodrama formula for fear of upsetting their audiences, who were more dogmatic than religious fanatics. This formula mandated trouble in the first act, a heroine at the mercy of the villain in the second, a courageous rescue in the third, and a reunion of the lovers and punishment for the villain in the fourth. One of the reasons for the popularity of these plays was that the audiences recognized themselves in the characters onstage. In its heyday during the first decade of the century, the ten, twent', thirt' theater featured the Irish, Jews, Italians, and Asians, as well as types like the "Bowery B'hoy" and "G'hal" speaking in the familiar dialects heard on the streets. In Scott Marble's play The Sidewalks of New York, we find an example of the latter couple—Tacks and his girlfriend Jane, who eventually end up together after alternating between verbal battles and attempts to save the hero and heroine. Tacks, who sells newspapers, hopes to earn enough money to convince Jane to marry him, but she is determined to assert her independence. In a short interchange in the third act he asks her:
Tacks: Ain’t I engaged? Jane: You’re altogether too inferior to grasp the duties of the engager. Tacks: Den you won’t be me wife? Jane: How dare you sir: Asking me to be your wife. That is my place now, when I’m ready. I may ask you to be my husband. I am the new woman. [She exits] Tacks: Hully gee: What’s to become of us men; suppose some old dame gets mashed on me and says I want you for my husband. Den I refuse an get arrested. Love scenes alternated with comedy and suspense to keep the spectators off-balance and hungry for more. Unfortunately, the majority of plays written for the ten, twent’, thirt’ theater have never been published. Characters like Tacks appealed to the newsboys who lived and worked in the Bowery area around the turn of the twentieth century. They were among the chief occupants of the gallery seats. With the expansion of cities, there was a need for “newsies” to sell papers to homebound commuters. Hundreds of thousands of these boys, mostly between the ages of eleven and fifteen, hawked their papers in U.S. cities (Nasaw 1985). Sometimes as an incentive they were given theater tickets by circulation managers who found themselves competing to fill an ever-increasing need for newsboys. These young men were independent wage earners who quickly learned to capitalize on the headlines of the day. Often the galleries of the theaters were so crowded and noisy that a policeman was hired to keep order. These boys were demanding and difficult to please, and the success of the production depended on their response, which was vocal. They
were the ones who hissed at the villain and cheered for the hero. Newspaper clippings of the time assert that actors were painfully aware of the importance of pleasing these “gallery gods,” who could shout them off the stage if they were unsuccessful. Charles T. Aldrich claimed that instead of finding his stage reality from the lines, he first had to convince the gallery and ended by believing in himself. Actress Lottie Williams was praised as a favorite with the gallery, who reduced more eminent players to “fear and trembling.” The famous producer A. H. Woods felt that if his productions did not please the gallery, they were “no go.” Although these audiences were often criticized for their lack of taste and discernment, they kept the melodrama alive. Called the “Peter Pans of stageland” by Porter Emerson Browne (1909) in Everybody’s Magazine, they got more value for the few cents they paid than any higher-class audience. Some local boys even found their way into the plays as “supers,” or supernumeraries (extras). As an actor in the Metropolis Theater in the Bronx, Frank J. Beckman (1962) and several other neighborhood youths appeared in the plays in a variety of nonspeaking roles, including cowboys, newsboys, soldiers, sailors, convicts, bootblacks, messengers, and even cannibals. For Owen Davis’s prison melodrama, Convict 999, Beckman and his friends dressed in convicts’ stripes and chains and walked along the streets of their neighborhood, wearing placards advertising the play. The other element of melodrama that attracted its large, uncultured audience was its use of spectacle. The stage not only mirrored its patrons but also reflected violent news events and disasters that befell them. It is here that the job of
the stage carpenter became essential to the success of the production. Because this was a theater based on visual attractions, special effects and frequent scene changes were expected by every member of the audience. It was one of the most important selling points of melodrama. Posters in various sizes and colors advertising the “sensation scene” (the most dramatic event of the play: an escape, a murder, or a daring rescue) were made up even before the script was written and pasted on every available billboard. Newspaper advertisements listed the new scenic effects in each production. Ben Singer (1992) points out that the dangers of everyday life, recounted in the headlines of the day, became the subject of melodrama: terrible injury and deaths caused by electric trolleys and railroads, factory machinery, the hazards of tenement life, and falls from buildings and bridges. The thrills of melodrama aped the violence of everyday life, presented onstage in as realistic a setting as possible. Actors were often injured in these productions because they had to perform their own stunts. The son of Laurette Taylor, whose husband Charles A. Taylor starred her in his melodramas early in her career, speaks of her brush with death in attempting to enact some of the perils of the heroine. Thus in melodrama the young men of the time saw themselves and their lives represented onstage, spiced with comedy skits and songs (between acts as “olios” or even during the plays), and were assured of a happy ending with justice for all. Yet despite the popularity of these plays, the ten, twent’, thirt’ movement declined and disappeared by the end of the first decade of the twentieth century. Among the several explanations, the most weighty is competition with a new
medium: film. Beginning in 1905, film exhibitors took over converted storefronts, dance halls, and restaurants, where for the admission price of 5 cents, they showed a program of short films, many of which borrowed their subjects from melodrama. By 1910 these "nickelodeons" numbered more than 10,000 nationwide, taking over the very theaters to which melodrama audiences had flocked several years earlier. Their patrons were primarily children and young people. David Nasaw (1985) points out that youngsters accounted for one-quarter to one-half of the audience of this "creation of the child." Each day they were offered a different program, which often, like melodrama, reflected their lives—the earliest ones were documentaries of everyday events. Although theatrical presentations took hours, programs shown at nickelodeons were so short (half an hour) that newsies could see them during the day or right after work. Films were more immediate and more realistic, and the audience did not have to wait for scene changes. In addition, because they were silent, there was no need to understand English. These "nickel dumps" went out of their way to attract young audiences by offering incentives like gum or two admissions for a nickel on Saturdays. Before long, the gallery gods deserted the melodramas for the movies, where they remain to this day.

Barbara M. Waldinger

See also Films; Newsboys; Performers and Actors; Theatre; Vaudeville

References and further reading
Beckman, Frank J. 1962. "The Vanished Villains: An Exercise in Nostalgia." Unpublished manuscript, Billy Rose Theater Collection, New York Public Library at Lincoln Center.
Browne, Porter Emerson. 1909. "The Mellowdrammer." Everybody's Magazine (September): 347–354.
Davis, Owen. 1914. "Why I Quit Writing Melodrama." American Magazine (September): 28–31.
———. 1931. I'd Like to Do It Again. New York: Farrar and Rinehart.
Gerould, Daniel. 1983. American Melodrama. New York: Performing Arts Journal.
Goodman, Jules Eckert. 1908. "The Lure of Melodrama." Bohemian Magazine (February): 180–191.
Marble, Scott. 189-. "Daughters of the Poor." Unpublished manuscript, Billy Rose Theater Collection, New York Public Library at Lincoln Center.
Nasaw, David. 1985. Children of the City: At Work and at Play. New York: Oxford University Press.
Rosen, Ruth. 1982. The Lost Sisterhood: Prostitution in America, 1900–1918. Baltimore: Johns Hopkins University Press.
Sante, Luc. 1991. Low Life: Lures and Snares of Old New York. New York: Farrar, Straus and Giroux.
Singer, Ben. 1992. "A New and Urgent Need for Stimuli: Sensational Melodrama and Urban Modernity." Paper presented at the Melodrama Conference, British Film Institute, London.
Taylor, Dwight. 1962. Blood-and-Thunder. New York: Atheneum.
Mexican American Boys
[Illustration: A Mexican American boy, Chamisal, New Mexico, late 1930s (Library of Congress)]
Mexican boys in the United States in the 1920s and 1930s existed in a cultural crucible. Whether they were the children of immigrants who journeyed north in the great migration of the 1920s to the agricultural fields or the city of Los Angeles, California, the descendants of New Mexican families dating back to the Spanish and Mexican eras, or the sons of once-independent cattle ranchers whose lands and status had been alienated to Anglos, Mexican boys were being exposed to American popular culture, education, and the system of wage labor. While their
parents strained to maintain their and their children's Mexican ways, the modern modes of the Southwest proved attractive. At the same time, however, poverty and discrimination allowed for only paltry participation in consumer culture or educational advancement. In addition, adults in these very diverse Mexican communities of the Southwest sought not to assimilate to American culture or identity but rather to fortify their regional variety of Mexican culture in New Mexico or Texas or to rebuild what they understood to be the traditional ways of the interior of Mexico from which they had migrated to California. Mexican boys living in the United States in the interwar decades thus experienced contrary messages and prescriptions about what they could, would, and should be like. No other phrase is more central to the ideal form of Mexican family practice than respeto y honor. This refers to the respect a father should give his wife and children in return for their honoring him and submitting to his authority, will, and protectiveness. Children are to respect all elders, and the eldest brother must be obeyed and respected to the point that he becomes like a second father. The various Mexican regional cultures all evolved elaborate customs by which that deference was shown. Children, for example, always addressed their elders with the formal usted, never the informal tú. Although the realities of family life diverged in various degrees from the ideal, the good child was one who conformed to the prescribed social role. The Mexican boy was to learn dutifulness first to father and mother and then to wife and children, the latter accompanied by sovereignty over them. At the same time, he was to learn submission to the church and, if he was landless, to lord or patron. Traditional education, understood as the learning of these norms and conducted under the purview of family and church, provided the socialization of boys. The 1920s, however, witnessed profound change in the matter of education on both sides of the border. As the postrevolutionary regimes of Mexico sought to replace church schools with secular ones sponsored by the federal government (a policy that sparked the bloody Cristero Revolt of 1927), Mexican children in the United States also encountered the complexities of state-sponsored education. In the north, however, their experience varied from place to place because school districts in the United States exercised local control over education.
Consistent in intent but varying in intensity and success, the public schools sought to purge Mexican boys of their putatively inappropriate cultural traits, although not necessarily to replace them with values of American democracy and citizenship. For example, Stanford professor Lewis M. Terman (1877–1956) argued that the intelligence quotient (IQ) test he promulgated and popularized "told the truth. These boys," he said, "are uneducable beyond the merest rudiments of training. . . . They represent the level of intelligence which is very, very common among Spanish-Indian and Mexican families of the Southwest and also among Negroes. Their dullness seems to be racial, or at least inherent in the family stocks from which they came" (quoted in Gould 1981). Such notions dominated the policies of school districts entrusted with the education of Mexican American children and circumscribed their experiences in American schools. Boys' education was overwhelmingly limited to vocational instruction and the rudiments of arithmetic and the English language. Certainly, though, the educational experiences of Mexican boys varied widely. In the agricultural colonias of California, schooling was at best irregular as families moved from place to place. In settled rural communities and in the urban schools of El Paso, Texas, children found themselves segregated into "Mexican schools," which were underfunded and understaffed. In other urban centers, such as Los Angeles, Mexican boys could find themselves in classrooms in which they were a strong majority or ones in which they mixed with Anglo, Jewish, Japanese, and other immigrant children. School-sponsored programs in Los Angeles did provide inexpensive lunches, which included
pasteurized milk, and instructions for parents in hygiene, both of which enhanced children's health. In New Mexico, where descendants of eighteenth- and nineteenth-century settlers were the majority in many counties, Latino boys, usually monolingual in Spanish, attended underfunded schools only for a very few years. In Texas, where insistence on Americanization waxed in the years after World War I, schooling in Mexican-majority counties centered more on instruction in the English language than on substantive subject matter. Ultimately, educators' negative assumptions about Mexican boys' intelligence, tests that were culturally biased against them, and the boys' need to work conspired to produce very high dropout rates, especially in the rural schools. Although some Mexican boys did complete school and learn to read in English, most achieved only a modicum of literacy. In the agricultural regions of Texas and California, a boy's life in an immigrant Mexican family typically revolved around picking farm crops and travel. As young as five years of age, he learned his skills at his parents' side in the fields, and what he picked contributed to the family's total earnings. This was a boy's life rather unlike the images middle-class Americans associate with childhood, which is ideally a time of play, innocence, and freedom from the drudgery of work. His formal schooling was at best spotty, but his education had to do with learning the requisite skills for officially joining a picking crew at age sixteen. However, if either or both parents had steady jobs in the city, boys could attend school and from such odd jobs as selling newspapers could earn enough money for their two favorite pastimes, movies and sports. Social work surveys done in the
1920s and 1930s reveal that Mexican children went to the movies once or twice per week. In the early years of the “talkies,” Mexican boys liked such actors as James Cagney (the “tough guy”), Tom Mix (the “cowboy”), and Joe E. Brown (the “wise-cracking comedian”) and such movies as the famous gangster epic Scarface (1932). Although the public schools often had limited success in the Americanization of Mexican boys, such movies introduced them to new modes of appearance and behavior in the urban north as interpreted by Hollywood. No sport excited the passions of Mexican boys (and men) more than boxing. “Two or three Mexicans have become famous boxers and gotten rich, like Colima, Fuente, and the like,” a Los Angeles playground director noted with only slight exaggeration in 1926. “Nearly every Mexican boy has the ambition to be a great boxer. This is the main thing that he thinks about until he gets married and has to go to work digging ditches or working for the railroad” (quoted in Monroy 1999). This affinity for boxing, like the sport itself, is much more complicated than is immediately apparent. Although the winner of the match can celebrate his might, many youthful fighters proved their fortitude in a losing cause by being able to take a punch and by enduring beatings. Boxing taught Mexican boys certain lessons about what it meant to be a man: to associate physical prowess and fortitude (in victory or defeat) with male virtue and character. Few stories are more interesting than that of baseball and Latin American boys. U.S. mining companies and Marines, sent by the U.S. government to intervene in various Latin American countries, spread the sport to the Dominican Republic, Cuba, Nicaragua, Venezuela, and
Mexico. In the late nineteenth century, the Latin American upper middle class saw playing baseball as a way to emulate the cricket-playing English gentry, but by the early twentieth century baseball had become popular among all classes of Mexican boys in the mining camps of southern Arizona and New Mexico and in Los Angeles. Informally, in their schools and in clubs that local businesses sponsored, Mexican boys in the United States rounded the bases and shouted encouragement to one another in Spanish. It is of no little significance that at the same time Mexican boys became active in such pursuits, the police in urban centers such as Los Angeles began to connect them with criminality. The professionalization of police departments racialized crime. In Los Angeles, for example, police targeted many of the undertakings of Mexican boys for arrest. “Juvenile delinquency” increasingly concerned police and school authorities, though the activities of Mexican male youths differed little from those of other groups. Behaving with general rambunctiousness, drinking alcohol, and loitering caused most of the trouble. Here police presuppositions about Mexican boys that persist to this day and police actions toward the boys created an adversarial relationship that was in large part responsible for the creation of Mexican youth gangs. The experiences of rural, northern New Mexican boys were very different. Children of “forgotten people,” as George I. Sanchez called them in his path-breaking book (1940) of the same name, they lived in depressed communities where the land base continually shrank, public education was dull and unresponsive to boys’ needs, and traditional artisanal skills deteriorated. New Deal programs in concert with the Spanish colonial arts movement and
inspired by the exciting innovations in postrevolutionary Mexican education associated with the great philosopher and Minister of Education José Vasconcelos (1882–1959) sought to reinvigorate Latino New Mexican life by expanding education and stimulating the production of traditional crafts. The Federal Emergency Relief Administration (May 1933), along with the Civilian Conservation Corps and the Works Progress Administration (WPA), built schools and recreational facilities for Latino youth. By 1939 the WPA had instituted a hot lunch program that delivered one nourishing meal per day to kids who otherwise might have gone hungry. The WPA also initiated other programs to instruct parents in hygiene and nutrition. The National Youth Administration (NYA) gave boys jobs on highway and street improvement projects and instructed them in handicrafts. NYA residence camps took youths out of rural villages to Albuquerque and Las Vegas, New Mexico, where they worked in construction, and Latino NYA workers built hundreds of recreational facilities throughout New Mexico. Boys were exposed to 4-H club work and sports, as well as to folk music, dance, and dramas, and were taught such ancestral arts and crafts as furniture and religious figurine production. Such federal efforts assumed, rightly or wrongly, that boys in New Mexico needed outside intervention to keep them and their families from despondency. Such an attitude inevitably kindled resentment on the part of local adults. The patronizing nature of these programs, their relative effectiveness, and their degree of success are subjects of debate. Nonetheless, these New Deal programs (terminated in 1943) brought relief for many from the poverty, isolation, and monotony of boyhood in rural New Mexico.
One is tempted to position the history of Mexican boys in the United States within the older paradigm of assimilation. But it would be a mistake to conceptualize such a history simply as the process by which Mexican Americans were created. Rather we see here the process of cultural syncretization (mestizaje in Spanish) by which a regional or transborder culture was created. In some ways integrated into American culture and politics via Hollywood and the New Deal, in some ways rebuffed by the police who associated being Mexican with criminality, sometimes encouraged to maintain traditional ways in the service of arts and crafts but at other times pressured to deny Mexican ways in favor of Americanization, Mexican boys forged in their barrios and schools, car and sports clubs, pachuco groups and Texas palomillas (cliques) a new Mexican way of being on the landscape of the American Southwest.

Douglas Monroy

See also California Missions

References and further reading
Escobar, Edward J. 1999. Race, Police, and the Making of a Political Identity: Relations between Chicanos and the Los Angeles Police Department, 1900–1945. Berkeley: University of California Press.
Forrest, Suzanne. 1998. The Preservation of the Village: New Mexico's Hispanics and the New Deal. Albuquerque: University of New Mexico Press.
Galarza, Ernesto. 1971. Barrio Boy. Notre Dame: University of Notre Dame Press.
Gonzalez, Gilbert G. 1990. Chicano Education in the Era of Segregation. Philadelphia: Balch Institute Press.
Gould, Stephen Jay. 1981. The Mismeasure of Man. New York: W. W. Norton.
Monroy, Douglas. 1999. Rebirth: Mexican Los Angeles from the Great Migration to the Great Depression. Berkeley: University of California Press.
Sanchez, George I. 1940. Forgotten People: A Study of New Mexicans. Albuquerque: University of New Mexico Press.
Military Schools
[Illustration: Four young boys play games in the dormitory during their spare time at the Black Foxe Military Academy in Hollywood, California, 1933 (Bettmann/Corbis)]
Many military schools were founded in the nineteenth century, usually at the secondary level but also as elementary schools and colleges. In the twentieth century some military schools opened their doors to girls, but the idea persists that military schooling is better suited for educating boys. Today military schools generally fall into two broad categories: private, single-sex and coeducational institutions (both boarding and day) focusing on college preparation and secondary, coeducational public day schools aimed at inner-city children with behavioral problems. The origin of the military school movement begins with the foundation of the U.S. Military Academy in 1802. Many Americans in the late eighteenth and early nineteenth centuries were impressed by arguments urging the establishment of academies to train engineers and artillerists. Following the establishment of the U.S. Military Academy, a movement to found other military schools spread slowly across the country. This took various forms, causing the foundation of military, state, private, and denominational schools and colleges. After 1825, however, those promoting new military schools shifted the basis of their case. They no longer stressed the need to prepare military men to defend the southern and western states against the Spanish, the British, or the Indians, or the pressing need for more trained engineers. They argued instead that military schooling provided young men with a unique training, preparing
them to become citizen-soldiers, not professional soldiers. The chief advocate of military schooling was Captain Alden Partridge, founder of the American Literary, Scientific and Military Academy at Norwich, Vermont. Partridge's notions concerning the educational value of military training became the basis for the American military school movement of the nineteenth century. From 1860 to 1900, some sixty-five military schools were established throughout the United States, or military departments were added to already existing schools. Partridge assumed, as did his followers, that youth educated in the military fashion would become leaders of the country. At the core of Partridge's thinking was the notion of the citizen-soldier who could perform military service in time of national need. The rationale for promoting military education was increasingly the idea that military training brought out "manly, noble and independent sentiments" of young men (Webb 1958, 186–187). The passage of the Morrill Act of 1866 and the subsequent establishment of military studies in many schools and colleges were a significant victory for Alden Partridge's ideas. His notion of encouraging young men to learn the "arts of war" in order to serve as citizen-soldiers was accepted by the mid-nineteenth century. Toward the end of that century, however, proponents of military schooling once again shifted their rationale. They based their case not merely on the argument that such education provided the nation with potential citizen-soldiers but also on the claim that military schooling ought to play a unique role in shaping the character of the adolescent. This view was a further transformation of perceptions about military schooling, appealing as it did to an age increasingly concerned about the
peculiar needs of the adolescent. This movement gained strength when some states and territories actively encouraged military training programs in their common school programs. The most notable of these flourished in California, New Mexico, Massachusetts, and Arizona. Throughout the nineteenth century, military schools and colleges flourished in the South; some were state institutions, such as the Virginia Military Institute and The Citadel in South Carolina. During the latter part of the nineteenth century the military school movement
expanded further with the foundation of private and denominational military schools throughout the country. A number of such institutions developed in the South before and after the Civil War; some thirty military schools were established in Virginia alone between 1840 and 1860. Following the Civil War, Virginia again led the field with twenty military schools. Other southern states accounted for an additional 111 military schools: thirteen in Alabama; twelve in Georgia; nine in Kentucky; eighteen in North Carolina; five in South Carolina;
fourteen in Tennessee and Texas; and from one to three each in Mississippi, Florida, and Arkansas. This movement also grew in other parts of the United States, with 106 schools founded in New York, New Jersey, Ohio, Pennsylvania, California, Connecticut, Illinois, Massachusetts, New Hampshire, Wisconsin, Utah, Washington, Delaware, Vermont, Indiana, and West Virginia. The Morrill Act of 1866 aided these schools because the bill authorized the War Department to detail active or retired army officers as professors of military science and tactics to secondary schools. The movement received added support when many religious denominations established schools on the military model. During the second half of the century, the Episcopal Church founded fifteen such schools; the Methodists and Baptists, thirteen schools each; the Roman Catholics, fourteen; the Presbyterians, seven; and various other denominations, including the Quakers, eight. Significantly, the rationale for using military training in these schools was not to prepare students to become soldiers or even to serve as citizen soldiers in case of war. Military education in these schools was essentially tied to character formation. Ancillary to that idea were notions linking military training to the physical health and well-being of adolescent boys. The link between military schooling and character formation received added impetus from other powerful cultural and social forces in late-nineteenth-century and early-twentieth-century America. During those years many Americans expressed concern over what they perceived as growing softness and lack of discipline among the young, particularly among adolescent boys. Leaders of such
movements as the Boy Scouts and the Young Men’s Christian Association (YMCA), for instance, focused on the adolescent boy, concerned that boys might not develop into manly men. There was a growing concern among many educators that modern life somehow threatened the morals, masculinity, and potential adult success of boys. Underlying all this, paradoxically, was fear that adults might also lose control of adolescent boys. Military schools saw themselves as playing a key role in developing the character of American boys. Boys were viewed on the one hand as being potentially wild and unruly, but on the other hand there was growing concern that boys would grow up soft and untested, surrounded by too much female influence. Military schools could remedy that problem; the military would develop a boy’s character, curb his wild impulses, and channel them into more constructive lines. Throughout the rest of the twentieth century, military schools would address the problem of character education, but the understanding of that role would change dramatically as the century unfolded. A decade later, in the midst of the Progressive era, intellectual fashions had changed: no longer was there concern to curb the potential wildness of the boy. Rather, many educators were concerned with the notion of social efficiency—the idea that individuals should “perform all of their functions efficiently and in a manner that would serve the state” (Church and Sedlack 1976, 310). Many military school leaders argued for a basic compatibility between military education and progressive values, specifically by cultivating in boys the spirit of democracy and the ideal of service to the community. Military school publicists
also tried to distance themselves from militarism by making the paradoxical claim that the boys' experience of military life might give them "some small conception of what the horrors of war might be" (Gignilliat 1916, 81–82). In the 1930s intellectual fashions had once again changed: parents and teachers grew concerned with developing the individuality of each child while not neglecting the cultivation of his social responsibility to the larger group. Many military schools offered their patrons an amalgam of the military, educational psychology, and the testing movement popular in that period as a means to develop a boy's individuality. The latest techniques of educational psychology were, in effect, wedded with drill to form the character of the adolescent. For example, upon entering Culver Military Academy, "boys were given elaborate tests covering every facet of their character and intelligence. They were rated in scholastic aptitude, I.Q., and personality traits, such as self-sufficiency, dominance and introspection. This data was interpreted by counselors trained in the latest techniques of educational psychology who used the military framework to develop the individuality of the cadet to its fullest" (Davies 1983, 281). In the mid-1940s, following the war, the military came to be valued for its ability to produce leaders in business and the professions. Stress was placed in many military schools upon developing the skills necessary in becoming a business or professional leader. Military schools prided themselves on teaching boys the skills of leadership, namely, "the process of taking control in various types of situations" (McKinney n.d., 7). Leadership was viewed, somehow, as being cultivated independently from the
other moral qualities that schools sought to instill in boys, such as courage, independence, trustworthiness, a sense of honor, fair play, and school spirit. Beginning in the 1950s and intensifying in the 1960s and 1970s, public perceptions of both the military and military schooling changed. Protests against the Vietnam War combined with the antiestablishment feeling of the late 1960s and early 1970s to put advocates of military schools on the defensive. Doubts about the value of the military arose even in military schools. What had once been regarded as a key to the development of maturity in the adolescent—the military—was often viewed as retarding it, taking time away from more valuable pursuits that developed the mind and spirit of the boy. The military increasingly came to be viewed by the general public as too inhibiting and too rigid to produce the bright, innovative leaders society needed. In the late twentieth century, it was believed, access to the new elite status depended upon knowledge and expertise. It was the classroom, not the drill field, that held the key to both individual success and the social good. Military schools had long been in decline: in 1926 the Handbook of Private Schools listed eighty military schools, and by 1966 only thirty remained. By 1976 the number of military schools remained at about the same level, but enrollment in many of these schools had dropped alarmingly. The ferocious antimilitary feeling on the part of the public and especially the young slowly faded. Public perception of the military became more positive in the 1980s and 1990s, especially following the Gulf War, but this rise in the fortunes of the military was not accompanied by a similar change in public
attitudes toward military schooling. Military schools continued to be perceived as on the fringes of American education. Many schools either closed their doors, like Staunton Military Academy in Virginia, or rebuilt themselves as civilian institutions, like the Harvard School in California, formerly known as the Harvard Military School. In 2000 there were thirty-two boarding military prep schools, five military day schools, and two elementary schools with “a military component.” More recently, some urban public school systems have turned to military schools as the organizing theme of new schools. The Chicago Public Schools recently opened a Junior Reserve Officer Training Corps (JROTC) school, the Chicago Military Academy, Bronzeville. The school is located in an armory in a historic African American community in the city. Although this was Chicago’s first public military high school, it was not the first such school in the nation. Franklin Military Academy had previously opened its doors in 1980 in Richmond, Virginia. A Naval ROTC High School in St. Louis also predates Bronzeville. The opening of schools like these may indicate that military schooling is still viewed by some educators as filling a vital need in the education of young people. A common theme among teachers at schools like Bronzeville is this sentiment: “Students want structure and discipline. We give them that” (“Special Report” 2000, 2). If this movement continues, it will be yet another reformulation of the special ability of military schools to address the character formation of American youth. One of the most common, if problematic, images in American popular culture is the notion that military schools deal successfully with unruly boys. Writers and filmmakers have often used
military schools to illustrate the best and the worst aspects of the human psyche. Films in this genre run from the dark look at military schooling in Taps (1981) to the irreverent but oddly positive portrayal of the military in Major Payne (1995). An old cliché further illustrates the problematic character of military schooling: a local judge hectors a defiant and unrepentant boy standing before his bench: “Son, you have a choice—go to Nonesuch Military Academy or go to reform school.” These stereotypical ideas linking military schools with troubled boys continue to plague such institutions to the present day. A less favorable image of military schools developed as strict military-type boot camps were established for drug rehabilitation. Such programs were touted as successful in the 1990s but have waned in popularity because their results have not been shown to last. In the twenty-first century, most military schools see themselves as institutions that first and foremost prepare their students for college. The military is viewed as playing a separate but significant role in developing leadership skills and in shaping the character of the students. There is evidence of a revival of interest in what was once the central mission of military schools—forming the character of moral citizens and leaders of a democracy, not addressing the behavioral needs of troubled adolescents. Some institutions, like Culver Military Academy and its sister institution, Culver Girls Academy, have placed character education back at the center of their mission. The two schools share common academic and athletic facilities while maintaining separate leadership programs for young men and women. The school’s mission statement boldly proclaims that the insti-
tution “educates its students for leadership and responsible citizenship in society by developing and nurturing the whole individual—mind, spirit and body—through an integrated curriculum that emphasizes the cultivation of character” (“School Goals” 2000). Central to the development of character is education “in the classical virtues of wisdom, courage, moderation and justice” (“School Goals” 2000). If Culver and other military schools are successful in placing leadership development and character formation on a par with college preparation, this may breathe new life into the military school movement. It remains to be seen whether military schools will survive this century. Military schooling has its avid proponents, but the movement remains on the fringe of education in this country. Paradoxically, the success of the military in regaining its positive image with the public may not be helpful for military schools. The military is viewed in America as a professional and specialized force—not as a system to be used to develop the character of mainstream Americans. Ironically, the opening of schools in inner-city urban areas may reinforce a double stereotype: that military schooling is for adolescents with behavioral needs—not ordinary children with ordinary needs—and that inner-city children are somehow different from other children. To survive and thrive in the twenty-first century, military schools will have to market themselves as institutions that prepare any young person to be a moral citizen and active leader. Richard G. Davies References and further reading Avery, Gillian. 1975. Childhood’s Pattern: A Study of Heroes and Heroines of Children’s Fiction, 1750–1950. London: Hodder and Stoughton.
Baird, Leonard L. 1977. The Schools: A Profile of Prestigious Independent Schools. Lexington, MA: D. C. Heath. Church, Robert, and Michael W. Sedlak. 1976. Education in the United States: An Interpretive History. New York: Free Press. Cunliffe, Marcus. 1968. Soldiers and Civilians: The Martial Spirit in America, 1775–1865. Boston: Little, Brown. Davies, Richard G. 1983. “Of Arms and the Boy: A History of Culver Military Academy, 1894–1945.” Ph.D. diss., School of Education, Indiana University. Ekirch, Arthur A. 1956. The Civilian and the Military. New York: Oxford University Press. Gignilliat, Leigh R. 1916. Arms and the Boy: Military Training in Schools. Indianapolis: Bobbs-Merrill. Handbook of Private Schools, The. 1926. 17th ed. Boston: Porter Sargent. Kett, Joseph. 1977. Rites of Passage: Adolescence in America, 1790 to the Present. New York: Basic Books. Kraushaar, Otto. 1972. American Nonpublic Schools: Patterns of Diversity. Baltimore: Johns Hopkins University Press. Macleod, David I. 1983. Building Character in the American Boy: The Boy Scouts, YMCA and Their Forerunners. Madison: University of Wisconsin Press. McKinney, C. F. N.d. “A Discussion of Leadership.” Culver Military Academy, 7. Napier, John Hawkins III, ed. 1989. “Military Schools.” In Encyclopedia of Southern Culture. Vol. 1, Agriculture–Environment. New York: Anchor Books/Doubleday. “School Goals: Draft.” 2000. Culver Academies, October 27. “Special Report.” 2000. Available online at http://www.ausa.org/ausnews/items/chicagojan00.htm (accessed October 20). Webb, Lester Austin. 1958. “The Origins of Military Schools in the United States Founded in the Nineteenth Century.” Ph.D. diss., School of Education, University of North Carolina.
Money
See Allowances
Mothers
Throughout American history, a boy’s relationship with his mother has been central to his physical, social, and emotional development. However, what it means to mother a boy has changed significantly over time. In the seventeenth and eighteenth centuries, mothers cared for boys when young, but preparing a boy for manhood was considered the father’s responsibility. In the nineteenth century, as a result of political and economic changes, good mothering became key to preparing a boy to succeed; in turn, a boy was the key to his mother’s future happiness. In the twentieth century, the mother-son relationship took on new, often negative overtones as developments in psychology and sociology increasingly blamed the mother for whatever might trouble the son. During the American colonial era (1607–1776), the family home was also the location of the family business, and both parents engaged in hands-on childrearing. However, this society was patriarchal, believing that men, who were assumed to be stronger morally as well as physically, should be dominant. A colonial mother’s control over her children, especially her boys, was thus limited. A mother was responsible for caring for the children from their birth through the first five years. Busy colonial mothers integrated child care into their daily work routines. Mothers were also expected to teach their children the fundamentals of reading and writing, assuming they were themselves literate. After about age five, however, girls and boys followed different paths to adulthood. Girls usually remained under the tutelage of their mothers so that they might master the often complicated skills of housewifery and child care essential to successful woman-
hood. Boys, however, would be expected to assume more varied responsibilities as adult men. They were thus more likely than their sisters to receive an education if the family could afford it. But like their sisters, boys learned to be adults mainly by following the example of their appropriate parental role model. To learn to be a man in colonial America, boys looked to their fathers. A farmer’s son learned to farm by working alongside his father in the fields and barns; a merchant’s son learned the business by clerking in his father’s store. Since manhood often demanded public service, boys also learned about politics by listening to and observing older men. Thus, in colonial America, it was the father, not the mother, who played the more prominent role in preparing boys for adulthood. Although colonial American mothers were often well loved and respected by their children, they were not expected to be able to provide much guidance to their adult sons. By the end of the eighteenth century, as Americans fought for and gained independence, they sought to ensure that the citizens of the new United States would prove worthy of the unprecedented degree of political participation expected of most adult white men. To sustain their democratic revolution, American men would have to be not just citizens but virtuous citizens, who put the interests of the nation ahead of their own, private concerns. With so much at stake, it was vital that boys, as future U.S. citizens, be brought up to understand their moral duty. At the same time, a new theory of epistemology—a way of understanding how humans learn—associated with British philosopher John Locke (1632–1704) became popular. Rejecting religious arguments that humans were born with their
moral characters already formed, Locke compared the human mind instead to a tabula rasa (Latin for “blank slate”). All of life’s experiences, but especially the lessons learned in early childhood, wrote upon the blank slate to create an individual’s character. And because mothers spent the most time with the very young, their role acquired new status. By the late eighteenth and early nineteenth centuries, Americans increasingly insisted that a boy’s mother, rather than his father, held the key to his character and thus, ultimately, to the character of the new nation. Through a good mother’s efforts, the future of the United States as a republic of virtuous citizens might be secured. Further supporting this new appreciation of mothers were far-reaching economic changes. By the early nineteenth century, the American economy was rapidly moving away from a focus upon producing for subsistence or local trade in favor of producing for regional, national, and international markets. In addition, westward migration dramatically expanded the size of the country, and the growth of trade and the beginnings of industrialization spurred the rise of cities. Known collectively as the “market revolution,” these changes generated unprecedented economic opportunities. But the market revolution also affected relations within the family with the development of a new, urban middle class. This group differed from earlier generations because home and work no longer overlapped. Instead, middle-class men left the house to go to work during the day, while middle-class women remained at home with the children. American mothers replaced fathers as the parent in charge of a middle-class boy’s development at a time when preparing for adulthood became more de-
A Native American mother with her baby boy, Winnebago, Wisconsin, early twentieth century (Library of Congress)
manding than ever. The opportunities unleashed by the market revolution held out the promise of significant upward mobility, but only to those who were equipped to take advantage of them. Anxious to prepare their sons to compete in the new market economy, middle-class families employed a number of strategies to benefit their children. Husbands and wives increasingly used birth control in order to concentrate the family’s resources on fewer children. With their energies now focused on being good mothers, middle-class women formed maternal associations to learn effective parenting techniques from more experienced mothers
An African American mother picks up her son at a day care center. (Shirley Zeiberg)
and sought expert advice from books, such as The Mother at Home, which was a best-seller in the 1830s. These mothers also insisted that their daughters be well educated, leading to a surge in schools for middle-class girls and young women. Mothers also tried to steel their sons to resist the inevitable temptations of adulthood by overseeing their moral education at home and in church. Indeed, where colonial Americans had assumed that men had greater moral strength than women, nineteenth-century Americans reversed this: a good mother was now the family’s conscience and the exemplar of morality. Thus trained by his mother, a middle-class boy might take advantage of the opportunities newly available to him. American newspapers, magazines, ser-
mons, political speeches, and even jokes all sent the same message: a boy’s best friend and his best hope for a promising future was his mother. But if boys had good reason to befriend their mothers, mothers also became increasingly dependent upon their sons. The developing market economy had strained traditional social welfare practices to the breaking point. This was especially problematic for women, who were assumed to be legally and economically dependent upon others, usually a husband. A woman whose husband did not or could not support her faced a future of extreme poverty. A good mother, however, was more fortunate, for she had invested in her son. A boy whose mother had sacrificed for his benefit was expected to provide for her comfort in old age. Thus in the nineteenth century the nature of the mother-son relationship changed greatly. Mothers now played a starring role in their sons’ lives by preparing them to seize the opportunities of manhood. In return, boys who had benefited from a lifetime of maternal care were expected to support their aging mothers. To be sure, not all boys and their mothers experienced these new roles and expectations. Most Americans still lived on farms, where home and work space were one and the same; here, traditional patterns continued. Social class also influenced family practice. Among elite families it was fashionable to send boys to boarding school during their formative years. In many poor families the struggle to survive was paramount, and boys were expected to contribute to the family economy from an early age. Race was also a critical factor. Mothers held in slavery faced perhaps the greatest challenges in preparing their sons for adult-
hood. When slave mothers were forcibly separated from their children, other women stepped in to act as surrogates, expanding the definition of mother beyond a blood tie. Others sought to teach their children how to survive in an abusive system, even while encouraging their growth as individuals. For these groups, the middle-class ideal did not describe their reality. Nevertheless, the notion of the at-home mother who focused on her children was widely embraced. For example, where the law had once dictated that fathers were the proper guardians of children, judges increasingly ruled in favor of mothers, particularly when young children were involved. By the end of the century, the good mother had become a cultural ideal. But if the nineteenth century at least paid lip service to honoring mothers, the twentieth century proved more suspicious. Emerging from the shadow of the good mother was her evil twin, the bad mother. Always implicit in the celebration of mothers—after all, if a good mother is responsible for her son’s success, a bad mother must be held responsible for her son’s failure—the bad mother began to move toward center stage. Indeed, what previous generations of Americans had seen as self-sacrificing maternal devotion, a new, more psychologically oriented generation viewed as “overprotective” and “stifling.” Especially after World War II, psychologists, sociologists, social workers, and guidance counselors asserted their professional authority in the field of childrearing. Often antagonistic toward the traditional authority of mothers, these groups singled out mothers as the source of family troubles. The danger was particularly acute for boys, whose masculinity seemed to be at stake. Daughters
would become women by modeling their behavior after their mothers, but sons must separate psychologically from the mother in order to achieve manhood. If a mother impeded this separation by being overly involved in her son’s life, psychologists argued, she endangered his masculinity and might even “make” him homosexual. However, if a mother was too detached from her son, psychologists accused her of endangering his emotional development as well. Mothers even took the blame for bad fathers: whether he was too authoritarian or too passive, a man’s failure as a parent was usually traced back to his wife. By the late twentieth century, American attitudes toward mothers constituted a series of no-win situations. If a mother worked outside the home, some considered her to be a bad mother who was insufficiently invested in her children. If a mother did not work outside the home, others considered her to be a bad mother who was overly invested in her children. The twentieth-century suspicion cast on mothers, especially the mothers of sons, hit a low point in the 1980s and 1990s. Demands for welfare reform blamed poor single mothers, particularly if they were women of color, for causing urban crime by failing to provide positive male role models for their sons. Female-headed families, which had once been poor but respectable, were now considered inherently pathological and the mothers incapable of guiding their sons into manhood. Carolyn J. Lawes See also Fathers; Siblings References and further reading Coontz, Stephanie. 1988. The Social Origins of Private Life: A History of American Families, 1600–1900. New York: Verso.
Grossberg, Michael. 1985. Governing the Hearth: Law and the Family in Nineteenth-Century America. Chapel Hill: University of North Carolina Press. Kerber, Linda K. 1980. Women of the Republic: Intellect and Ideology in Revolutionary America. Chapel Hill: University of North Carolina Press. Ladd-Taylor, Molly, and Lauri Umansky, eds. 1998. “Bad” Mothers: The Politics of Blame in Twentieth-Century America. New York: New York University Press. Lawes, Carolyn J. “Capitalizing on Mother: John S. C. Abbott and Self-Interested Motherhood.” Proceedings of the American Antiquarian Society 108, pt. 2: 343–395. Ryan, Mary P. 1981. Cradle of the Middle Class: The Family in Oneida County, New York, 1790–1865. New York: Cambridge University Press. ———. 1982. The Empire of the Mother: Americans Writing about Domesticity, 1830 to 1860. New York: Institute for Research in History and Haworth Press.
Movies
See Films
Muscular Christianity
As a response to perceptions of both immorality and escalating effeminacy among American boys, late-nineteenth- and early-twentieth-century American Protestants embarked on an ambitious project to enhance the masculine tenor of Christianity. Drawing on British literary and organizational precedents, the movement for muscular Christianity represented an attempt to articulate and demonstrate the essential compatibility between Christian faith and virile, masculine expression. Advocates of this mission were convinced that American churches, dominated by female presence, were equally dominated by a feminine ethos incommensurate with boys’ na-
ture. In such a religious milieu, they reasoned, boys would either reject religious influence, thus creating an opening for immorality, or embrace religion and risk an enfeebled transition from boyhood to manhood. Only a muscular Christianity could attract boys to the religious life, prepare them for manly service, and channel boys’ instincts in positive ways. Rejecting the cultural identification of religion with the feminine private sphere and ideals of passive piety and self-restraint, self-proclaimed muscular Christians constructed a subculture that emphasized the athletic, militant, and businesslike components of Christianity to reflect the needs and interests of men and boys. The development of such a faith, rooted in the overarching model of the “masculine Christ,” was far more than a simple attempt to redress gender imbalances in the institutional church. Through such organizations as the Young Men’s Christian Association (YMCA), United Boys’ Brigades of America, Men and Religion Forward bands, and a host of medieval clubs, proponents of muscular Christianity sought to provide for the efficient masculine socialization of American boys within the safe confines of the Christian faith. Although the predominance of females in Protestant churches had been a consistent demographic pattern since the 1660s, there was a growing sense of urgency regarding these trends by the turn of the twentieth century. Between 1880 and 1920, religious leaders, educators, and social critics spoke of a pervasive “boy problem” in American society. On the one hand, many were troubled by the perceived delinquency of the American boy. Anchored by recapitulation theories of boy development suggesting that young males repeated in their own bio-
logical maturation process the history of the human race, many of those who worked with boys were convinced that the male instincts of savagery, wanderlust, and gang spirit might lead boys to inevitable immorality. If natural social processes and institutions were unable to channel inherited impulses in socially fruitful ways, they argued, boys would continually be perched on the edge of a biologically engendered moral precipice. Such fears, coupled with statistics delineating the growing number of male adolescents (particularly working-class and immigrant youth) committed to reform schools and juvenile detention centers, piqued public concern. The Reader’s Guide to Periodical Literature, which listed only thirteen articles on boyhood juvenile delinquency in the last decade of the nineteenth century, included more than 200 such citations in this field between 1900 and 1910. Although boys’ instincts could be directed for both good and evil, the writers of these articles contended, the failure of educational and religious institutions to provide a necessary outlet for boys’ nature meant that these dangerous proclivities would be exercised on the “school of the street.” On the other hand, social critics also used alarmist rhetoric to complain about the improper masculine socialization of American boys. If many were concerned with the inappropriate expression of boyhood instincts, many others were equally disturbed by the apparent blunting of these instincts altogether. Whether couched in the language of “degeneracy,” “effeminacy,” “overcivilization,” or “overrefinement,” the burden of this critique was unchanging. Because of pervasive social changes and institutional failures, boys were failing to develop the robust, masculine, self-assertive forms of
faith and moral goodness that characterized true manhood. Directed chiefly at the urban middle and upper classes, the condemnation of boyhood flaccidity was rooted in a sense that the natural cultural dynamics reinforcing masculine development were experiencing comprehensive decline. The oft-cited closing of the American frontier had supposedly blunted the self-assertive wanderlust impulse, diminishing the need for bodily strength and courage while generating a sense of enervating confinement. More important, shifting economic realities in the fin-de-siècle United States complicated the development of masculine independence while also positioning boys within increasingly feminine settings. This concern certainly included the lack of physical exertion awaiting boys in sedentary, white-collar professions, but the critique was more inclusive. In a broader sense, men were losing the masculine initiative and independence that had blossomed under the rubric of entrepreneurial capitalism. Corporate capitalism, by contrast, seemed to constrain manly self-assertion within webs of corporate bureaucratic norms. This new economic paradigm valued not the innovative entrepreneur but rather the other-directed team player who would fulfill his proper role in the larger corporate structure. With fewer men either owning their own farms and small businesses or possessing firsthand contact with the products of their labor, the necessary perception of individual potency was greatly curtailed. When combined with the rise of the “new woman,” a highly educated competitor in the white-collar world, and the growing presence of muscular immigrants, these economic trends presaged a general demise of the male middle class. For boys, these trends meant that proper
masculine development would require formal and purposeful activity in other domains. In the midst of this economic transition, the home, the locus of the boys’ upbringing, was increasingly separated from the world of manly exertion. Fathers were physically absent from the home, leaving mothers with the dubious task of promoting masculine socialization. In addition, as public and private spheres were progressively separated, the private, feminine, and consumption-oriented values of the home were contrasted with the public, assertive, and production-oriented values of the work world. Boys growing up within the home were therefore trained within an environment increasingly defined as “feminine” in nature. Many were hopeful that the public school could bridge the gap between the home and the world of work for the boy. It soon became clear, however, that the school was itself a primary component of the problem. Schools were dominated by feminine influence, both in personnel and in the style and content of teaching and learning. Female teachers constituted 59 percent of all teachers in 1870, but that number had escalated to 86 percent by 1920. Male teachers were typically described as weak and effeminate as well, members of a profession that allowed them to avoid the more demanding exigencies of the public sphere. In addition, the “bookish” curriculum and passive learning styles characteristic of schools were deemed incommensurate with the active, assertive nature of boys. As a variety of economic factors began to direct more and more boys into the high schools, social critics were clear in asserting that the youth were moving from the masculine to the feminine sphere
during the most critical phase of male socialization—adolescence. Yet despite these varied laments, it was the church that received unequivocal criticism with regard to the problem. A number of experts on boys and religious leaders remarked that the church was losing boys because of its inability to appeal to boyhood proclivities. Effeminate clergy, linked to women by virtue of their profession, were deemed unworthy to serve as heroic examples for growing boys. Like the public school, Sunday schools suffered from the feminine influence of teachers and passive book learning. In light of these factors, it was not surprising to educators and youth leaders that male Sunday school attendance dropped precipitously as boys moved into their adolescent years. Yet because adolescence was increasingly designated as the ideal incubation period for boyhood conversion experiences, these statistics were of major import. Blame was placed squarely on the program of the Sunday school itself. Although boys were “naturally religious,” the religion of boyhood, characterized by practical, businesslike, and heroic fervor, was wholly absent from these gatherings. Whether the perception was correct or not, many boys seemed to feel that the development of Christian faith was a threat to masculine development. It was this perception that muscular Christians were out to disprove. Of course, sponsors of muscular Christianity boldly proclaimed that institutional commitment to masculine forms of Christian expression would solve both aspects of the problem simultaneously, protecting boys from vice by channeling virile instincts and preparing boys for masculine service to society. Muscular Christians noted repeatedly that their recommendations were linked closely to the
rediscovery of the “manly Christ,” a worthy exemplar for boys to follow. Following the lead of G. Stanley Hall, who spoke vehemently against typical written and pictorial representations of Christ, muscular Christians rejected the emphasis on the passive, peaceful, and otherworldly Jesus of Sunday school lore. By contrast, they pointed out that the Jesus described in the Bible was a muscular carpenter with a strong physique, honed through his rugged and nomadic lifestyle. Far from a monolithic “prince of peace,” Jesus fought courageously for personal and social righteousness against the forces of evil. Muscular Christians frequently suggested that Jesus possessed a strong business mind, training men to carry out a successful mission through his powerful leadership skills and personal magnetism. By 1925, it was therefore not unusual to see Bruce Barton, in The Man Nobody Knows, speak of Jesus as a burly carpenter who was alluring to women and a popular dinner guest, possessing a keen business and organizational acumen. A near-perfect embodiment of the Rooseveltian “strenuous life,” Jesus was a model boys could emulate. Following this example, muscular Christians urged church leaders to adopt a “Boy Scout model” for the Sunday school, providing virile male leaders, a more practical orientation, and regimented appeals to commitment through gang loyalty and oaths of allegiance to Christian ideals. Yet even though the attempt was made to transform the church and Sunday school along these lines, the ideals of muscular Christianity were perhaps most efficiently diffused through a proliferation of Christian youth organizations for boys. Each of these club-based associations sought, in its own unique way, to encourage the formation of a masculine Chris-
tianity among growing boys. The YMCA, which by the late nineteenth century was growing increasingly interested in the urban middle classes, concentrated on the athletic elements of muscular faith, championing the manly character-building force of competitive sports. Paramilitary organizations like the United Boys’ Brigades of America, imported to the United States in 1894, attempted to utilize the military proclivities of boys for the development of a muscular Christianity. In addition, the recrudescence of medieval boys’ clubs revealed the vigilant antimodernism characteristic of these boys’ organizations. Anchored in the belief that boys were recapitulating the medieval spirit of hero worship and chivalry, the stated goals of the Knights of King Arthur and other similar clubs reflected a desire for manly expression, the emulation of masculine heroes, and a return to chivalry and noblesse oblige. The Men and Religion Forward movement of 1911–1912, though brief in duration, maintained a youth division that emphasized boys’ future role in recapturing vigorous male leadership in organized Christianity. Yet despite these differences in the focus of manly exertion, many common features characterized all the groups influenced by muscular Christian ideals. All were committed to character development as a central theme, anchored in practical deeds rather than pious discussion. The emphasis on service was pervasive, enlisting boys in campaigns for purposeful change in society. In reaction to urban overrefinement and “spectatoritis,” these groups made participatory outdoor activities commonplace, and camping became a significant staple of club life. Lusty hymn singing remained a critical component of religious expression, and yet the hymn books changed dramatically to reflect
muscular themes. Both in the revised YMCA hymnbooks and in other popular alternatives such as Manly Songs for Christian Men (1910), tunes that emphasized active and heroic service for the “manly man of Galilee” were the norm. Focused less on heaven and more on practical and martial ideals of kingdom building, popular hymns served as an important means of reinforcing muscular Christian ideals for “manly men.” Perceiving their clubs as healthy and morally invigorating expressions of the gang impulse, such agencies provided an important means of Christian socialization for American boys at this time. Here was a means of ensuring that boys’ instincts would be expressed in positive ways under direct adult supervision. In this way, leaders could guarantee for the future that Christianity would be manly and that manliness would be expressed in Christian ways. Interestingly, many educators and youth workers argued that muscular Christianity was ultimately a means to save both the boy and the worker who focused on him. For men in the white-collar world, working with boys in the YMCA or serving as scoutmaster or brigade leader was a pathway to the masculine expression that was no longer provided by the world of work. For pastors and religious leaders, boys’ work was a means of combating the crippling effeminacy of their profession. For clergy cramped within a lifestyle that sponsored enfeebled passivity and female companionship, this work would therapeutically restore a sense of the heroic potency of the Christian life. Even though the purpose of the movement for boys was always linked to the salvation and masculine socialization of the boy, experts also hoped
that boys would be the salvation of “softened” men. By the early 1920s, both the perceived urgency of the boy problem and the plea for a masculine Christianity for boys had begun to wane. Although between 1880 and 1920 many leaders spoke of the importance of boys’ instincts in shaping their behavioral proclivities, books and articles printed after 1920 paid surprisingly little attention, even in retrospect, to this previously dominant paradigm. Replacing this early-twentieth-century consensus was a proportionately greater emphasis on cultural influence over and above natural instinct, of nurture over inherited nature. In addition, the 1920s introduced a whole new array of “youth” issues, many of which revealed equal concern for girls and young women. By this time, the fear that female students were becoming “male” in both appearance and attitude seemed to quell the cries for heightened masculinity among American boys. Yet between 1880 and 1920, muscular Christianity represented a significant attempt to help boys become both men and Christians in a society where such a combination seemed increasingly tenuous. David Setran See also Boy Scouts; Camping; Parachurch Ministry; Sunday Schools; Young Men’s Christian Association References and further reading Barton, Bruce. 1925. The Man Nobody Knows: A Discovery of the Real Jesus. Indianapolis: Bobbs-Merrill. Bederman, Gail. 1989. “‘The Women Have Had Charge of the Church Work Long Enough’: The Men and Religion Forward Movement of 1911–1912 and the Masculinization of Middle-Class Protestantism.” American Quarterly 41, no. 3 (September): 432–465.
Bendroth, Margaret Lamberts. 1997. “Men, Masculinity, and Urban Revivalism: J. Wilbur Chapman’s Boston Crusade.” Journal of Presbyterian History 75, no. 4 (Winter): 235–246. Case, Carl. 1906. The Masculine in Religion. Philadelphia: American Baptist Publishing Society. Fiske, George W. 1912. Boy Life and Self-Government. New York: Association Press. Forbush, William B. 1907. The Boy Problem. 3d ed. Boston: Pilgrim Press. Hall, Donald E., ed. 1994. Muscular Christianity: Embodying the Victorian Age. Cambridge, UK: Cambridge University Press. Hoben, Allan. 1913. The Minister and the Boy: A Handbook for Churchmen Engaged in Boys’ Work. Chicago: University of Chicago Press. Kett, Joseph. 1977. Rites of Passage: Adolescence in America, 1790 to the Present. New York: Basic Books. Lears, T. J. Jackson. 1981. No Place of Grace: Anti-Modernism and the Transformation of American Culture, 1880–1920. New York: Pantheon Books. Macleod, David I. 1983. Building Character in the American Boy: The Boy Scouts, YMCA, and Their Forerunners. Madison: University of Wisconsin Press. Mangan, J. A., and James Walvin, eds. 1987. Manliness and Morality: Middle Class Masculinity in Britain and America, 1800–1940. New York: St. Martin’s Press. Merrill, Liliburn. 1908. Winning the Boy. New York: Fleming H. Revell. Putney, Clifford W. 1995. “Muscular Christianity: The Strenuous Mood in American Protestantism, 1880–1920.” Ph.D. diss., Brandeis University.
Music
For many Americans, the standard image of the musical boy is that of the sissified dandy, the kind of boy who trundles his violin to school, fastidiously protects his hands from work or dirt, and is beaten and robbed of his lunch money by his rougher fellows. Music, particularly the
highbrow music of piano instructors and dancing masters, is not frequently associated with a healthy boyhood. Popular music is another story. From the first European settlements to the present, popular music has surrounded the American boy. It has served a dual role in his life, simultaneously functional and idiosyncratic, providing him with his first introduction to official culture along with a way of expressing unofficial yearnings. For every generation of American boys, popular music may be seen as a socializing agent and a vehicle for rebellion, a mode of expression that rigidifies lines of class, race, and gender while allowing for their temporary erasure. Despite the tendency of many historians to characterize them as “puritanical” in values, Anglo-Americans of the early colonial period were surrounded by popular music. The dominant context for the European conquest of America was Elizabethan England, its models for boyhood and manhood more aligned with Shakespeare’s Falstaff and Sir Toby Belch than with the comparatively dour Winthrops, Bradfords, and Mathers. For these Elizabethan types who settled in colonial British America, singing and dancing were common practices, introduced to young boys as traditional, albeit morally troublesome, holiday and leisure pursuits. Thus, although colonial Americans made few or no distinctions between songs for adults and music for children, their music included much youthful and boyish energy, from Scots-Irish fiddle tunes to springtime revels, alehouse “merriments,” and sea chanteys picked up by young sailors from various regions of the Atlantic world. Even the early religious dissidents and separatists who may rightly be called
A clarinet lesson (Shirley Zeiberg)
Puritans had their share of desires expressed through music. Officially sanctioned Puritan music was both popular and functional, centering on the communal singing of psalms and hymns frequently set to popular tunes. These soon spread throughout the colonies. As with Spanish mission hymns found in locales to the south and west, the purpose of these songs was to introduce neophytes—children as well as Native Americans and African Americans—to the tenets of Christian faith. Young boys would repeat lined-out psalms from the Bay Psalm Book (1640) in the seventeenth century, and a host of hymns like “Northfield,” “Amazing Grace,” and “Lamentation over Boston” written by Isaac Watts, John Newton, and William Billings during the
revolutionary period. At the same time, younger Americans continued to have easy access to secular songs and dances. For Puritans and non-Puritans alike, much of this music raised problems. Some early Americans felt that all secular music provided young people with an “incitement to adultery,” whereas others held that some examples—particularly country dances “for as many as will” as opposed to “mixed” or couple dances—were harmless amusements. Along with the Bay Psalm Book, John Playford’s The English Dancing-Master (1650) remained one of the most popular books in the United States well into the nineteenth century. Despite the ubiquity of these songs, many Americans agreed that the singing of “corrupt songs” or music arising from “gross disorderly carriage” was out-of-bounds for younger boys. Still, the records of even the most staunchly Puritan regions are filled with young people—the preponderance of whom seem to have been adolescent boys and young men—who were charged with “unseasonable night meetings.” A typical case involved a youthful apprentice brought before the New Haven Colony Court in 1662. Accused of repeatedly sneaking away from his master, the boy confessed, as the court recorder put it, “‘that his maine ground of goeing away was, that he might goe where he might have more liberty, for one from Connecticut told him if he lived there he might live merrily & sing & daunce &c’” (Dexter 1919, 23–25). By the time of the American Revolution, this quest for liberty and its resultant conflict with social strictures would become a characteristic of Americans. Through this period and into the era of the early republic, admonitions against popular music as a corridor for desires remained but were increasingly muted
with the rise of a more liberal and market-oriented society. As the three young brothers who later formed the wildly popular Hutchinson Family Singers recalled, even in the 1830s their father, once a renowned fiddler in their New Hampshire village, smashed his instrument as the devil’s tool during a Baptist revival. Thus the brothers were forced to buy violins on the sly, practicing their chords and fingering while hiding behind a large rock on the family farm. Elsewhere, ministers and pamphleteers railed against popular tunes and defended hymnody, one typical example from 1833 declaring: “Many a young man has commenced his downward course by yielding to the influence of festive songs and sportive glees” (Lucas 1833). Still, the number of these warnings, combined with their shrillness, suggests that they were fighting a lost cause. The turn of the nineteenth century witnessed the development of a truly popular American music, along with increasingly clear distinctions between adult and children’s songs. Disconnected from their original meanings, their frequently bawdy lyrics cleaned up or rewritten entirely, traditional English popular tunes such as “Three Blind Mice,” “John Barleycorn,” and “A Frog He Would a-Wooing Go” became children’s songs. Others originated as broadside ballads yet quickly became integrated into the widening education system of the Jacksonian and antebellum period, where they were included as didactic exercises in early public school readers. Many of these seem to have been expressly designed to initiate young boys into official ideals of patriotism and national or regional identity. Thus boys of the period learned and sang endless classroom versions of “Yankee Doodle,” “America, Commerce, and
Freedom,” “The Jolly Tar,” and “The Indian’s Lament,” learning that Yankees were more liberal and entrepreneurial in values than their stiffly aristocratic English ancestors, that hard work was ennobling and healthful, and that the new nation’s many Native American peoples were noble but doomed to an apparently “natural” extinction. If these songs were ardently didactic in content, by the 1830s and 1840s another more rebellious music had captured the attention of many American boys. It was blackface minstrelsy. Blackface minstrelsy may be defined as white singers and actors, almost always young men, performing what they and their audiences perceived as authentic yet comical and exaggerated versions of African American song, dance, and speech. Although its origins may be traced to the eighteenth century, blackface received its modern form during the democratic ferment of the Jacksonian era. During this period, a growing host of young men began performing in the genre’s standard trappings: donning striped frock coat and white gloves, applying burnt cork or black greasepaint to darken their faces, and speaking or singing in spurious versions of African American dialect. Many historians and musicologists have identified this music as racist. Certainly, its imagery is filled with stereotypes: malapropisms, tortured diction, happy slaves longing for the old plantation, and northern black dandies whose efforts at gentility are exposed as “putting on airs.” At the same time, these same scholars have linked blackface with “genuine” African American music or with the authentic folk expressions of an early American working class. Thus they have muted their own charges, making the very stereotypes their critique has
identified seem natural and real. In actual fact, it takes practically no musical training to discern that the standard music of blackface was primarily a collection of Irish jigs, Scottish reels, and English sentimental songs. In addition, if the origins of blackface were working-class, the genre quickly passed into the middle classes with the rise of more commercial songwriters and promoters such as Stephen Foster and Edwin Christy. As early as the 1840s, one finds blackface tunes with new lyrics in the service of middle-class reform, providing sing-alongs for the meetings of widows’ and orphans’ associations, temperance unions, and even abolitionist societies. In addition, if the music of blackface was racist, it reveals that racism itself has a history. For this was a racism composed of attraction, not revulsion, one of white yearnings and desires projected onto black bodies. At the minstrel show, young white males witnessed a stylized version of “blackness” as a rebellion against mothers, fathers, and etiquette guides and as a democratic release from authority, from proscriptions for proper manhood, and even from whiteness itself. Pioneered by a rapidly proliferating number of troupes from the Virginia Minstrels of the 1840s to the New Orleans Serenaders and Christy’s Minstrels, minstrel songs like “Jump Jim Crow” (1831), “Old Dan Tucker” (1843), “Oh! Susanna” (1848), “My Old Kentucky Home” (1851), and “Dixie” (1859) soon spread throughout the nation. What these and countless other songs expressed was predictable enough, for it fell well within the contemporary boundaries of liberation. At the minstrel show, male audiences could revel in stylized versions of erotic dances, fistfights, boundless appetites, and gender and racial transgressions.
With the rise of blackface minstrelsy, the American music that would characterize boys’ rebellion reached its modern form. Aside from the slow disappearance of greasepaint, its basic dynamic would remain unchanged into and throughout the twentieth century. From Stephen Foster to Elvis Presley, from the burnt cork of the mid-nineteenth-century stage to the hip-hop affectations of the present, Anglo-American boys have enacted rebellion through a musical mask of exaggerated African American styles. Over time, this music would become the stuff of consumer society, as the producers of sheet music and later radio programs, records, and boy groups would strive to make rebellion a necessity of boyhood and a key foundation for corporate profits. Its idiosyncratic origins would also be blended with more didactic elements. Through this music of rebellion, American boys learned a variety of lessons that would keep hierarchies of class, race, and gender alive even as they apparently transgressed their boundaries: boys are rebellious, but girls are not (even though girls like rebellious boys); the characteristics of whiteness (possessive materialism, repression, and culture) are the opposite of African American characteristics (soulfulness, self-expression, and nature); and white boys are free to “slum” or “get down” with their more expressive ethnic and class opposites, while their “opposites” are locked into strict categories composed of stereotypes. And finally, through this music of standardized rebellion, generations of American boys learned that the violin-carrying schoolboy, the daring conformist, the rebel against rebellion is little more than a sissified dandy and thus deserving of a good beating. Brian Roberts
See also African American Boys; Rock Bands References and further reading Anti-Slavery Melodies: For the Friends of Freedom; Prepared by the Hingham Anti-Slavery Society. 1843. Hingham: Elijah B. Gill. Boston Temperance Songster: A Collection of Songs and Hymns for Temperance Societies, Original and Selected. 1844. Boston: William White. Cassuto, Leonard. 1997. The Inhuman Race: The Racial Grotesque in American Literature and Culture. New York: Columbia University Press. Cockrell, Dale, ed. 1989. Excelsior: Journals of the Hutchinson Family Singers, 1842–1846. New York: Pendragon Press. Dexter, Franklin Bowditch. 1919. Ancient Town Records. Vol. 2: New Haven Town Records, 1662–1684. New Haven: New Haven Colony Historical Society. Hamm, Charles. 1979. Yesterdays: Popular Song in America. New York: W. W. Norton. Hutchinson Family’s Book of Words. 1851. New York: Baker, Godwin and Co., Steam Printers. Lambert, Barbara, ed. 1980. Music in Colonial Massachusetts 1630–1820: Music in Public Places. Boston: Colonial Society of Massachusetts. Levine, Lawrence. 1988. Highbrow/Lowbrow: The Emergence of Cultural Hierarchy in America. Cambridge: Harvard University Press. Lhamon, W. T., Jr. 1998. Raising Cain: Blackface Performance from Jim Crow to Hip Hop. Cambridge: Harvard University Press. Lott, Eric. 1993. Love and Theft: Blackface Minstrelsy and the American Working Class. New York: Oxford University Press. Roediger, David R. 1991. The Wages of Whiteness: Race and the Making of the American Working Class. New York: Verso.
Rogin, Michael. 1992. “Blackface, White Noise: The Jewish Jazz Singer Finds His Voice.” Critical Inquiry 18 (Spring): 417–453. Saxton, Alexander. 1990. The Rise and Fall of the White Republic: Class Politics and Mass Culture in Nineteenth-Century America. New York: Verso. Silverman, Kenneth. 1976. A Cultural History of the American Revolution: Painting, Music, Literature, and the Theatre. New York: Thomas Y. Crowell. Southern, Eileen. 1971. The Music of Black Americans. New York: W. W. Norton. Tawa, Nicholas E. 2000. High Minded and Low Down: Music in the Lives of Americans, 1800–1861. Boston: Northeastern University Press. Toll, Robert C. 1974. Blacking Up: The Minstrel Show in Nineteenth-Century America. New York: Oxford University Press. White, Shane, and Graham J. White. 1999. Stylin’: African American Expressive Culture, from Its Beginnings to the Zoot Suit. Ithaca: Cornell University Press. Discography Brave Boys: New England Traditions in Folk Music. 1995. New World Records. Don’t Give the Name a Bad Place: Types and Stereotypes in American Musical Theater, 1870–1900. 1978. New World Records. The Early Minstrel Show. 1998. New World Records. English Country Dances: From Playford’s Dancing Master, 1651–1703. 1991. Saydisc. His Majestie’s Clerks. 1996. Goostly Psalmes: Anglo American Psalmody, 1550–1800. Harmonia Mundi. Music of the American Revolution: The Birth of Liberty. 1976. New World Records. Penny Merriment: English Songs from the Time of the Pilgrims. 1986. Plimoth Plantation.
N
Nationalism and Boyhood: The “Young America” Movement
During the middle third of the nineteenth century, the idea of boyhood became a metaphor for the growth of and pride in the young American nation. Based on the American romantic nationalist trend at the time and inspired in part by similar romantic nationalist movements in Europe, the term Young America was adopted by both art and literary critics who called for an American- rather than a European-style art and literature and by young Democratic partisans who sought to create a new vision of westward expansion and an end of sectionalism for the political party of Andrew Jackson. In the aftermath of the Civil War, the term became a humorous or sentimental symbol exploited for advertising and entertainment. The most recent scholarship argues that there were two phases of Young America. The first developed in the late 1830s and 1840s and was characterized by a romantic national political agenda of economic reform intertwined with cultural nationalism. The second emerged in the 1850s when a highly partisan Democratic factionalism focused on territorial expansion and foreign policy, turning a blind eye on slavery issues in an attempt to defuse sectional threats to the party (Widmer 1998). Ralph Waldo Emerson was responsible for coining and popularizing the name “Young America,” which was the title he chose for an address to the Boston Mercantile Library Association in February 1844. He was probably influenced by the contemporary rise of similar romantic nationalist groups in Europe—such as “Young Italy,” “Young Germany,” and “Young Ireland”—whose members provided the intellectual leadership for the explosive European democratic revolutionary movements of 1848. In the United States, the two elements of Emerson’s nationalist agenda—an American rather than a European art and literature, and westward expansion driven by innovative American railroad and communication technology—became the central issues for the rise of the Young America movement. In the next two decades, these ideas had a deep resonance in American culture (Kerrigan 1997). The label was quickly adopted by a circle of nationalist literary figures led by the critic John O’Sullivan, who regularly called for American cultural independence in his reviews in the New York–based U.S. Magazine and Democratic Review from 1841 to 1848. Literary nationalism was further promoted by the publisher Evert Duyckinck, whose Library of American Books provided an outlet for the work of such members of the Young America group as Nathaniel Hawthorne, Herman Melville, and Walt Whitman. In the arts,
Young America became the title attached to a group of genre artists, such as William Sidney Mount and Francis Edmonds, who worked through the American Art Union to promote ordinary Americans as suitable subjects for an American painting tradition. For the Young Americans, art and literature were inextricably tied to reform in American politics, which was at the time based on three things: a Jacksonian Democratic agenda of national westward expansion and land reform; the development of a simpler American jurisprudence through codification of state laws; and support for the development of railroads and other technologies enabling improved systems of distribution and mass marketing, which would tie the nation together and override sectionalism. George Henry Evans, who had been promoting land reform through his newspaper, the Workingman’s Advocate, became a key figure in the political side of the movement, retitling his pro–Van Buren Democratic paper Young America! shortly after Emerson’s speech. From its beginning, Young America involved a generational challenge. Young politicians from the newer states in the Midwest—men like Stephen A. Douglas of Illinois and George Sanders of Kentucky—criticized the old fogey ideas of the leaders of the Democratic Party in the 1850s, supported intervention on the side of foreign republican movements, and enthusiastically endorsed O’Sullivan’s coining of the phrase “Manifest Destiny” to argue for U.S. acquisition of lands reaching to the Pacific Ocean. Douglas’s failure to win the presidential nomination in 1852 and the younger politicians’ blindness to the importance of the slavery issue led to their downfall as a major political force by the end of the 1860s.
By then, however, the phrase Young America had entered into general cultural use, and its users often gave the symbolism of youth a concrete visual reality. An 1865 engraving titled Young America Crushing Rebellion and Sedition used an infant Hercules as the symbolic vision of the North's defeat of the Confederacy. Creative promoters adopted the imagery as well. The currency of the phrase is reflected in the titles of several publications, such as a Bird's Eye View of Young America: Warren County, Illinois, published by A. Ruger in Chicago in 1869, and a set of educational lantern slides illustrating the development of an American architectural tradition before 1840, titled Young America Admires the Ancients. A series of advertising promotions seized on the pictorial possibilities of Young America, such as the 1858 logo for Young America Denims. A Young America advertising card for Lilienthal's tobacco showed a young boy holding an American flag, and a post-1860 colorgraph promoted "Young America Hams and Breakfast Bacon." In 1871, M. M. Griswold of Lancaster, Ohio, a manufacturer of entertaining stereographic view cards, included in "Griswold Compositions," his sentimental series of images of children, such titles as "Young America Bathing," "Young America in the Nursery," and "Young America Asleep."
Constance B. Schulz
References and further reading
Danbom, David B. 1974. "The Young America Movement." Journal of the Illinois State Historical Society 67: 294–306.
Kerrigan, William Thomas. 1997. "'Young America!': Romantic Nationalism in Literature and Politics, 1843–1861." Ph.D. diss., University of Michigan.
Reagan, Daniel Ware. 1984. "The Making of an American Author: Melville and the Idea of a National Literature." Ph.D. diss., University of New Hampshire.
Spiller, Robert E. 1971. "Emerson's 'The Young American.'" Clio 1: 37–41.
Widmer, Edward L. 1998. Young America: The Flowering of Democracy in New York City. New York: Oxford University Press.
Contemporary sources:
Advertising, Library of Congress Prints and Photographs Division: "Young America Denims," 1858; "Young America" box label for NY tobacco distributor C. H. Lilienthal, 1860; "Young America Hams and Breakfast Bacon," 1865.
Emerson, Ralph Waldo. 1844. "The Young American." The Dial (April).
Ruger, A. 1869. Bird's Eye View of Young America: Warren County, Illinois. Map: Warren County, IL. Library of Congress Map Division.
Sartain, William. 1864. "Young America Crushing Rebellion and Sedition." Engraving in Library of Congress Prints and Photographs Division.
Young America!: The Organ of the National Reform Association. Formerly the Workingman's Advocate. 1844–1845. Periodical, New York City, George H. Evans, publisher. Library of Congress Newspapers and Periodical Division. During the 1850s and 1860s, several other newspapers and periodicals also adopted this name.
Young America Admires the Ancients, 1783–1840. Set of 80 lantern slides on American architecture. Library of Congress Prints and Photographs Division.
Native American Boys
The more than 550 American Indian nations in North America make generalizing about Native American boyhood a high-risk venture. It can at least be said that, in all cultures, Indian boys in the past spent their childhoods in training for their adult roles as men. Family, age, and gender were crucial to each individual's relationship to the larger society and determined each person's economic and political
roles, responsibilities and obligations, and social status and authority. Most of what we know about Indian boyhood in the past comes from Indian men who in the early twentieth century wrote or otherwise recorded their experiences growing up in the late nineteenth century. European travelers, missionaries, and bureaucrats provided some information on Indian family life in the seventeenth and eighteenth centuries. Indian societies were age-graded, some more rigidly than others; in the more rigidly graded societies, transitions to a new age category were made explicit through a ceremony. Boys spent their infancy, like girls, under the immediate care of their mothers; in addition, fathers liked to play with their children, and aunts and grandmothers often stepped in as primary caregivers. Luther Standing Bear, a Lakota (Sioux) man who was a child in the late nineteenth century, remembered how all the women in the tiyospaye, or extended family, took care of him. In many Indian societies, babies commonly spent their first years in a cradleboard designed not only to ease the mother's tasks of carrying and watching over the baby but also to produce straight, sturdy spines. Around the age of five, boys and girls began to live separate lives in terms of playmates and adult role models. Fathers and uncles taught boys the skills they would need as adults. Early in his life, every boy seems to have learned how to make and use bows and arrows. Other skills varied by region and economy. Navajo boys in the nineteenth century began herding sheep and learned how to care for livestock. On the plains, boys learned to ride and care for horses and to detect the patterns of buffalo and other prey. Among the Iroquois, famous for their oratory, boys learned the principles of a
good speech.
Navajo boy, ca. 1906 (Library of Congress)
Boys practiced adult skills in the games they played with other boys. They hunted for rabbits and birds. Cherokee boys competed at shooting arrows at cornstalks. Lakota boys practiced stalking Crow or Pawnee enemies and looked out for the well-being of their sisters. Most important in boys' education was learning which behaviors earned adult men respect: generosity, reserve, deliberation, and clearheadedness. Physical endurance, agility, and the courage to withstand pain or hardship were also highly valued. Looking back at their childhoods, Indian men remembered rising early and running to build agility and endurance; rather than being taught under threats of physical punishment, through coercion, or by bribery, they learned that politeness was the best way to treat others.
In many Indian communities, public ceremonies marked transitions to new life-course stages. One of the earliest accounts of an Indian initiation ceremony is John Smith’s description of the huskenaw, or busk, as practiced by Algonquian Indians in early-seventeenth-century Virginia. To be eligible to be councilors or shamans later in life, boys between the ages of ten and fifteen went through this arduous ceremony. After a day during which the entire village sang and danced, young men beat the boys with sticks as they ran a gauntlet. The boys then spent months in the woods, while their families and friends regarded them as dead. When they returned to the community, they were men. Among the Lakota, boys passed through a series of ceremonies from the naming ceremony in their infancy to the vision quest undertaken in their early teens as they approached manhood. The vision quest was critical, for it determined a boy’s future. After purifying in a sweat lodge for several days with his male relatives and friends, the boy left for a secluded spot, where alone and without food he waited many days for a vision. If graced, his vision would point out a spirit helper and indicate his future achievements, especially whether he would excel at war, hunting, or medicine. Only a rare few received visions powerful enough to start them on the path toward being a medicine man. The Pueblo Indians in the Southwest had perhaps the largest number of ceremonies to mark children’s progress toward adulthood. Zuni and Hopi boys went through frequent ceremonies; at each stage, they learned more of the community’s religious knowledge and the secrets of the kachina society into which they would eventually be initiated. Like
the huskenaw in the Southeast, at one point boys were publicly whipped in the plaza; however, most of their religious education took place in kivas, underground religious centers that symbolized the Pueblo peoples' origins and their emergence from the earth's womb. By twentieth-century standards, Indian boys in earlier times, whether in seventeenth-century Virginia or on the nineteenth-century plains, became men at a young age. Boys faced formal coming-of-age ceremonies at around puberty, but even without such ceremonies, they began taking on the tasks of men in their early to midteens. Luther Standing Bear recalled accompanying his father on a war party at age ten. It would be several more years before boys accompanying war parties were considered more than camp helpers and ready to engage the enemy in battle. Among the Great Lakes tribes and in the Southeast, boys became literally known as "young men" in their late teens. If they had demonstrated their capability, they could then lead war parties and marry, two signs of the transition to adulthood. For Hopi and Zuni boys, donning a kachina mask carried a similar significance and also occurred at about age twenty. The most direct challenge to native childrearing customs came with the U.S. government boarding school system. The first Indian Industrial School opened in Carlisle, Pennsylvania, in 1879. Several dozen others, located around the country, quickly followed. Christian missionaries had operated many Indian boarding schools throughout the United States in the nineteenth century, but Carlisle and its imitators were part of a larger federal initiative to assimilate Indians into the general population as individuals stripped
of any tribal allegiance or ethnic customs. Carlisle, Haskell (Kansas), Chilocco (Oklahoma), Tomah (Wisconsin), and other Indian Industrial Schools took Indian children away from their families and communities, sometimes by force or coercion and usually for several years. Dressed in military-style uniforms, Indian children led a regimented life of drills, grammar school lessons, and work intended as on-the-job training for future occupations. Girls learned homemaking, and boys learned skills such as farming, carpentry, and metalworking. Despite the rigid, sometimes violent, discipline of boarding school life, children formed a subculture in which they exercised their own social code and regulated the behavior of younger members. Boys sorted themselves into gangs; fistfights marked who stood outside a gang, and intense loyalties prevailed within it. At the turn of the twentieth century, just as efforts to turn Native American children away from their own cultures peaked, American writers and educators began to glamorize Indian boyhood as a model for the middle classes. Back-to-nature enthusiasts such as Ernest Thompson Seton, Theodore Roosevelt, and especially the American Boy Scout movement appropriated Indian motifs and lore to promote the values of self-reliance, hard work, honesty, and simplicity. Concurrent with the historian Frederick Jackson Turner's theory that a unique American character had developed out of the frontier experience, American boys were encouraged to pass through a rugged and heroic stage of personal development by mimicking a romanticized Indian past. Several Indian writers contributed children's books to feed the growing interest in how Indians lived.
Two young Hualapai boys stand on the rim of the Grand Canyon, 1991. (Tom Bean/Corbis)
Eastman’s Indian Boyhood (1902), Luther Standing Bear’s My Indian Boyhood (1931), and Arthur C. Parker’s The Indian How Book (1927) tell about such experiences in growing up as surviving in the woods and hunting birds and rabbits, as well as the importance of their relatives in teaching them the values and social mores they would need as adults. At the turn of the twenty-first century, Indian boyhood does not seem too different from American boyhood in general. Still the most rural minority group in the United States, more than half of the Indian population lives in cities. Most of the Indian children living on Indian reservations attend state public schools, especially at the high school level. Although many communities and families still mark children’s transitions to new re-
with a traditional ceremony, for most Indian families high school graduation has become the sign of arriving at adulthood.
Nancy Shoemaker
See also California Missions; Fathers
References and further reading
Axtell, James, ed. 1981. The Indian Peoples of Eastern America: A Documentary History of the Sexes. New York: Oxford University Press.
Dyk, Walter. 1938. Son of Old Man Hat: A Navaho Autobiography. Lincoln: University of Nebraska Press.
Eastman, Charles A. 1971. Indian Boyhood. 1902. Reprint, New York: Dover.
Hilger, M. Inez. 1992. Chippewa Child Life and Its Cultural Background. 1951. Reprint, St. Paul: Minnesota Historical Society Press.
La Flesche, Francis. 1963. The Middle Five: Indian Schoolboys of the Omaha Tribe. 1900. Reprint, Madison: University of Wisconsin Press.
Lomawaima, K. Tsianina. 1994. They Called It Prairie Light: The Story of Chilocco Indian School. Lincoln: University of Nebraska Press.
Penney, David. 1993. "Indians and Children: A Critique of Educational Objectives." Akwe:kon [Native Americas] 10 (Winter): 12–18.
Roscoe, Will. 1991. The Zuni Man-Woman. Albuquerque: University of New Mexico Press.
Simmons, Leo W. 1942. Sun Chief: The Autobiography of a Hopi Indian. New Haven, CT: Yale University Press.
Standing Bear, Luther. 1978. Land of the Spotted Eagle. 1933. Reprint, Lincoln: University of Nebraska Press.
Szasz, Margaret Connell. 1985. "Native American Children." Pp. 311–332 in American Childhood: A Research Guide and Historical Handbook. Edited by Joseph M. Hawes and N. Ray Hiner. Westport, CT: Greenwood Press.
Newsboys
Cold mornings. Cranky customers. Fearsome canines. Historic headlines. Cherished earnings. These are some of the common recollections of the millions of Americans who have hawked or delivered newspapers from colonial times to the present. Whether they grew up in small towns or big cities, many children's first and most formative job has been peddling papers. It is how generations of youths have learned the meaning of work, the value of a dollar, and the sometimes narrow difference between opportunity and exploitation. Newsboys are real workers, but they are also mythic figures. Juvenile novels, genre paintings, and documentary photographs have made newsboys into enduring symbols of American democracy and the spirit of capitalism. Writers, artists, and reformers have alternately praised
news selling as a public service and decried it as a social evil. What we find if we retrace newsboys' steps across time and listen to their words is that children's labor was integral to the rise of the newspaper industry, which, for better or worse, has been one of the most influential child welfare institutions in the United States. The title of "first American newsboy" usually goes to Benjamin Franklin, who in 1721, at the age of fifteen, helped deliver his brother's paper, the New England Courant, through the streets of Boston. Newspapers were rarely cried on the streets in the colonial period or early republic. Most were picked up at the newspaper offices; sent by post; or delivered to subscribers, coffeehouses, and taverns by printers' apprentices or low-paid carriers. On New Year's Day they distributed carriers' addresses—poetical broadsides that always ended with an appeal for a tip. Franklin notwithstanding, newsboys can better trace their professional ancestry to Bernard Flaherty and Samuel Messenger. These were two of the first boys recruited by Benjamin Day in 1833 to peddle the New York Sun, the first successful penny newspaper. Most New York papers were huge "blanket sheets" that cost 6 cents and specialized in financial news. Day's dream was a cheap, feisty tabloid for workingmen. He ran an ad addressed "To the UNEMPLOYED—A number of steady men can find employment by vending this paper. A liberal discount is allowed to those who buy and sell again." Profits looked to be so low that no men came forward, so Day hired Flaherty, Messenger, and a half-dozen other boys at $2 a week. He assigned them districts but otherwise gave them complete control of their areas; they could either peddle on
the streets or build subscription routes. Many did both.
A newsboy selling newspapers announcing the beginning of World War II (Archive Photos)
The most active boys earned $5 a week, almost as much as a journeyman printer (Lee 1937, 261). Realizing there was money to be made, adults began organizing routes and hiring their own boys. Penny dailies soon spread to other cities, giving rise to two systems of circulation: the London plan, which emphasized street sales and the use of middlemen; and the Philadelphia plan, which stressed home delivery and direct control of operations. Newsboys worked under both systems and were celebrated in song and story as symbols of
“Young America,” archetypal young men on the make, poised to profit from new markets linked by an expanding network of roads, rails, canals, and, of course, newspapers. In the 1850s newsboys became feared members of what philanthropist Charles Loring Brace called the “dangerous classes” (Brace 1872). The decade saw two economic depressions that left an estimated 10,000 children to fend for themselves on the streets of New York. They “slept rough” under stairs, in alleyways, and on steam grates, particularly around newspaper offices where they could get papers on credit. The word newsboy became synonymous with street waif. Cold and hunger were well known to them. In his 1860 memoir, New York newsboy Johnny Morrow called hunger “the tyrant of animal life” and the most compelling force behind his trade (Morrow 1860, 131). In 1853, Brace founded the Children’s Aid Society and began shipping street children out west on “orphan trains” to live and work on farms. In 1854 he opened the Newsboys’ Lodging House in the Sun building. For 6 cents it provided beds, baths, meals, and entertainment. Over the years dozens of similar institutions sprang up across the country, including several operated by the Catholic Church. The Civil War raised the number and stature of newsboys. In Detroit they became “a noticeable feature of the town” with the first battle of Bull Run, and in New York they soon numbered “many thousands” and spanned “all the seven ages of man” (“Then and Now” 1896, 70; “New York Newsboys” 1869, 717). Many big-city dailies regularly sold 10,000 copies per day and began to issue multiple editions rather than extras. Some established separate “evening” papers,
most of which were sold on the street rather than by subscription.
Lewis Hine photograph of newsboys on the steps of the White House (Library of Congress)
Englishman Edward Dicey observed in 1863 that this "chance circulation" influenced the style of American journalism because it encouraged "the sensation system of newspaper headings and paragraphs, which offends our taste so constantly" (Dicey 1863, 30–31). Although offensive to some, newsboys were valorized in the press as war orphans who had a special claim on the public weal. Many became soldiers themselves, drilling in squads at newsboy homes and joining up when they were old enough. Newsboys took in from 50 cents to $3 a day in the 1860s, but they made much more when the news was hot. Two fifteen-year-old newsboys reportedly "sold
2,000 papers between them when the telegraph announced the capture of Jefferson Davis; and on the evening that Mr. Lincoln was assassinated, they sold the enormous number of 3,400” (“New York Newsboys” 1869, 717). Among those who profited from war news was Thomas Edison, a train boy on the Grand Trunk Railroad. When word came of the carnage at Shiloh, Edison had wires sent to all the stations along the line. He ordered 1,000 copies of the Detroit Free Press—ten times his usual number—and retailed them at inflated prices to the crowds he knew would be waiting at every stop. In the frenzy of industrialization that followed the Civil War, the U.S. urban population more than tripled, and newsboys emerged as unwitting advocates of
laissez-faire capitalism. Horatio Alger portrayed them in his many novels as ragged individualists whose essential good character made success inevitable. Likewise, John George Brown, the most prolific and popular genre painter of his generation, pictured newsboys and bootblacks as rosy-cheeked cherubs who thrived on the street. Alger’s and Brown’s works implicitly assured the middle class that poor city children could rise if they truly wanted to succeed. Despite these idealized images, child peddlers reemerged as a pressing problem during the depression years of the 1870s and early 1880s. In 1874, concerned citizens in New York founded the Society for the Prevention of Cruelty to Children (SPCC) and renewed efforts to sweep them off the streets. Peddling papers was not a crime per se, but to officers of “the Cruelty” it was prima facie evidence of parental neglect. Boston started licensing newsboys and bootblacks in 1868. It issued them leather badges and limited the number to 400. The city also established a special newsboys school that held two two-hour sessions a day. Detroit instituted a badge system in 1877 that was partly a response to the children’s labor militancy. Newsboys struck the Detroit Evening News over its pricing policy, and their “generally unruly character” led to the passage of an ordinance requiring each newsboy to obtain a yearly license and badge for 10 cents. An amendment stipulated that the badges were to be issued “only on satisfactory assurance of good conduct” (“Newsboys’ Riot” 1877, 4; Farmer 1889, 963). Nationally, the number of newsboys climbed as newspaper circulation shot up from 2.4 million in 1879 to 24 million in 1909 (West 1996, 37). Estimates of the newsboy population ranged
from 800 in Philadelphia to 1,600 in Detroit and 5,000 in New York City. Turnover was constant; boys worked anywhere from a few weeks to a few years (Beach 1888, 202; Wager-Fisher 1880, 693; Ward 1875, 949). Contrary to popular belief, relatively few newsboys were orphans. Most lived with one or both parents and worked the streets as part of a family business, often accompanied by siblings and monitored by relatives. Newsboys typically started to peddle between the ages of five and ten. Their earnings accounted for up to 20 percent of a household’s income, which gave them greater autonomy and status within the family. Few newsboys continued in the trade after fifteen, the age at which working-class males typically entered the adult labor force. Some boys, particularly blacks, stayed on longer simply because there were no better jobs for them. The ethnicity of newsboys usually reflected the ethnic composition of a city’s working class, with the newest and poorest arrivals tending to dominate. Thus most newsboys up to the 1880s were from Irish stock. They were followed primarily by the children of southern and eastern European immigrants. Not all newsboys were boys. Girls also sold papers, but they rarely exceeded 2 percent of the workforce. Parents were less likely to let their daughters approach strangers for commercial purposes. As New York police captain Edward Tynan said in 1881, “Girls who begin with selling newspapers usually end with selling themselves” (“Miseries of News-Girls” 1881, 12). Such prejudices were inscribed in law in the early 1900s, when many municipalities required girls to be sixteen, eighteen, or twenty-one years old to obtain street trading licenses but allowed boys to trade as young as ten.
Most observers considered street hawking unskilled labor, but it required physical ability and mental acuity. Chief among the prerequisites were a big voice and an ability to assess the news. In crying their wares newsboys staked out their turf with their voices. Volume was not enough, though. They had to predict how many papers they could sell on a given day and which stories to shout. Newsboys sometimes developed little tricks, such as embellishing headlines, selling day-old papers, or short-changing customers. At times they were arrested for crying false news, violating the Sabbath, and peddling proscribed papers. They also needed street smarts to avoid being run down in traffic, molested by customers, cheated by suppliers, robbed by their peers, and caught in the crossfire of bloody circulation wars. As in all retailing, newsboys' profits depended greatly upon location. The more heavily trafficked areas commanded the highest sales. Such sites were at a premium and had to be defended against all interlopers. "It was a case of survival of the fittest," recalled Joe "Awful Coffee" Rutkofsky, a professional boxer who started selling papers in 1917 at the age of twelve in Pueblo, Colorado. "In those days, everybody was tough. You had to fight for your corner. You had to fight for everything" (Leppek 1995, 46). Newsboys competed fiercely with each other, but they also collaborated. They knew each other by a roster of colorful nicknames ("Carrots," "Squinty," "Dutchy"), and when one of their number died they took up collections for flowers, passed resolutions of condolence, and marched through the street in funeral trains. They developed elaborate proprietary rights to specific routes, corners, buildings, and streetcar lines. In ef-
fect, they established shadow real estate markets in which they bought, sold, raffled, bartered, and bequeathed public space for private commercial purposes. Newsboys also formed unions and mounted strikes. Documented newsboy strikes occurred in Detroit, St. Louis, and Chicago in the 1870s; Milwaukee, Lynn, Massachusetts, and Nyack, New York, in the 1880s; and Cleveland, Toledo, New Orleans, and Lexington, Kentucky, in the 1890s. In 1899, New York newsies struck the nation's two largest newspapers to protest a price hike imposed during the Spanish-American War. For two weeks they sabotaged the distribution of William Randolph Hearst's Evening Journal and Joseph Pulitzer's Evening World. The walkout sparked a children's general strike in which newsboys, bootblacks, and messenger boys in scores of cities stopped work to demand better pay and working conditions. The strikers failed to reinstate the old price but won the right to return unsold copies for a full refund (Nasaw 1985). Other newsboy strikes followed in the twentieth century, prompting the industry to emphasize newsstand sales and experiment with "mechanical newsboys," or coin racks. News selling required bursts of activity bracketed by periods of idleness. During their idle moments newsboys wrestled, played stickball, shot craps, and pitched pennies. With their ready cash, newsboys were among the most avid consumers of popular entertainment. They patronized cheap theaters, pool halls, penny arcades, movie houses, and brothels. To counter such vices, philanthropists and publishers opened newsboy reading rooms and night schools and hosted newsboy banquets and excursions. Circulation managers went on to establish newsboy clubs, bands, buildings, and sports teams
to win the loyalty of the boys and keep them “gingered up.” Such programs also helped deflect charges of exploitation and stave off child labor legislation. In 1890, journalist Jacob Riis shamed the nation with How the Other Half Lives, a vivid portrait of New York tenement life. His impassioned reportage and now iconic photographs of “street Arabs” huddled in alleys exposed the dark underside of industrial capitalism. Riis praised newsboys for their “sturdy independence, love of freedom and absolute self-reliance,” yet touched off a protracted campaign to rescue them from the slums (Riis 1890, 147). Reformers, many of them women working together in settlement houses, temperance unions, and trade unions, made elimination of child labor a priority. In Chicago, Hull House resident Florence Kelley likened street trading to “white child slavery” and lobbied for compulsory education laws to eradicate it. “There is no body of self-supporting children more in need of effective care than these newsboys and bootblacks,” she wrote in 1895. “They are ill-fed, ill-housed, ill-clothed, illiterate, and wholly untrained and unfitted for any occupation” (Kelley and Stevens 1895, 54–55). Kelley, like most Progressive-era reformers, downplayed the economic and emotional importance of children’s earnings. In 1904, activists formed the National Child Labor Committee to coordinate their efforts. They produced reams of sociological and statistical studies that portrayed juvenile street trading as part of the problem of, not the solution to, youth homelessness, poverty, and delinquency. “The professional newsboy is the embryo criminal,” declared economist Scott Nearing in 1907 (Nearing 1907, 784). To underscore the point, the committee
polled prison wardens and reform school superintendents who reported that between 50 and 75 percent of their inmates were newsboys (Lovejoy 1910). The prototypical newsboy of this period was epitomized in the documentary photographs of committee investigator Lewis Hine, who portrayed newsboys as both casualties and survivors of capitalism. Between 1890 and 1918, every state in the union passed compulsory education laws (Postol 1997, 340). During the next decade, thirty-nine cities and twenty states regulated juvenile street trading (Shulman 1932, 13). Newspaper publishers at first resisted government interference and denounced reformers as meddling do-gooders and socialists infringing on the freedom of the press, but ultimately they came to embrace licensing schemes as a way to oversee their young workers. Newsstand operators and some newsboy unions also supported these measures because they limited competition. Yet enforcement was weak, and thousands of underage youths left school for work. Ironically, most child labor laws tended to push boys out of shops and factories where enforcement was relatively effective and into street trades where they could work more freely. To encourage self-regulation and instill principles of citizenship, several cities instituted newsboy courts in which teen judges heard cases and imposed fines for misconduct. In Boston, Milwaukee, and Birmingham, Alabama, these courts grew into full-fledged newsboy "republics" with constitutions and elected representatives from various "states" or neighborhoods. The International Circulation Managers Association, formed in 1898, sponsored similar self-government schemes for newsboys. In 1915, it established a Newsboy Welfare Commit-
tee to help recruit and retain boy labor and fight further government regulation.
World War I created havoc and opportunity in the newspaper industry. Newspapers were hit hard by the accompanying business decline of 1914. Advertising revenues dropped as operating expenses rose. Papers took drastic measures. Some arbitrarily limited circulation and banned returns. Most penny papers doubled in price, and that of many Sunday papers jumped from 5 cents to 10 cents. Sales dipped, but by war's end circulation and revenue climbed to new heights. The number and profile of news sellers changed during the war. In Buffalo, their ranks shrank by 25 percent between 1917 and 1919 (Juvenile Protective Department 1935, 13–14). One explanation is that as older boys entered the military, younger ones took their better-paying jobs in industry. Labor was in such short supply that circulation managers welcomed girls into the news trade. As with previous conflicts, the war presented new opportunities to honor newsboys. Fictional newsboys-turned-soldiers were the protagonists of several wartime films, and a hit song in 1919 was "I'd Rather Be a Newsboy in the U.S.A. Than a Ruler in a Foreign Land." Despite the spread of a middle-class ethos that prized children for their sentimental rather than economic value and an overall decline in child labor, the number of newsboys rose in the 1920s (Mangold 1936, 303). In 1924, the U.S. Children's Bureau backed a constitutional amendment to regulate all forms of child labor. Congress approved the measure, but only four states ratified it. Reformers' efforts to characterize news selling as a corrupting occupation were undermined by positive portrayals of newsboys in advertisements and political campaigns. One of the most popular politicians of the day was former newsboy Al Smith. He outpolled publisher Hearst to become governor of New York in 1922 and won the Democratic nomination for president in 1928.
The stock market crash of 1929 negated whatever gains child labor reformers had made in persuading the public that news peddling was detrimental to the welfare of American youth. If anything, people now felt that the work was too valuable to relegate to children when nearly one-third of all wage earners—15 million adults—were unemployed. The hard times of the 1930s sent more men into the news trade and compelled youths to stay in the business longer than they normally would have done. Hawkers and carriers totaled 500,000 in the 1930s, with carriers comprising 70 percent of the workforce. The average age of newsboys climbed from eleven to fourteen, while their annual earnings declined along with everyone else's (Linder 1990, 836). Publishers instituted sales training programs for their carriers; held subscription contests; and offered bikes, trips, and scholarships as prizes. Still, revenues declined; 400 newspapers failed during the 1930s, leaving 80 percent of communities one-newspaper towns. Meanwhile, the newsboy emerged as a proletarian hero on stage and screen. He was the star of the radical repertory piece "Newsboy," which became a standard with workers' theater groups nationwide. Adapted from a poem by a Communist Party cultural official, it blended dance, chants, and dialogue to expose the real class struggle behind the day's sensational headlines. At the same time Hollywood churned out reels of newsboy and gangster films; actors James Cagney, Humphrey Bogart, and John Garfield specialized in roles as street toughs who learned early in life to take capitalism to its limit, suspending all rules of morality.
In 1933, President Franklin D. Roosevelt pushed through the National Industrial Recovery Act, which created a system of codes regulating competition in every branch of the economy. The Newspaper Code set minimum age requirements for hawkers and carriers and prohibited night work. Publishers lobbied against it, insisting that the nation's newsboys were "little merchants" and not employees whose hours and conditions they could regulate. Besides, they said, news selling was more play than work and to deny carriers their routes "would constitute a national menace and drive them into the Devil's Workshop" (Trattner 1970, 194). The debate became moot when the Supreme Court invalidated the entire act in 1935. In 1938, the Fair Labor Standards Act raised the working age to sixteen and again tried to ban the interstate commerce of goods made by children, but newsboys, ruled independent contractors, were exempt.
In the early 1940s, World War II dominated the headlines, and labor shortages again led to a relaxation of child labor and school attendance laws. Selling and recycling newspapers were part of the war effort. Newsprint was rationed; hawkers sold out their allotments in half the normal time, and carriers could not accept new customers. Circulation managers now rejected the terms newsboy and newsie as suggestive of ragged urchins. They declared October 4 National Newspaperboy Day and estimated that there were 350,000 newsboys in the United States, 90 percent of whom were carriers (McDaniel 1941, 43). The U.S. Treasury authorized them to sell war savings stamps, and when victory came they had sold $1.7 billion in 10-cent stamps (Postol 1997, 340). The newsboy now morphed into a comic book superhero: Billy Batson had only to shout "SHAZAM!" to become Captain Marvel, "the world's mightiest man—powerful champion of justice—relentless enemy of evil." His nemesis was none other than Captain Nazi. In 1942, DC Comics introduced "The Newsboy Legion," a series featuring four crime-fighting slum kids.
Carriers with paper routes all but displaced newsboys on street corners in the 1950s and 1960s. These two decades brought full employment and increased incomes. More families could afford automobiles and houses in the suburbs. Children worked less than ever; only 2 percent of youths aged ten to fifteen were gainfully employed, whereas between 78 and 88 percent stayed in school up to age nineteen (West 1996, 207, 217). More people got their news via radio and television, which contributed to a long-term per capita decline in newspaper circulation. Newsboys nevertheless received a new kind of tribute. In 1952, after lobbying by circulation managers, the U.S. Postal Service issued a 3-cent stamp showing a newspaper carrier as a symbol of free enterprise. His shoulder bag bore the legend "Busy Boys . . . Better Boys." Newspapers started to recruit more girls as carriers in the 1960s, although some states still barred them from the trade. In 1974, thirteen-year-old Lynn Warshafsky cited Title VII of the 1964 Civil Rights Act to challenge such a statute in Wisconsin, but the state supreme court held that Title VII did not apply to minors and that the state had a right to protect girl carriers, who would be more prone to sexual assaults than boy carriers. The threat of violence was
real, but boys were no less vulnerable than girls. At least seven young carriers were kidnapped, raped, or murdered on their routes in the early 1980s (Stein 1987, 30–31). Eleven newsboys died "in the line of duty" in the mid-1990s (Linder 1997, 76–77). They were abducted, shot as burglars, or struck by vehicles. Aware of such dangers, insurance companies charge almost twice as much to cover newspaper deliverers as they charge to cover workers in retail and other industries. Yet most states do not require newspapers to provide independent carriers of any age with workers' compensation coverage, unemployment insurance, Social Security benefits, or the minimum wage. This exemption represented a $172 million savings in payroll taxes for the industry ("Are Newspapers Taking Advantage?" 1988, 8–10). Safety concerns were just one factor in the decline of youth carriers. Falling birthrates shrank the pool of potential paperboys. The expanding fast food industry gave them other job options. Beginning in 1980, the number of carriers under eighteen declined at a rate of 10,000 a year. They were replaced by senior citizens who needed to supplement fixed incomes and new immigrants who saw the work as an alternative to welfare. Publishers realized that a corps of grownup, nonunion, independent carriers with their own cars and insurance was cheaper and more efficient than an army of adolescents on bikes. In some cities this realization led to the wholesale dismissal of youth carriers. In 1999, the Newspaper Association of America declared newsboys and newsgirls "an endangered species" ("Newsboys and Newsgirls" 2000, 5).
Vincent DiGirolamo
See also Jobs in the Nineteenth Century; Jobs in the Twentieth Century; Melodrama
References and further reading
"Are Newspapers Taking Advantage of Child Labor?" 1988. Stark Metropolitan Magazine (April): 8–10.
Beach, E. P. 1888. "A Day in the Life of a Newsboy." Harper's Young People 9 (January 17): 202.
Brace, Charles Loring. 1872. The Dangerous Classes of New York and Twenty Years' Work among Them. New York: Wynkoop and Hallenbeck.
Dicey, Edward. 1863. Six Months in the Federal States. London: Macmillan. Reprint, Herbert Mitgang, ed., 1971. Spectator of America. Chicago: Quadrangle Books.
Farmer, Silas. 1889. The History of Detroit and Michigan. Detroit: Silas Farmer.
Juvenile Protective Department. 1935. "Street Traders of Buffalo, New York," pp. 13–14. Buffalo: Juvenile Protective Department.
Kelley, Florence, and Alzina P. Stevens. 1895. Hull-House Maps and Papers. New York: Crowell.
Lee, Alfred McClung. 1937. The Daily Newspaper in America: Evolution of a Social Instrument. New York: Macmillan.
Leppek, Chris. 1995. "The Life and Times of Denver's Joe 'Awful' Coffee." Western States Jewish History 27, no. 1 (October): 43–61.
Linder, Marc. 1990. "From Street Urchins to Little Merchants: The Juridical Transvaluation of Child Newspaper Carriers." Temple Law Review (Winter): 829–864.
———. 1997. "What's Black and White and Red All Over? The Blood Tax on Newspapers." Loyola Poverty Law Review 3: 57–111.
Lovejoy, Owen. 1910. "Newsboy Life: What Superintendents of Reformatories and Others Think about Its Effects." National Child Labor Committee, pamphlet no. 32 (June).
Mangold, George B. 1936. Problems of Child Welfare. 3d ed. New York: Macmillan.
McDaniel, Henry Bonner. 1941. The American Newspaperboy: A Comparative Study of His Work and School Activities. Los Angeles: Wetzel.
“Miseries of News-Girls.” 1881. New York Tribune, February 20, 12. Morrow, Johnny. 1860. A Voice from the Newsboys. New York: A. S. Barnes and Burr. Nasaw, David. 1985. Children of the City: At Work and at Play. New York: Oxford University Press. Nearing, Scott. 1907. “The Newsboys at Night in Philadelphia.” The Survey 17 (February 2): 778–784. “New York Newsboys, The.” 1869. The Leisure Hours (November 1): 717. “Newsboys and Newsgirls Constitute an Endangered Species.” 2000. Editor and Publisher (January 31): 5. “Newsboys’ Riot, A.” 1877. Detroit Evening News, July 21, 4. Postol, Todd Alexander. 1997. “Creating the American Newspaper Boy: MiddleClass Route Service and Juvenile Salesmanship in the Great Depression.” Journal of Social History (Winter): 327–345. Riis, Jacob. 1890. How the Other Half Lives, p. 147. Reprint, New York: Penguin, 1997. Shulman, Harry M. 1932. “Newsboys of New York: A Study of the Legal and
Illegal Work Activities During 1931,” p. 13. New York: Child Labor Committee. Stein, Mark A. 1987. “Carriers—The Young Are Fading.” Los Angeles Times, April 10, 1, 30–31. “Then and Now: Newspaper Distributing in Detroit in the ’50s.” 1896. Friend Palmer Scrapbook (Detroit Public Library) 13 (May 26): 70. Trattner, Walter. 1970. Crusade for the Children: A History of the National Child Labor Committee and Child Labor Reform in America. Chicago: Quadrangle Books. Wager-Fisher, Mary. 1880. “The Philadelphia Newsboys.” Wide Awake 11, no. 1 (July): 16, 18. Ward, Paul. 1875. “Street Arabs: Bootblacks and Newsboys.” Oliver Optic’s Magazine 18 (December): 949. West, Elliott. 1996. Growing Up in Twentieth Century America: A History and Reference Guide. Westport, CT: Greenwood Press.
Nintendo
See Video Games
O
Orphanages
Orphanages were live-in asylums, common from the eighteenth century to the 1930s, that aided impoverished boys and girls (aged four to twelve) missing one or both parents. Most orphanages cared for children of both genders in separate sections of their buildings, but approximately 9 percent (in 1890) admitted boys only. Middle-class women, and to a lesser extent middle-class men, founded most orphanages to care for impoverished working-class children of immigrants, many of whom had lost a parent due to death. Orphanages were primarily local, privately run institutions with a religious orientation. Most operated on a shoestring. Many were founded by ethnic groups, especially recent immigrants, but few admitted African American boys and girls. Most boys remained in orphanages for a year or two and then were discharged to work for farmers or return to their families. By the late nineteenth century, reformers criticized orphanages for separating children from families and raising them in an artificial environment. Nonetheless, orphan asylums continued in existence until the 1930s, when federal legislation in the Social Security Act created Aid to Dependent Children, making it possible for mothers to afford to keep their children at home rather than place them in orphanages. In the 1980s and 1990s, there was some public discussion about resurrecting orphanages, but little came of it.
Changes in the economy, demography, and gender roles help to account for the growth of orphanages for impoverished boys and girls in the nineteenth century (only a few orphan asylums were founded in the eighteenth century). The economy changed from agrarian to industrial in the course of the century. As manufacturing plants expanded, largely in cities, demographic change occurred as laborers from the surrounding farms and immigrants from other countries flocked to urban areas seeking work. These immigrant men were very likely to be unskilled. They earned little, worked long hours in dangerous surroundings, and lived with their families in small apartments. Disease was rampant in crowded cities, and industrial accidents were all too common. It was not unusual for a boy to lose a father to death or desertion. His widowed or deserted mother had few jobs open to her, and those she could get, such as sewing or domestic service, paid little. There was very little public welfare available either, so desperate, poor mothers often turned to orphanages to care for their children until the mothers were better off or the children were old enough to work. Although industrialization worsened the lives of unskilled workers and their families, it had the opposite effect on the
Jacob Riis’s photograph “Prayer Time, Five Points House of Industry,” ca. 1889, shows small boys in a New York City orphanage. (Bettmann/Corbis)
middle class. Educated men found new jobs in industry as managers, accountants, and lawyers. They earned good wages and could afford to support their wives and children in large, comfortable homes staffed by many servants. For middle-class women, a change in gender roles occurred. No longer did they have to spend long hours at housekeeping and child tending. They could turn these tasks over to servants and spend their spare time in volunteer work outside the home. For them a natural area of interest was children, particularly those boys and girls who lacked fathers to support them and whose mothers were too poor to care for them adequately. It was primarily these middle-class women who founded and maintained orphanages.
Most orphanages were established in cities with large numbers of poor boys and girls. The founders of most orphanages were strongly religious Protestants who taught their orphan charges accordingly. Catholics and Jews feared the conversion of their children to Protestantism, and so they too founded orphanages to spread their respective faiths to their young inmates. Ethnic groups such as Poles and Italians often formed orphanages to care for poor children of the same background. Most asylum founders were whites, and very few admitted African American boys and girls. After the Civil War, African Americans in the South founded some orphanages, but most freed slaves did not have the money to build and support asylums for children.
The typical privately run orphanage held fewer than 100 children, although in the late nineteenth and early twentieth centuries some larger public orphanages were founded in the Midwest. Orphanages customarily housed boys and girls in separate wings of a large building, but by the late nineteenth century some orphanages housed groups of 30 to 50 boys or girls in large cottages. Founders believed that girls needed to be protected from the wild, loud antics of boys and from any sexual contact with them. The regimen of orphanages was fairly strict. Founders believed that impoverished boys who had grown up without a parent in dangerous cities needed to be retrained in orphanages to more disciplined habits so they could mature into responsible adults. Boys presumably learned the value of regularity when bells rang to awaken them and send them to the washroom, dining room, playroom, and school. After the Civil War, many asylums introduced military drill to teach boys the value of disciplined action. Boys also learned that they had responsibilities when they were assigned to do yard work or heavy chores. Because orphan asylum founders feared that the impoverished parents of boys might interfere with their reform, asylum managers often limited parental visits to once a month in the afternoon. Some orphanage officials also censored the mail of boys and their families. Inside asylums, orphanage officials also expected boys to learn to live a Spartan life. Presumably they would leave orphanages to return to a working-class existence, so there was no reason for them to get used to lives of luxury. Boys wore simple uniforms, bathed in cold water, and played in large rooms with very few toys.
Any boys who violated asylum rules were slapped, kicked, or hit on the palm with a switch. Orphanage food was adequate but not plentiful. Older boys often left the table hungry. Sometimes boys found their way into asylum kitchens at night and stole food. Other times they sneaked out of asylums and purchased or stole candy and fruit from neighborhood stores. Despite their best efforts, orphan asylum officials could not completely control the lives of their young charges. Older boys in orphanages bullied younger ones or offered younger ones “protection” from teasing and violence in bathrooms and playrooms in return for favors such as giving over desserts or doing servile tasks like shining shoes. In some cases, younger boys were forced by older ones to provide sexual favors. Such activity was probably rare, however, given the young age of most orphans: few were older than twelve, and their average age was ten in the 1920s. Because they were so young, orphan boys were likely to contract childhood diseases like scarlet fever and measles. Since antibiotics to treat scarlet fever and inoculations against diseases like the measles were not developed until the 1930s and after, contagious diseases were a very serious matter in nineteenth-century and early-twentieth-century orphan asylums. Often diseases spread quickly, and it was not uncommon for many children to die from them. By the late nineteenth and early twentieth centuries, orphanages tried to limit child deaths by having doctors examine new inmates carefully to prevent ailing youngsters from entering and spreading disease. Also, when despite doctors’ best efforts boys did become sick, orphanages isolated them in special hospital wards until they recovered or died.
Even with all the disadvantages of asylum life, boys benefited from the schooling they received in orphanages. Before the end of the nineteenth century, schools were overcrowded and understaffed in most cities. Many poor boys did not attend them at all. However, all nineteenth-century orphanages maintained schools where female teachers taught their young charges to read and write and do elementary arithmetic. Many orphanages also tried to prepare boys for work by teaching them how to use tools and make simple objects out of wood. By the twentieth century, orphanages frequently closed their schools and sent their young charges to neighborhood public or, in the case of Catholic orphanages, parochial schools. Here orphans came in contact with a wider range of acquaintances and probably obtained a more well-rounded education than they did in asylums. Orphan asylums grew in number throughout the nineteenth century, but they also had critics. One of the most influential was Charles Loring Brace, who founded the New York Children’s Aid Society in 1853. He argued that orphanages were bad for boys and girls because they removed youngsters from families and raised them in artificial, overly regimented environments where they failed to learn independence or develop their own individuality. In 1909 President Theodore Roosevelt convened the first White House Conference on Dependent Children, which concluded that the best place for a boy to live was within a family, either his own or, if necessary, a foster family. Orphanages should be a last resort for the care of poor children. Nonetheless, despite criticism, orphanages continued to provide large numbers of boys and girls with temporary homes until the 1930s.
Orphan asylums were never intended to be permanent residences for boys and girls. Most remained for one to four years, although a few stayed five years or more. A boy's length of stay often depended on the condition of his natural family and his age when he entered the orphanage. If a child's family fell on hard times temporarily, he might be able to return to his mother or father in a short time. Conversely, if families broke up due to death or illness, boys might never return to them. Age mattered, for very young boys would probably stay longer than older ones if their families were too poor to reclaim them. In the nineteenth and early twentieth centuries, boys of ten or twelve years and above were old enough to work. Sometimes poor mothers reclaimed their boys at this age so they could come home to work and help support their families. In other cases, orphan asylums signed indenture contracts with farmers and tradesmen who took older boys into their family homes to live and work while providing the boys with room, board, and sometimes schooling. By the twentieth century, indenture contracts were rare, but orphan asylums still placed out boys in foster family homes. Often the boys worked for their keep, but sometimes asylums paid families to take the children in. The end of orphanages came after 1935 with the passage of the Social Security Act. One part of the act created Aid to Dependent Children (later renamed Aid to Families with Dependent Children, or AFDC), which provided federal government payments to needy mothers to allow them to keep and care for their children. Since the 1910s, some needy mothers had received pensions from state governments to enable them to afford to raise
their own children. Yet because mothers' pensions provided minimal benefits to a few women, thousands of other impoverished mothers often had no choice but to place their children in orphanages to protect them from homelessness and even starvation. In the Great Depression, which began in 1929, most states did not have the tax dollars to pay mothers' pensions, and the population of children in orphanages skyrocketed. Removing boys and girls from their families purely because of poverty had been frowned upon since the 1909 White House Conference on Dependent Children. To prevent such family breakups, the federal government created AFDC. Now that they had the opportunity to choose, needy mothers elected to keep their children at home with the help of AFDC payments. Foster care programs also grew in number and accommodated more children. Orphanages either went out of existence or were transformed into homes for emotionally troubled or abused youngsters. However, in the 1980s and 1990s, there was some talk of reviving orphanages to help care for the large number of children who were victims of abuse in their biological or foster family homes or whose parents were drug users. Some conservatives also favored orphanages as a means of discouraging illegitimacy by denying unwed mothers welfare and forcing them to give up their children to orphanages instead. The most thoughtful and articulate proponents of reviving orphanages regard them as one alternative for children in need and not a final solution to all problems of child welfare.
Priscilla Ferguson Clement
See also Foster Care; Placing Out
References and further reading
Ashby, LeRoy. 1997. Endangered Children: Dependency, Neglect, and Abuse in American History. New York: Twayne Publishers.
Clement, Priscilla Ferguson. 1997. Growing Pains: Children in the Industrial Age, 1850–1890. New York: Twayne Publishers.
Hacsi, Timothy A. 1997. Second Home: Orphan Asylums and Poor Families in America. Cambridge, MA: Harvard University Press.
McKenzie, Richard B., ed. 1998. Rethinking Orphanages for the 21st Century. Thousand Oaks, CA: Sage.
Orthodontics
Orthodontics is a branch of dentistry concerned with the development and growth of facial form. Orthodontic treatment focuses on the correction of irregularities of tooth alignment and malocclusion. It is a skilled and complex specialty, with its own techniques and procedures, and successful orthodontic treatment can bring about improvements in facial appearance and function. For many American boys, orthodontics is synonymous with braces, a broad lay term covering a wide variety of corrective appliances that have become an everyday experience of boyhood. The earliest examples of orthodontic treatments have been found in Greek and Etruscan remains from pre-Christian times. In American history, there is no indication of anything more than sporadic attempts at the regulation of tooth development until the second half of the nineteenth century. Norman Kingsley was one of the first Americans to develop orthodontic techniques, which he outlined in his 1880 "Treatise on Oral Deformities as a Branch of Mechanical Surgery." Kingsley and others developed
methods for realigning teeth and correcting protrusions. By the 1890s, the failure of the upper and lower teeth to meet properly when the jaws are closed, or malocclusion, was also viewed as a problem that could be treated using orthodontic methods. Edward H. Angle developed the Angle classification of malocclusion around 1900, and it remains in use 100 years later. By the early 1900s orthodontics had developed into a recognizably modern form, establishing its twin concerns of poor tooth alignment and malocclusion. The best ways to treat these problems have been the subject of ongoing debate, and the nature and extent of treatments have also changed; generations of American children have felt the impact of these debates and changes. Orthodontic treatment affects boys and girls in much the same way, but some differences are apparent.

A teenage boy with braces gives a big smile. (Bob Rowan; Progressive Image/Corbis)
Angle and his colleagues at first developed techniques using extra-oral force and opposed the use of dental extractions as part of their treatment strategy. Good occlusion was their primary aim, often at the expense of proper facial proportions. The earliest orthodontic appliances were cumbersome and clumsy, and their effectiveness was limited. In the early 1900s George Crozat developed appliances using gold wires and springs that were more effective and somewhat easier to wear, but their high cost meant that they were not available to the vast majority of boys. In 1929 the American Board of Orthodontics was established as the first specialty board in dentistry. During the 1930s extraction came to be seen as a legitimate orthodontic strategy, helping to improve a boy’s appearance and also stabilizing the improved occlusion produced by the use of appliances. For the next few decades, American orthodontists favored a combination of dental extractions and fixed appliances. These appliances, as the name suggests, were fixed into the boy’s mouth using special wiring techniques and could not be removed by the wearer. In contrast, European orthodontists favored removable appliances that could be taken out and reinserted by the wearer for cleaning or readjustment. From the 1960s, treatment developed further with the introduction of “functional” appliances that acted on the position of the mandible as a way of altering tooth position. Orthodontic appliances remained cumbersome, and boys were less likely to cooperate with treatment than were girls (Clemmer and Hayes 1979). After 1980, American and European orthodontists moved closer together with regard to treatment. Removable appliances are now common for early treatments, with
fixed appliances being used for later or more complex treatment. Studies using Angle’s classification have found that only a minority of American children have normal occlusion according to Angle’s strict definition. Estimates of the extent of malocclusion have ranged from 35 percent to 95 percent, depending upon the degree of deviation from normal that is considered to be acceptable. Research in the 1960s found that in six- to eleven-year-olds, only 22.9 percent of white children and 33.1 percent of black children had acceptable occlusion, with 13.7 percent and 16.9 percent, respectively, being assessed as having severe malocclusion. Demand for orthodontic treatments rose markedly in the late twentieth century, but this increase does not necessarily indicate a worsening of American boys’ dental occlusion or tooth alignment. More effective techniques, more comfortable appliances, and increased availability of orthodontic treatment may all have played their part. Indeed, as general health improved, perhaps parents could turn their attention away from other, life-threatening health problems toward a greater concern for dental comfort and facial appearance for their boys. However, although the need for orthodontic treatment is higher in boys than in girls, the demand for it is lower (Wheeler et al. 1994). About 4 million people in the United States are being treated with braces at any one time, and almost one-quarter are adults (American Association of Orthodontists 2001). In the case of adult patients, 70 percent are female. In children, 60 percent of patients are girls, and 40 percent are boys. These percentages suggest that around 1.25 million boys in the United States receive orthodontic treatment at any one time. This gender difference in orthodontic treatment may well be because American society puts greater stress on facial appearance for women than for men, although many boys also undergo orthodontic treatment for aesthetic reasons. Modern orthodontics offers a range of treatments. Fixed and removable appliances can be active, aimed at achieving tooth movement, or passive, designed to maintain tooth position. Retainers are an example of the latter and are used to maintain new tooth positions in the period immediately following active treatment. Since the 1980s new bonding techniques have enabled fixed appliances to be fitted without the need for metal bands. In the late 1990s, braces and retainers became available with a choice of colored wires or with logos on them. Boys were able to customize their braces to give them a distinct, stylish, and perhaps more masculine appearance. Orthodontics can still be a costly therapy (the average cost of comprehensive orthodontic treatment in the late 1990s was more than $2,000), but it is nonetheless increasingly common. Demand for orthodontic treatment tends to be higher in boys from urban settings than from rural ones and in boys from higher-income families. The American Association of Orthodontists recommends that all children undergo an orthodontic assessment by seven years of age. Boys typically start treatment at eleven or twelve years old, slightly later than girls. Despite the greater need for treatment in boys, much of the promotion of orthodontics is aimed at girls. When “celebrities with braces” are identified as “role models,” the vast majority are female and are promoted to girls, although some, such as athlete Carl Lewis and football players Terrell Davis and Brett Favre, are role models for boys (“Yo, It’s Time for Braces” 2001).
Treatments that once stigmatized the boys who underwent them are now seen as a normal experience. “Braces” in their many forms are part of growing up. Indeed, in some parts of the United States, orthodontic treatment is so common that those boys who do not need it may feel left out.

Bruce Lindsay

References and further reading
American Association of Orthodontists. 2001. “Orthodontics Online,” http://www.aaortho.org/ (accessed March 2001).
Clemmer, E. J., and E. W. Hayes. 1979. “Patient Cooperation in Wearing Orthodontic Headgear.” American Journal of Orthodontics 75, no. 5: 517–524.
Proffit, William R. 1993. Contemporary Orthodontics. 2d ed. St. Louis: Mosby Year Book.
Wheeler, T. T., S. P. McGorray, L. Yorkiewicz, S. D. Keeling, and C. J. King. 1994. “Orthodontic Treatment Demand and Need in Third- and Fourth-Grade Schoolchildren.” American Journal of Orthodontics and Dentofacial Orthopedics 106, no. 1: 22–33.
“Yo, It’s Time for Braces.” 2001. http://tqjunior.thinkquest.org/5029/ (accessed March 2001).
P

Parachurch Ministry
The term parachurch refers to “any spiritual ministry whose organization is not under the control or authority of a local congregation” (White 1983, 19). The phenomenon of parachurch ministries and organizations, although not unique to the United States, has been and continues to be a central defining aspect of the American religious experience, particularly among evangelical Protestants. Parachurch organizations have played an important role in the religious and moral formation of boys for most of this country’s history. These groups often work alongside churches and local congregations to gain support and funding but maintain independent structures and set their own doctrinal standards. Michael Anthony observes that “historically parachurch organizations have been on the cutting edge of ministry. They are more susceptible to change and do not have the same degree of bureaucracy associated with many church denominational structures” (Anthony 2000, 326). By design, parachurch ministries form to do specialized work that is often mission-oriented and geared toward specific subgroups. A conservative estimate puts the number of these groups at more than 10,000 in the United States alone (310). From the colonial period until the Civil War, religious instruction of boys was primarily the role of the local parish and the family. When a boy in the Plymouth Colony was sent out of the home for service or apprenticeship, the host committed to provide the youth not only with training, a stipend, and room and board but also with religious instruction (Browning 1997, 56). In colonial America, a boy like the young Thomas Jefferson, who could afford a more formal education, was sent to live with educated clergy for tutoring and religious instruction. In the southern colonies, the family was chiefly responsible for training boys and girls in religion. There the ideal was to “pray thrice daily, read scriptures at dawn and dusk or consult family members about the state of their souls.” Samuel Davies (1723–1761), a Presbyterian pastor and hymn writer, commanded his flock in a sermon to “either set up the worship of God immediately in your families or sin willfully against the knowledge of the truth” (both quotes by Davies are cited in Heimert and Miller 1967, 199). Among the southern clergy, Davies was also one of those who strongly advocated targeting young African American slaves and freed persons in his preaching. Children were not a primary focus of the colonial religious revivals of the eighteenth century, yet male youths were among those most affected by these spiritual renewals. George Whitefield (1714–1770) embarked on his
first preaching tour in Georgia in 1738 to assist in the founding of an orphanage. Jonathan Edwards’s (1703–1758) ministry in New England included attempts to educate and evangelize Native American youth. And Jonathan Parsons related the continuing effects on young people of Gilbert Tennent’s three-month preaching tour in 1744:

By the latter end of April our young people were generally sick of that vain Mirth and those foolish amusements that had been their delight and were form’d into several religious societies for prayer and reading books of piety under my direction. Many were in my study for advice and the bent of their soul was evidently towards things of another world. (Heimert and Miller 1967, 199)

The religious revivals in the period following the American Revolution came to be called the “Second Great Awakening.” One of the lasting legacies of this movement was the proliferation of voluntary societies. These groups had a variety of agendas that included social reform and the evangelization of the frontier and world. The American Sunday School Union was founded in Philadelphia in 1824. These “Sabbath schools,” as they were called, were formed to convert children outside the church and spiritually strengthen Christian young people. Prior to the emergence of public schools, the Sunday school movement provided a rudimentary education for many lower-class children. There were 70,000 Sunday school associations founded in the nineteenth century alone. Secular historical studies of the Civil War have often failed to account for the deeply religious nature of the conflict
both in the North and South. Religious impulses could inspire boys and men to enlist, fight, and die. Northern ministers and abolitionists consistently presented the northern cause as God’s will for the nation. Southern clergy equally spoke of preserving God’s preordained ways. Boys and men killed in combat in both armies were spoken of as martyrs for the cause. Religious revivals were a constant part of camp life and were particularly popular among Confederate troops. It has been estimated that between 100,000 and 200,000 boys and young men converted in the revival “camp” meetings during the war (Moorhead 1978, 70). The spiritual, educational, and physical plight of the urban poor in general and wayward boys in particular became a mission focus for American churches and independent ministries in the industrial boom following the Civil War. One of the most influential voluntary groups, the Young Men’s Christian Association (YMCA), founded in England in the 1840s, had its greatest success in the United States. Perhaps no other figure in the nineteenth century reflects the move from denominational ministries to a parachurch model better than the mass evangelist D. L. Moody (1837–1899). The greatest revivalist of his time, he worked across ecclesiastical boundaries and had a keen interest in children and youth. Moody began his evangelistic work with the YMCA, promoted the Sunday school movement, and founded the Mount Hermon School for Boys. With the help of Presbyterian and Baptist leaders, Moody founded the Student Volunteer Movement in 1886, which over the years recruited thousands of college-age men and women to overseas missions. In 1881, a Congregationalist minister founded the Christian Endeavor Movement,
a nondenominational organization formed explicitly for ministry to youth. Christian Endeavor became the prototype of youth ministry for both independent and denominational groups. Another famous group, the Boy Scouts, was founded in England in 1907 by Robert Baden-Powell and incorporated in the United States in 1910. Though not primarily a religious group, the Boy Scouts impressed upon boys the need for devotion to God, country, and church. Local church congregations often encouraged among their members the formation of independently functioning Christian Endeavor and Boy Scout groups. The Boys’ Brigade, founded in Scotland in 1883, was an explicitly Christian counterpart of the Boy Scouts, which included Bible instruction and Christian training alongside recreational and outdoor skills training. Perhaps the most famous parachurch ministry founded in the early part of the twentieth century was Roman Catholic. In 1917 Father Edward Flanagan began the work that would become Boys Town. From humble beginnings with a few boys rescued from the streets of Omaha, Nebraska, Boys Town became a national and then an international movement for the rescue of orphaned and troubled boys. Spencer Tracy’s Academy Award–winning portrayal of Father Flanagan in the 1938 movie helped Boys Town establish itself as a national institution. Today, Girls and Boys Town continues as an international advocacy group for children and families. Among the bridges between the revivalism of the late nineteenth century and the fundamentalist-modernist debates of the post–World War I period were the Bible conferences and holiness camp meetings. These conferences combined Bible study, moral exhortation, and recreation.
They also served as models for an explosion of Christian-oriented camping for children and youth throughout the twentieth century. Even as average attendance at church grew steadily after World War II, there was an increasing sense of the organized church’s failure to reach many young people, especially boys. In Texas, Jim Rayburn, a young Presbyterian minister, began Young Life in 1940. It was designed to be a Christian meeting for non-Christian high school students. The Young Life approach is characteristic of that of many parachurch ministries. It emphasizes adult staff and volunteers building nonjudgmental, caring relationships with high school students, particularly the unchurched. It presents an evangelical message emphasizing the need for a personal commitment to and relationship with Christ. Held in private homes, Young Life meetings are informal and fun, and their leaders present the Christian faith in terms adolescents can understand. Carried out with sensitivity to the culture and context of contemporary youth, Young Life meetings include music, humor, activities, and a Christian message. The organization’s commitment to excellence and Rayburn’s adage, “It’s a sin to bore a kid with the Gospel,” are exemplified in its camping program, which is one of the best in the country, religious or secular. Young Life’s relational, dynamic philosophy has proven especially appealing to many adolescent boys who were otherwise uninterested in formal religious involvement. It currently has branches in more than 550 communities and took more than 30,000 high school youth to camp in 1999 (“Where Is Young Life?” 2001). Other parachurch organizations sprang up in the 1940s to bring more young people to religion. In 1941, the U.S. chapter
of Inter-Varsity Christian Fellowship was formed, and it currently has a membership of more than 34,000 college and university students nationwide. Youth for Christ was founded in 1944. Its methodology was based on the traditional revival format and geared toward youth, and its first full-time staff person was a young Baptist preacher named Billy Graham. One of the largest parachurch organizations in the world is Campus Crusade for Christ, founded in 1951 by Bill Bright. It began as an evangelical ministry to college students and has since expanded to high school, adult, and family ministries. In the spirit of “muscular Christianity,” in 1954 a group of Pittsburgh businessmen began the Fellowship of Christian Athletes (FCA), which initially targeted male high school and college athletes. FCA sought to combine higher ideals of sportsmanship, teamwork, and physical accomplishment with Christian spirituality. In its early years it attempted particularly to counter the cultural notion that Christianity was inherently a feminine activity. FCA has since added ministry to females and has more than 6,500 chapters nationally. The turbulence of the 1960s and 1970s and the rise of the youth culture shaped the direction of parachurch activity. African American children and youth often led the way in the marches and boycotts of the civil rights movement. Beginning in the late 1960s, the rise of Christian rock and Woodstock-type religious festivals engaged thousands of youth. The counterculture “Jesus people” and the neo-Pentecostal movement directly addressed alienated youth and victims of the drug culture. Some existing parachurch ministries as well as new ones began to grapple with the complex issues of race and justice surrounding the
needs of boys and youth in the poor urban areas of the United States. The resurgent conservatism of the 1980s led to youth events and ministries organized around themes of sexual restraint outside marriage, the return to traditional values, and abstinence from drugs and alcohol. In the 1990s, the emerging Internet culture included innumerable points of contact for youth interested in spirituality outside the confines of traditional religious institutions. There is a growing appreciation for the role parachurch ministries can play among urban youth in general and boys in particular in addressing the crisis of violent youth crime. John DiIulio, one of the nation’s leading political scientists, who was appointed in 2001 to lead the Bush administration’s faith-based initiative, believes religion is the single most important strategy in countering predominantly male violent youth offenders whom he calls “super-predators” (DiIulio 1995, 27). Foundations like Public-Private Ventures and organizations like Boston’s Ten Point Coalition and DiIulio’s own Partnership for Research on Religion and At Risk Youth are leading the call for both private and governmental support for faith-based social initiatives targeting urban youth. The first decade of the twenty-first century will witness unprecedented opportunities for parachurch ministries to partner with public institutions in addressing the needs of urban and impoverished boys at risk. Parachurch groups have played an important role in educating the religious community about the unique spiritual, emotional, and social needs of boys. One of the most important legacies of youth parachurch ministries at the beginning of the twenty-first century is the resurgence of children and youth as a priority in both
Protestant and Catholic parishes. The exponential increase in staffing and programming specifically for youth and children at the congregational level since 1970 is a direct result of parachurch influences. As Ellen Charry of Princeton Theological Seminary observes, parachurch groups may be the best hope of creating from childhood through youth a Protestant center in the postmodern and postdenominational United States (2001, 453).

William L. Borror

See also Boy Scouts; Boys Town; Muscular Christianity; Sunday Schools; Young Men’s Christian Association

References and further reading
Anthony, Michael J. 2000. Foundations of Ministry: An Introduction to Christian Education for a New Generation. Grand Rapids, MI: Baker Books.
Browning, Don, ed. 1997. From Culture Wars to Common Ground: Religion and the American Family Debate. Louisville: Westminster/John Knox.
Charry, Ellen T. 2001. “Will There Be a Protestant Center?” Theology Today (January): 453–458.
DiIulio, John J., Jr. 1995. “The Coming of the Super-Predators.” The Weekly Standard (November 27): 23–27.
Heimert, Alan, and Perry Miller. 1967. The Great Awakening. Indianapolis: Bobbs-Merrill.
Marty, Martin E. 1984. Pilgrims in Their Own Land: 500 Years of Religion in America. New York: Penguin.
Moorhead, James. 1978. American Apocalypse: Yankee Protestants and the Civil War, 1860–1869. Louisville: Westminster/John Knox Press.
Noll, Mark. 1992. A History of Christianity in the United States and Canada. Grand Rapids, MI: Eerdmans.
Rayburn, Jim III. 1984. Dance Children Dance: The Story of Jim Rayburn, Founder of Young Life. Wheaton, IL: Tyndale.
Walker, Williston, Richard A. Norris, David W. Lotz, and Robert T. Handy. 1985. A History of the Christian Church. 4th ed. New York: Charles Scribner’s Sons.
“Where Is Young Life?” 2001. http://www.younglife.org.
White, Jerry. 1983. The Church and the Parachurch: An Uneasy Marriage. Portland: Multnomah Press.
Performers and Actors

Since colonial days, boys have acted in a variety of American public performance venues, including legitimate theater, minstrel shows, circuses, saloons, vaudeville, movies, and television. During the mid-nineteenth century, reformers began to investigate the effects of professional performing on children and to regulate it. Today, the entertainment industry employs thousands of boys who are governed by a complex, but not always effective, patchwork of health, education, and labor codes. The United States inherited its theater practices from England, where the tradition of using boys as actors was long established. The Hallam troupe, arriving in 1752, was the first professional theater company to establish itself in America. Actor-manager Lewis Hallam employed three of his own children, Lewis, Jr., Isabella, and John, as working members of the troupe. Lewis Hallam, Jr., later reported that he was twelve years old and too frightened to say his one line when he made his debut as a servant in The Merchant of Venice during the group’s first colonial performance. Although recent scholarship has discovered inconsistencies concerning his age, it is certain that Lewis continued to play small roles until the death of his father brought the company under new management. In 1758, Lewis Hallam, Jr., assumed the roles intended for a young, romantic male lead, playing opposite his mother, who remained the troupe’s leading lady.
Gary Coleman and Conrad Bain in Diff’rent Strokes, 1981 (Kobal Collection/NBC TV)

Lewis Hallam, Jr., is typical of many boys who performed as children; his historical fame is the result of his adult career rather than the juvenile work that formed its foundation. Until his death in 1808, Lewis Hallam, Jr., was an important figure in American theater for more than fifty years as actor, manager, and theater owner. John Howard Payne (1791–1852) has been hailed as the first infant prodigy of the American stage. When Payne made his debut in New York in 1809, critics favorably compared him to William Betty. Betty was the same age as Payne and had made a sensation as London’s first child star. Although his career had been short
(1802–1804), he had made a lasting impression, and the public was eager to welcome a successor. Audiences praised Payne for his performances in the roles of Romeo and Hamlet. However, since Payne was nearly eighteen when he began his career, he was certainly not an “infant” and may scarcely be considered a boy actor. In addition to acting, as an adult Payne wrote plays and composed the lyrics to “Home, Sweet Home!” Young boys were popular performers in all forms of nineteenth-century entertainment, especially the circus. Some worked with their families; others were loaned out to or adopted by professional performers. The possible exploitation or abuse of these children became a matter of concern to many reformers, including Elbridge T. Gerry, who founded the Society for the Prevention of Cruelty to Children (SPCC) in 1874. For the next forty years, the SPCC was the primary protector of juvenile performers. Gerry uncovered many instances of severe abuses of performing children. For example, a small boy billed as “Prince Leo” had been purchased from his parents by a circus acrobat who beat the boy continually to force him to walk the tightrope and perform dangerous tricks. Another professional acrobat who had adopted a sister and brother, aged seven and eight, burned them with hot irons and locked them in a closet when they failed to perform satisfactorily. There are documented cases of children abducted from Europe and Japan, forcibly trained as acrobats, and then sold to circus acts. “Families” of acrobats in the circus often comprised several unrelated boys who had been acquired—by sale, rental, or adoption—to supplement their “father’s” act. Because of such practices, Gerry was able to persuade legislators to establish
the nation’s first law protecting performing children. New York’s Act to Prevent and Punish Wrongs to Children was passed in 1876 and forbade the exhibition of children as singers, musicians, acrobats, gymnasts, riders, or participants in any dangerous acts. The provision against singers and musicians was most likely aimed specifically at the large numbers of Italian street performers. In the late nineteenth century, hundreds of Italian men and boys came to this country as little more than indentured servants under “padrones.” Padrones provided room, board, and employment but were often unscrupulous. They sent small sums back to Italy in payment for the children’s services but provided the boys with barely livable conditions and forced them to perform harsh labor. A great many of the boys were street musicians, forced to roam the city for eighteen or more hours a day in all weather and beg for money before returning to a small apartment to sleep on a patch of floor with ten or twelve others. One of the first triumphs of Gerry and the SPCC was the conviction of a notorious padrone named Ancarola. At the time of his arrest, he had just “imported” seven boys between the ages of nine and thirteen. One testified that he was under contract to play the violin for Ancarola for four years. Although the public applauded the SPCC in its work against circuses and street musicians, there was a different reaction when the nonexhibition law was enforced on more genteel entertainments. Juvenile casts of Gilbert and Sullivan operettas were quite popular in the 1880s and 1890s. Gerry maintained that these performances were not only against the law but also harmful to the long-term health of children. The SPCC did strive
to protect children from all types of abuse, but Gerry seems to have had a particular focus on performers; some have called it an obsession. Both modern and contemporaneous critics have questioned the validity of his extreme position. Members of the press often ridiculed Gerry and attacked him for interfering in the lives of performers in operetta and legitimate theater. The general public believed that these performers led pampered, protected lives and received training for future careers. In a period when most poor children worked to support their families, performance was seen as a less harmful alternative to labor in factories or mines. Compulsory education for all was a new idea just beginning to gain acceptance. Many people believed that some vocational training was all that most children needed, and boy actors presumably received vocational training. Beginning in 1888, a dramatic version of Little Lord Fauntleroy provided work for dozens of boy actors on the legitimate stage. “Fauntleroyism” swept the country. Major cities had their own productions, and road companies toured the show into the mid-1890s. Wallie Eddinger, Jr., age seven, starred in the West Coast production. In New York, a boy and a girl, Tommie Russell and Elsie Leslie, alternated in the title role. Russell began acting professionally at age two. According to a Pennsylvania newspaper report, the “mentally overworked” Russell had a breakdown in 1892. He later quietly pursued a career in real estate. Like Russell, the vast majority of boy actors do not maintain acting careers in their adult lives (New York SPCC, archives). During the early twentieth century, Progressive reformers raised an outcry against child labor of all types. State after state passed laws restricting child labor
and requiring education. A national debate developed concerning whether performance should be regulated under the child labor laws. The SPCC, the National Child Labor Committee, and reformers such as Jane Addams lobbied against children acting. The theater industry mobilized against them, using as spokesmen such popular performers as Francis Wilson and Joseph Jefferson, both former boy actors. Wilson, who began his career in minstrelsy and theater, was a particularly eloquent crusader for the right of boys to perform on stage. He later became the first president of the Actors Equity Association (AEA, formed in 1913) and was partly responsible for its solidarity. Professional actors, whose own income could easily depend on the presence of a child star, argued for a child’s right to work. The two sides squared off in test cases in Massachusetts, Illinois, and Louisiana in 1910, 1911, and 1912, respectively. Former boy performers of the time, including George M. Cohan, Buster Keaton, Milton Berle, and Fred Astaire, testified that they thought of the SPCC not as a protector but as an enemy trying to take the food out of their mouths. The results of the court cases were mixed. New York state ceded the regulation of child acting to the AEA, which fought for safe and sanitary conditions and better pay. It did not establish any safeguards specifically for children. In 1938, during the Great Depression, the federal government finally passed the Fair Labor Standards Act, in part to restrict child labor. However, because many officials thought that acting resembled play more than labor, the Fair Labor Standards Act exempted child performers from its provisions. When the National Child Labor Committee undertook a
brief study of child actors in 1941, it reported an extreme variation among state laws at that time. There was, and is today, no consensus on what constitutes fair labor for performing children. With the advent of film and later television came increased opportunities for work, fame, profit, and exploitation. The enormously popular Our Gang comedies of the 1920s relied on the permanent youth of their characters. As soon as an actor outgrew his part, he was replaced with a younger child. Over a span of seventeen years, 176 children belonged to Our Gang. One of the first boys to achieve national stardom was Jackie Coogan, who began work at age three and starred with Charlie Chaplin in The Kid in 1921. Coogan worked steadily and earned more than $4 million, only to discover upon reaching adulthood that his parents had spent most of the money. He sued them for his earnings and received a settlement of $126,000, one-half of all that remained of his vast fortune. Another notorious case involved Freddie Bartholomew, whose aunt brought him to Hollywood from England. After he achieved stardom in David Copperfield (1935) and other Metro-Goldwyn-Mayer (MGM) films, his parents, grandparents, and sisters filed twenty-seven lawsuits against him for a share of his earnings. Bartholomew spent so much time in court that his employer fired him for nonperformance, and legal fees consumed his savings. As a result of the attention drawn to these cases, California passed the Coogan Law (1939), which required 50 percent of a child’s earnings to be held in trust for him or her. California also instituted the position of “studio teacher,” a combination of educator and health and safety
custodian who theoretically has the power to stop production if a child’s welfare is endangered. In practice, the law provided protection only to those under long-term studio contracts, which soon ceased to exist. In addition, parents and teachers frequently cooperated with producers in evading the education and safety regulations. Since many states have few restrictions on child labor, some production companies still seek out locations specifically so that their child performers can be unregulated.

Macaulay Culkin in Home Alone, 1990 (Kobal Collection/Smetzer, Don/20th Century Fox)

During the post–World War II baby boom, television idealized the American family, providing enormous earnings and exposure for many boy actors. During the 1950s and 1960s, Jon Provost spent seven years starring on Lassie, while Jerry
Mathers, Tony Dow, and Ken Osmond appeared in Leave It to Beaver. As in vaudeville, some adult performers incorporated their own children into their acts. Lucie Arnaz and Desi Arnaz, Jr., appeared on The Lucy Show. Ricky and David Nelson joined the cast of their parents’ show, The Adventures of Ozzie and Harriet. Today, hundreds of children work in the television industry performing in shows, movies, and commercials. As in previous eras, the playlike quality of the work of acting disguises the fact that it is essentially labor. The continued existence of economic exploitation was dramatized by the experiences of Macaulay Culkin and Gary Coleman. Culkin began work at age four, making several films before starring in
the enormously popular 1990 film Home Alone and its sequel two years later. Culkin was aggressively marketed by his father, a former child and adult actor, who was able to obtain multimillion-dollar contracts for his son. Culkin’s tremendous popularity revitalized the market for child actors and raised their income levels. After Culkin earned $23 million in two years, Forbes included him in its 1993 list of richest entertainers. Although Culkin’s parents received management fees of 15 percent on his income and that of his four working siblings, by 1995 they were bankrupt. Culkin petitioned the court to authorize the release of money from his trust fund to pay their rent. Following his parents’ subsequent separation, a bitter, two-year custody battle ensued, with each parent claiming a stake in Culkin’s fortune. His mother was awarded sole custody in 1997. As a boy, Gary Coleman starred in the television series Diff’rent Strokes, but found his fortune gone when he reached adulthood. In 1994 he sued his parents in an effort to recover it. In 1991 former child actor Paul Petersen founded the nonprofit organization A Minor Consideration to support and assist former and current young performers. Based in California, A Minor Consideration has uncovered and publicized numerous abuses of the health, safety, and education statutes and has lobbied for better enforcement. Petersen has drawn attention to the physical exploitation of premature babies who are sometimes used on hospital shows to simulate newborns in order to evade minimum age limits. According to A Minor Consideration, prior to its involvement, officials routinely ignored abuses, and no film production company had ever lost its “Certificate of Eligibility
to Employ Minors.” Nor had any child ever been denied a work permit. Petersen, who was an original Mouseketeer and cast member of The Donna Reed Show, joined the Young Performers Committee of the Screen Actors Guild (SAG). In 1998, at Petersen’s instigation and under the direction of Lisa Rapport of Wayne State University, SAG sponsored the first scientific study of the psychological effects of celebrity on children. SAG also recently helped secure the passage of the first revision of the Coogan Law since 1939. The California statute is designed to provide economic protection to young performers. Effective January 1, 2000, the earnings of a child actor, musician, or athlete are, for the first time, solely his or her own.

Shauna Vey

See also Films; Jobs in the Nineteenth Century; Melodrama; Television: Domestic Comedy and Family Drama; Vaudeville

References and further reading
Brooks, Tim, and Earle Marsh. 1979. The Complete Directory to Prime Time Network TV Shows 1946–Present. New York: Ballantine.
Cary, Diana Serra. 1979. Hollywood’s Children: An Inside Account of the Child Star Era. Boston: Houghton Mifflin.
Hewitt, Barnard. 1959. Theatre U.S.A.: 1665–1957. New York: McGraw-Hill.
Myers, Robert J., and Joyce Brodowski. 2000. “Rewriting the Hallams: Research in 18th Century British and American Theatre.” Theatre Survey 41, no. 1: 1–22.
New York SPCC (New York Society for the Prevention of Cruelty to Children). Scrapbook collections in the archives contain the following clippings: On Wallie Eddinger, Jr., see New York Herald, November 1, 1892; Peoria, Illinois, Transcript, February 10, 1892; and Everybody’s Magazine, September 1, 1903. On Tommie Russell, see Tyrone, Pennsylvania, Daily Herald, January 25, 1892; New York Recorder, May 1, 1892; and New York Herald,
December 29, 1897. On Elsie Leslie, see Everybody’s Magazine, September 1, 1903; and New York World, April 10, 1910.
Petersen, Paul. 2001. “A Minor Consideration.” Gardena, CA. www.aminorcon.org (accessed March 1, 2001).
Vey, Shauna. 1998. “Protecting Childhood: The Campaign to Bar Children from Performing Professionally in New York City, 1874–1919.” Ph.D. diss., City University of New York.
Wilmeth, Don, with Tice L. Miller. 1996. Cambridge Guide to American Theatre. Cambridge: Cambridge University Press.
Zelizer, Viviana A. 1985. Pricing the Priceless Child: The Changing Social Value of Children. New York: Basic Books.
Pets

One of a boy’s closest and most significant companions may be his pet. Pet keeping is of relatively recent origin in American culture and is strongly associated with the emergence of the middle class. Today it is very common in families with children. Psychological attachment characterizes many child-pet relationships and is built on the reciprocity of interactions with the animal. Pet keeping benefits boys’ social development, mental health, and self-esteem. The costs of pet keeping include distress caused by what happens to the pet, financial costs, duties, and health risks. Abuse of animals is a societal concern, but there is potential for pets to contribute to humane attitudes. Pets offer boys in particular an outlet for nurturing behaviors. Boys and their pets must be placed in historical perspective. Today having pets is socially acceptable, a $20-billion-per-year business in the United States, but pet keeping in the form we know it today is a relatively recent development. Two hundred years ago animals were kept, but
the relationship was not one of special affection and virtual family member status. Dogs and cats were kept in sixteenth-century England for their usefulness in shepherding, ratting, hunting, and so forth. The then-extant breeds were called by names indicating their function, such as “fynder” (or water spaniel); only later were modern pedigreed breeds developed. It is easy for us to assume dogs and cats must always have been regarded with the same esteem in which we hold them, but diaries and memoirs from the 1700s do not discuss the relationship between pets and owners. William Shakespeare’s references to dogs highlight distasteful connotations such as vulgarity, subversion, and bestiality. Someone showing affection or interest in an animal was subject to ostracism and satire. Individuals who did keep pets were those protected by both wealth and rank from the economic costs and the social derision. For example, King Charles II was notorious for doting on his lapdogs (Ritvo 1987). By the eighteenth century, however, companion animals were increasing in England and, with some lag time, in America too. Books and periodicals appeared featuring dogs, especially sporting dogs. But not until well into the Victorian period did the institutions of dog fancying appear. The first formal dog show was held in 1859; the Kennel Club for owners of registered breeds of dogs was founded in 1873, and a year later came the first canine stud book to track breeding. Analogous organizations for cats appeared within a few decades. This infrastructure was accompanied by a vast Victorian literature expressing sentimental love for pets. Consistent with its origins, pet keeping by the lower classes was criticized as an indulgence at the expense of the family’s children.
A boy asleep with his dog under the covers (Bettmann/Corbis)
Two explanations of the late-eighteenth- to early-nineteenth-century increase in the acceptability of middle-class pet keeping have been offered. First, it may be linked to industrialization. Industrial technologies made it economically possible for many people to support pets. But the practices of breeders who would manipulate strains of dog by selecting for purposeless or exaggerated features reveal how pet keeping expressed industrial society’s dominance over nature. In Harriet Ritvo’s view, animals symbolically represent nature, and people cannot form affectionate ties to nature until they dominate it (see also Tuan 1984).
Second, pet keeping may have reflected the concern for character formation in the early-nineteenth-century United States (Grier 1999). An emerging middle class was charting a new ideal of family life that can be called “domesticity.” The home stood in contrast to the commercial domain and its rough pursuit of self-interest. The special mission of the domestic realm was to cultivate the countervailing virtue of gentility, which combined self-control and softened feelings. The potential for such kindness was extolled by parenting advisers in the antebellum United States as something “natural.” As agents of a sea change in American attitudes toward parenting,
they followed thinkers such as Jean-Jacques Rousseau in the view that children are innocent, good-hearted beings. Kindness to animals, in this perspective, was regarded as a foundation of virtue. In this context, masculine violence stood out as especially problematic. Public and private corporal punishment, wife beating, child abuse, and beating of animals were targets of reformers. If children were naturally good, the special proneness of boys (later to be men) to transgression needed explanation. Expressing an older idea, reformers believed that childhood cruelty had a “hardening effect.” Parents should be vigilant about any sign of boyhood cruelty to animals. Harming an insect could be a step down the slope to domestic violence. Voluminous literatures provided cautionary tales and exemplars for children; for parents, advice focused on the importance of instilling self-consciousness of the effects of one’s actions and of dealing gently with young sentiments even when correcting them. These matters affected not only the family but the moral progress of society as a whole (Grier 1999). For boys especially, pet keeping was thought to be critical for socialization in two ways. A pet in the house provided practice material for children learning to act kindly and gave mothers the “small world” where they could intervene and instruct at critical moments, such as when a child might be inclined to strike or hurt an animal. In addition, animals themselves were regarded as exemplars that could teach such virtues as gratitude, fidelity, and enduring love. Middle-class parents were encouraged to keep many different animals for their children. By the early twentieth century, these practices were rationalized by the influential psychologist G. Stanley Hall, who
founded the “child study” movement that captured the attention of many middle-class mothers. He was also inspired by Charles Darwin’s theory of evolution. In Hall’s theory of psychological development, just as the embryo repeated the earlier stages of human evolution, so too did the course of childhood represent the stages of human cultural evolution. Pets played important roles in this repetition, culminating in an interest in horses, the last step before the industrial stage. A boy without pets was, in his view, deprived of his evolutionary inheritance. Outside the home, an active dog complemented boys’ relatively greater mobility (in comparison to girls’). Boyhood adventures with dogs (including hunting) have been romanticized in American culture.

Boys and Pets Today
Knowledge of children and pets today still bears on questions of values and ethics, but it also draws on a growing body of empirical evidence. Not all of the work done has addressed or detected important differences for boys versus girls, so much of what follows applies to both sexes. According to the American Veterinary Medical Association, in 1996, 58.2 million U.S. households (58.9 percent) owned one or more pets; about half of these households owned dogs, cats, or both. The animals included 59 million cats, 52.3 million dogs, 12.6 million birds, 5.7 million rabbits, 4.8 million rodents, 3.5 million reptiles, 56 million fish, and 4 million horses. In surveys, dogs are consistently the favorite pet of about 50 percent of children, as are cats for about 30 percent. Another 20 percent favor some other group listed above (Statistics Research Group 1997). Pets are common in families with school-age children and adolescents. A
majority of parents believe pet keeping is good for children. In households with children aged eight to twelve, 75 to 90 percent have pets (Bryant 1990). The number of pets is not influenced by socioeconomic status, although rural families have more pets than urban or suburban ones. Families with more children tend to have fewer pets. Most children want pets regardless of the reasons for a family not having them, and only 3 percent of children in non-pet-owning homes have no interest in pets (Kidd and Kidd 1990). Boys and girls are equally likely to own or spend time with pets, and childhood pet ownership is a good predictor of adult ownership.

Psychological Dimensions
The concept of psychological attachment describes a quality of child-pet relationship beyond ownership. Attachment is an emotional tie that bonds one person to another or, in this case, to an animal. It endures over space and time and is expressed by such actions as spending time with the pet; showing interest in it; having positive ideas about it; holding the pet and cleaning up after it; sleeping near the pet; giving gifts; and feeling that the pet knows what the child is feeling, is a family member, and likes the child. Older children express their attachment in emotion words and in thoughts of the pet. A child who becomes attached to a pet at an early age is more likely to hold positive attitudes toward pets in adulthood (an effect that is weaker for boys than girls). In addition, loss of a pet (through death or through the pet’s being given away, abandoned, or lost) is like the loss of any other close partner. Children need the opportunity to grieve and mourn a lost pet. Important roots of attachment lie in the degree and kind of interactions that
transpire between child and pet. Human children are remarkable in the degree to which they can extend their developing social abilities flexibly across the species boundary (Myers 1998). From early infancy, boys and girls differentiate animals from both inanimates and other people. Analyses of dog-child pairs show that the child takes the most initiative in interactions but also that the dog acts as a responsive partner. Observations of children interacting with various species show them adjusting their interactive moves to accommodate the animal. Children also incorporate animals into their verbal world, talking to them, especially in times of stress. Most eleven-year-olds believe animals are capable of linguistic communication and moral responsibility. Animals’ salience is suggested by their frequent appearance in the dreams of children, up to about age eight. The patterns of interactiveness described above are analogous to other findings that show how a child’s sense of self comes about through comparisons with others and through others’ reflected appraisals of the child. The give-and-take of enduring relationships offers a child the opportunity to clarify his or her sense of self. Arguably, given the responsiveness of many pets, the child’s self is defined within a more-than-human community of others. Pets (and other animals) help deepen the sense of what it means to be human by allowing comparisons and a sense of commonality not otherwise available.

Benefits and Costs
The psychological dimensions of the child-pet relationship underlie some of its effects. Benefits include companionship, involvement in play, and feelings of closeness and warmth. Pets may affect health
positively, for example, by decreasing blood pressure. Several factors influence the social, emotional, and psychological benefits of pets, including age, gender, and especially the type of bond felt with the animal. A close bond with a pet has been found to correlate with a child’s empathy, cooperativeness, and social competence; a pet lowers anxiety, reassures, and reduces problem behaviors and withdrawal from society. Boys (but not girls) with high pet attachment are reported by teachers to do better at school. Species of animal and family structure do not appear to directly affect these benefits (Melson 2001). The use of pets in mental health treatment was pioneered in the 1960s and today includes using animals to treat special conditions, such as autism, severe learning disabilities (such as Down’s syndrome), emotional illness, and multiple disorders. Part of the success of highly interactive animals such as dogs in such therapy may be due to the animal’s ability to stimulate more interaction and initiation on the part of the child. Benefits and costs have also been studied from the child’s point of view. Four areas of benefit include mutuality (helping each other), enduring affection, self-enhancing affection (the pet makes the child feel good about him- or herself), and exclusivity (sharing private feelings). Costs children perceive include feeling distress over the pet’s death, having it given away, caring for it when it is sick or hurt, doing pet chores, being blamed for something it did wrong or for not caring for it, and worrying for its safety (Bryant 1990). Other, more objective costs include dog bites (2.8 million children were bitten in 1995); pet-related human health concerns (allergies, intestinal parasites, psittacosis or “parrot fever,” rabies, etc.); the dirt
and messiness caused by pets; financial and time burdens; and building and space restrictions. Children also express fears of animals, though fewer boys (51 percent) than girls (73 percent) do so. The most frequently feared animals across sexes are snakes, lions, spiders, tigers, dogs, crocodiles, and bears.

Humane Attitudes and Nurturance
The connections between being abused as a child, abusing animals as a child, and developing sociopathology later have received much attention in recent years (Ascione and Arkow 1998). Sporadic, nonsevere, or infrequent tormenting of an animal, however, is not uncommon and may represent things other than profound emotional disturbance, such as experimentation, lack of understanding, imitation of a role model’s behavior, cultural differences, and short-term stress. It should, however, always be taken seriously. In general, many children are disposed to show humane attitudes, that is, compassion and a positive attitude toward care and treatment of animals. Boys tend to score lower on tests of these attitudes than girls. Parental attitudes strongly affect them, but studies have inconsistently found pet ownership to do so. Humane education programs, such as those provided by the North American Association for Humane Education, have shown that it is possible for children to improve on measures of positive attitudes toward animals. Intensive programs, especially those that enhance empathy by role-playing, have the greatest effect. Of particular interest concerning boys is nurturing behavior. Until about age five, both sexes show similar degrees of interest in nurturing babies. But after that age, girls increase and boys decrease this behavior. However, boys, but not girls,
gain more over the next few years in their knowledge of the care of puppies and kittens. Baby care is associated with gender differently than is pet care, so that by the elementary years pet care is an especially important avenue for the expression of nurturance for some boys (Melson 2001).

Gene Myers

References and further reading
Ascione, Frank, and Phil Arkow, eds. 1998. Child Abuse, Domestic Violence and Animal Abuse. West Lafayette, IN: Purdue University Press.
Bryant, Brenda. 1990. “The Richness of the Child-Pet Relationship.” Anthrozoös 3, no. 4: 253–261.
Grier, Katherine C. 1999. “Childhood Socialization and Companion Animals: United States, 1820–1870.” Society and Animals 7, no. 2: 95–120.
Kidd, A., and R. Kidd. 1990. “Social and Environmental Influences on Children’s Attitudes toward Pets.” Psychological Reports 67: 807–818.
Melson, G. 2001. Why the Wild Things Are. Cambridge, MA: Harvard University Press.
Myers, Gene. 1998. Children and Animals. Boulder, CO: Westview Press.
Podbersek, A., Elizabeth Paul, and James Serpell, eds. 2000. Companion Animals and Us. Cambridge: Cambridge University Press.
Ritvo, Harriet. 1987. The Animal Estate. Cambridge, MA: Harvard University Press.
Serpell, James. 1986. In the Company of Animals. Oxford: Basil Blackwell.
Statistics Research Group. 1997. U.S. Pet Ownership and Demographics Sourcebook. Schaumburg, IL: American Veterinary Medical Association.
Tuan, Yi-Fu. 1984. Dominance and Affection: The Making of Pets. New Haven: Yale University Press.
Photographs by Lewis Hine

Lewis Wickes Hine (1874–1940), a Progressive-era social reformer, photographed children at work for the National Child Labor Committee (NCLC) between 1906 and 1918.
His documentary photography was part of a larger effort of the NCLC to educate the public about the dangerous and immoral conditions under which American children worked in mills, agriculture, canneries, mines, glass factories, tenement sweatshops, and a variety of street trades. The NCLC and Hine had as their ultimate goal passage of a national child labor bill, although they also worked for the passage and enforcement of state child labor legislation, especially in the South. Hine had a particular interest in the conditions faced by immigrant children and paid special attention to recording the ages of working children to document widespread abuse of existing state laws. His photographs thus provide images of working-class and rural boys of specific ages. They document not only the boys’ clothing, general appearance, and physical surroundings but also the often dangerous jobs that comprised the working lives of a less-than-carefree boyhood, the normal experience of many boys at the beginning of the twentieth century. Born the son of an owner of a coffee shop in Oshkosh, Wisconsin, Hine came to New York City in 1901, after studying briefly at the University of Chicago. He took classes in sociology at Columbia University and eventually earned a master’s degree in education at New York University. From 1901 until 1908, he taught botany and nature studies at the Ethical Culture School in New York. At some point during his teaching, he taught himself to use a camera. In 1905, while studying the social conditions of the poor, he was hired to photograph immigrants arriving at Ellis Island. His photographs graphically illustrated the poverty of the new arrivals, as well as their fears, their hopes,
and the dignity with which they withstood the roughshod bureaucratic treatment to which men and women and boys and girls were subjected. Shortly thereafter he completed the “Pittsburgh Survey,” a photographic investigation of the working and living conditions of men and boys employed in the steel industry of that city. From 1906 onward, images from both the Ellis Island and Pittsburgh photographic projects were published in Charities and the Commons (later renamed The Survey), a weekly magazine published in New York by social reformers. Eventually such other Progressive-era magazines as McClure’s also bought and published Hine’s photographs.
Widespread publication of his dramatic images brought Hine to the attention of the National Child Labor Committee, which hired him on a part-time basis in 1906 and 1907 to photograph families with children doing piecework in the tenements of New York. The NCLC had been formed in 1904 to gather information, publish reports, and mobilize public opinion in order to pressure individual states and the federal government to pass child labor legislation. In 1908 Hine’s job became a full-time one; for $100 a month and expenses he traveled to Ohio, Indiana, and West Virginia to photograph conditions under which children worked in coal mines and glass factories. The next year he carried out photographic investigative assignments in textile mills in New England and Georgia and photographed immigrant children at work in the shrimp- and oyster-packing industry of the Gulf states. By the time he left the NCLC in 1918 in a dispute over salary, Hine had traveled thousands of miles throughout the southern, midwestern, mid-Atlantic, and New England states, producing thousands of photographs that
the NCLC used extensively in its posters and reports and as the basis for exhibitions at its annual meetings while also making them available to newspapers and other publications.
“Day Scene. Wheaton Glass Works. Boy is Howard Lee. His mother showed me the family record in Bible, which gave birth July 15, 1894. 15 years old now but has been in glass works two years and some nights. Started at 13 years old. Millvill, N.J. Nov. 1909.” (Library of Congress, Hine caption)
In his captions Hine made it clear that he was concerned about the impact of work before the legally permitted age of fourteen on all aspects of the lives and development of boys (and of girls as well). The kind of jobs most boys did paid them little and did almost nothing to teach
them the value of work while surrounding them with unhealthy and inappropriate conditions that limited their physical, educational, intellectual, and moral development. In poster after poster Hine charged that child labor was making boys “human junk” (Hine 1915; Guimond 1991, 82; Kemp 1986, 10).
Using crude flash photography in underground coal mines, Hine captured the grimy faces and stunted physical development of the young boys who worked as “greasers,” “breakers,” and “couplers” 500 or more feet below the surface. He wrote in outrage of the unhealthy conditions in the mine faced by a boy named Willie:
Waiting all alone in the dark for a trip to come through. It was so damp that Willie said he had to be doctoring all the time for his cough. A short distance from there the gas was pouring in so rapidly that it made a great torch when the foreman lit it. Willie has been working here for 4 months. . . . Jan. 16 I found Willie at home sick. His mother admitted he is only 13 years old. (Caption for National Archives and Records Administration photograph, negative 1920, S. Pittston, PA, January 7, 1911)
“Manuel the young shrimp picker, 5 years old and a mountain of child labor, oyster shells can be seen behind him. He worked last year & understands not a word of English. Biloxi, Miss. Feb. 20, 1911.” (Library of Congress, Hine caption)
Although Hine believed that the worst offenders against child labor laws were textile mills, he objected most strongly to the moral risks of urban youngsters serving as “newsies,” messengers, and delivery boys. Moving in and out of the worst districts of the cities, according to Hine, these boys carried messages to drug dealers and prostitutes, hung out in pool halls, gambled and smoked in alleyways and streets, and sometimes experimented with drugs and sex at a young age. Yet the strength of Hine’s work lies in his respect for the children he photographed. He usually avoided taking candid (unposed) shots of boys (and girls) and instead encouraged them to look straight into the camera lens. Thus Hine allowed the boys in his photographs to construct their own identities, and they often smile. Hine’s photographs reveal that boys who labored were determined and fighting to
retain their dignity in the face of exploitive working conditions. They were not simply victims. They were actors in their own right.
“View of the Ewen Breaker of the Pa. Coal co. The dust was so dense at times as to obscure the view. This dust penetrated the utmost recesses of the boys’ lungs. A kind of slave-driver sometimes stands over the boys, prodding or kicking them into obedience. S. Pittston, Pa. Jan. 10, 1911.” (Library of Congress, Hine caption)
Constance B. Schulz
See also Jobs in the Twentieth Century; Newsboys
References and further reading
Major collections of Lewis Hine photographs are located at the Library of Congress; the National Archives and Records Administration (NARA); the George Eastman House in Rochester, New York; and the University of Maryland Baltimore County Library.
Guimond, James. 1991. American Photography and the American Dream. Chapel Hill: University of North Carolina Press.
Gutman, Judith Mara. 1967. Lewis W. Hine and the American Social Conscience. New York: Walker.
———. 1974. Lewis Hine 1874–1940: Two Perspectives. New York: Grossman.
Hine, Lewis. 1915. “The High Cost of Child Labor.” Brochure. Washington, DC: Library of Congress.
Kemp, John R., ed. 1986. Lewis Hine Photographs of Child Labor in the New South. Jackson: University Press of Mississippi.
Rosenblum, Walter, Naomi Rosenblum, and Alan Trachtenberg. 1977. America and Lewis Hine: Photographs 1904–1940. Millerton, NY: Aperture.
Trattner, Walter I. 1970. Crusade for the Children: A History of the National Child Labor Committee and Child Labor Reform in America. Chicago: Quadrangle Books.
Westbrook, Robert. 1987. “Lewis Hine and the Two Faces of Progressive Photography.” Tikkun 2 (April–May): 24–29. Reprinted in Leon Fink, ed. 2001. Major Problems in the Gilded Age and Progressive Era. 2d ed. Boston: Houghton Mifflin.
Placing Out
Although placing out (a method of placing boys and girls in families other than their biological ones) had antecedents in the colonial era, it became especially important in the nineteenth century as a method of removing impoverished children from their urban family homes and placing them in farm family homes. The method was first widely employed by Charles Loring Brace’s New York Children’s Aid Society (CAS), and later child welfare agencies in New York and other cities placed out as well. Placing out meant removing needy city children from their natural families and sending them by train to farm communities in the East and Midwest, where they were usually expected to work for the families with whom they lived. The children were often from immigrant families, and virtually all of them were white. Some boys and girls profited from this exchange, but others suffered from it. Gradually, in the early twentieth century, placing out was replaced by foster care.
In colonial America, children were placed by their parents or local authorities in a family other than their own for a variety of reasons. Boys were placed under indenture to master craftsmen to learn the skills of a trade, but many parents also hired out their boys for wages, usually to farmers as agricultural labor. In cases of illness, even very young children were
sent to live with families recommended for their ability to achieve a cure. Sometimes unruly boys were sent to live under the governance of another family. And during times of family crisis, such as extreme poverty or the illness, imprisonment, or death of a parent, relatives or local authorities would send a child to live temporarily in another family or to be informally adopted. Although historians long have speculated about the reasons for placing out children, Helena Wall (1990) has argued that economic considerations usually underlay this widespread practice. Families taking in children were expected to provide parental care for them, but they also received compensation, sometimes from local authorities but especially from the labor of boys during their childhood years. Placing out practices changed in the nineteenth century with the growth of immigration, industrialization, and urban poverty. In the 1830s, 1840s, and 1850s, New York and other East Coast cities attracted large numbers of immigrants, mostly from Ireland and Germany. Many arrived in the city penniless and ended up living in overcrowded apartments with relatives. Immigrant fathers found it difficult to find jobs that paid enough to support their families. Mothers worked full-time keeping apartments clean, finding enough for everyone to eat, and caring for young children. By the time their sons were old enough to work, sometime between the ages of six and twelve, depending on the mental and physical maturity of the boy, needy parents sent them into city streets to earn money doing errands and odd jobs and selling newspapers. Parents expected that sons would repay their families for child care in infancy and early childhood by going to work and turning most of what
they earned over to their mothers and fathers. Such working boys spent much of each day without parental supervision. Some even slept out in city streets. At the same time, middle-class fathers who earned good money in business or the professions could afford to support their families in large city or suburban homes. Middle-class mothers had servants to help them with housekeeping and child care. Such prosperous families could afford to keep their children in school and never let their sons (or daughters) run about city streets or seek employment before their late teens. To the middle class, the influx of poor immigrant families to U.S. cities was alarming. They worried especially about needy children who, if they grew up unsupervised and uneducated, might never emerge from poverty and eventually turn to crime. Middle-class men and women found a partial solution to the problem of urban child poverty in the placing-out program of the CAS, the agency founded by Charles Loring Brace in 1853. Brace came from a middle-class New England family and was educated as a minister. As a young man he traveled in England, where he observed how poor children were placed out in Canada and Australia to help populate the British Empire. He also visited Germany, where the “Friends in Need” program placed vagrant city children with rural families. When Brace began his ministry in New York, he was alarmed by the large number of boys who worked and seemingly lived in the city’s streets. He and other ministers tried to preach to the boys about Christian values, but the boys were unruly, often threw stones and yelled at the pastors, or ran and fought among the benches set out for the meetings. This experience led Brace to seek another method of reforming
needy urban youths. With the formation of the CAS, he sought financial support from other concerned members of the middle class to “save” the city’s impoverished children. In 1854, Brace began his placing-out program of removing children of working age from their city families and placing such children in farm families. Brace admired enterprising city boys who were street-smart and hardworking. However, he believed there was little future for them in cities, which were both unhealthy places in which to live and areas where there was so much social stratification that poor boys might never rise above the working class and obtain well-paying jobs. Brace, like other members of the middle class of his day, believed that living on a farm was healthier than living in a city and that the countryside was a place where it was easier for boys to overcome class barriers and rise up the social scale. And although Brace admired the boys and hoped to provide them with a better future, he was critical of their families, who were usually immigrant and poor. He felt separating enterprising, needy boys from such families was beneficial to the boys. Of course, placing out was not an entirely original program. In addition to its European antecedents, there was in the United States the system long employed by public welfare authorities of indenturing impoverished orphaned children to families that would provide them room and board and some education in return for their labor. Orphanages and reformatories also indentured children to work after they had spent a year or more in institutional care. However, Brace’s placing-out program was unique in that it did not involve formal indenture agreements between the CAS and farm families willing
to take in children. Brace had enormous confidence in the goodwill of farm families and believed the protection afforded by a formal indenture contract, promising room and board and education for children placed on farms, was unnecessary. He also believed that indenturing made it difficult for the CAS to remove children from poor homes and that it prevented enterprising boys from leaving homes they did not like and moving on to better ones. Agents of the CAS walked through the poorer neighborhoods in New York City, recruiting children to be placed out on farms either in eastern states or in the Midwest. Parents were most likely to relinquish boys in their early teens who were old enough to work but who had not yet found well-paying jobs. Sixty percent of the children placed out by the CAS in its early years were male. Single mothers who could not support their families adequately on the low wages then paid women workers were especially willing to let their nonworking sons be placed out. Many boys were themselves eager to leave the city and seek employment and possibly adventure in the West. The agency recruited white boys and girls almost exclusively. It refused African American children, even though most were poor. The agency argued that in the rural Midwest, where few blacks lived, it was difficult to find farm families willing to take in nonwhite children. The construction of railroads between the East and Midwest made it possible for the CAS to take children some distance from the city to rural areas where laborers were scarce. Sometimes the railroads provided the CAS with discounted tickets for children on their way to being placed out.
Groups of twenty to thirty children, supervised by an agent of the CAS, embarked on an “orphan train” from New York City. When the CAS first began placing out, once the train reached what the agent believed to be a likely rural town, he ordered the children to disembark. The agent then contacted the mayor or other town officials and told them he had a group of city children in need of good farm homes. Town officials then called a meeting of families living in the area and lined the children up in front of them. The families then were invited to choose a child to take home with them. Sometimes farmers examined the children’s teeth or felt their muscles. Later the CAS sent agents out in advance to towns to line up farm families for children, and in some communities committees formed to prepare for the children’s arrival. The chief requirement the CAS had of prospective parents was that they be Christian. The agency placed children in both two-parent and single-parent families. If not all the children were chosen by families in a particular town, the agent set off on the train with the remaining youngsters and stopped at another town to find them homes. The CAS was so confident that all would go well with youngsters placed out in farm families that at first it had no method of checking up on boys and girls, except to have the children write the agency periodically. Eventually, the CAS sent agents to visit boys and girls placed out, but because the children lived on scattered farms, it was difficult for an agent to visit most youngsters more than once a year. Many children were probably treated well, but others may have been punished harshly or abused. Older boys had the best chance of leaving abusive
families. Boys usually worked in the fields, often without much supervision, so escape was fairly easy. When there was little work to do on farms in the winter months, boys were free to attend school, where they might meet other local youths who could tell them of better families in which to live or of paying jobs for which they might be qualified. Children who were placed out were not always permanently separated from their natural families. The CAS informed parents where their youngsters were placed, and although the agency did not publicize it, of the 69 percent of boys and girls who lived with their families before placement in 1853 and 1854, 63 percent returned to those families after living on farms for a year or two. For many needy families, placing out was a temporary expedient: a method of providing for children when the parent or parents were unable to do so because of unemployment, illness, or some other family emergency. Brace’s Children’s Aid Society placed out an enormous number of children: 60,000 between 1854 and 1884. It was also widely imitated: by 1867 there were fifty similar agencies placing out children in cities across the United States. Such a large and unique child welfare program inevitably attracted criticism. Roman Catholics opposed placing out by the CAS because its founder, Brace, and most of its financial supporters were Protestant. Catholics argued that the agency deliberately removed poor Catholic youngsters from their natural family homes and placed them in rural, Protestant homes where the children might very well forget Catholicism and be converted to a different faith. Catholics were not so much opposed to placing out as they were against placing out by an aggressively
Protestant child welfare agency. Eventually, Catholics formed their own placing-out societies, such as the New York Foundling Hospital and the Boston Home for Destitute Catholic Children. Orphan asylum managers were also very critical of the CAS. They argued that it was a mistake to take children directly from city streets and place them in farm family homes. Boys and girls from desperately poor urban families needed some education and some disciplinary training before they were ready to live with respectable, more prosperous families than their own. Again, orphanage officials did not so much object to placing out as they did to placing out youngsters without first retraining them in institutions. Finally, as the number of poor children who left eastern cities on orphan trains grew, states in the Midwest began to object. They argued that easterners were emptying cities of juvenile delinquents and paupers and dumping them in rural areas, where they might very well turn to lives of crime or become dependent on rural communities for public assistance. Ironically, even as some midwestern states objected to children from the East being placed within their borders, these same states were placing poor children even farther west. Thus, in the late nineteenth century, the New York Children’s Aid Society still placed some children in rural Ohio, while agencies in Cleveland and Cincinnati were sending youngsters west to Indiana. By the 1890s, Indiana agencies were placing youngsters farther afield in Nebraska. The CAS responded to its critics in various ways. The agency tried to be more careful about placing sons and daughters of Catholic parents in Catholic rural homes. However, the CAS never accepted
the idea that children needed to be reeducated in orphanages before being placed out. Brace believed that any length of stay in an orphan asylum disadvantaged youngsters by isolating them from the real world of human feeling and familial emotion. He believed orphanage life turned boys and girls into automatons who knew only how to march about in lockstep. As for his midwestern critics, Brace responded to them by placing fewer children in states where there was strong objection to such placement.
Placing out continued into the twentieth century, and the New York Children’s Aid Society did not end the practice until 1929. However, as professional social workers entered the field of child welfare, they became concerned about haphazard methods of selecting families in which to place needy children and about families willing to take in youngsters mainly for the labor they provided. Eventually, placing out was transformed into foster care.
Priscilla Ferguson Clement
See also Apprenticeship; Foster Care; Indentured Servants; Orphanages
References and further reading
Ashby, LeRoy. 1997. Endangered Children: Dependency, Neglect, and Abuse in American History. New York: Twayne Publishers.
Bellingham, Bruce. 1984. “‘Little Wanderers’: A Socio-Historical Study of the Nineteenth Century Origins of Child Fostering and Adoption Reform, Based on Early Records of the New York Children’s Aid Society.” Ph.D. diss., University of Pennsylvania.
Brace, Charles Loring. 1872. The Dangerous Classes of New York and Twenty Years’ Work among Them. New York: Wynkoop and Hallenbeck.
Holloran, Peter. 1989. Boston’s Wayward Children: Social Services for Homeless Children, 1830–1930. Rutherford, NJ: Fairleigh Dickinson University Press.
Holt, Marilyn. 1992. The Orphan Trains: Placing Out in America. Lincoln: University of Nebraska Press.
Wall, Helena. 1990. Fierce Communion: Family and Community in Early America. Cambridge, MA: Harvard University Press.
Plantations
From the colonial period through the Civil War, boys in the American rural South grew up on plantations that ranged in size from small farms to vast estates. As Virginia was settled after 1607 and Maryland after 1632, family formation was impeded by high mortality. Surviving sons of planters labored with British indentured servants in tobacco fields surrounding the Chesapeake Bay and its tributaries. Not until the 1680s did African slaves significantly begin to replace indentured servants as the region’s primary labor force. By the 1740s in both the Chesapeake area and Carolina (settled after 1663), wealth accumulated through ownership of land and slaves produced a small class of well-to-do families, whose sons enjoyed privileges of education and a genteel lifestyle. After the American Revolution, residents of Virginia and Maryland carried this plantation system based on slave labor into Kentucky and Tennessee, and cotton production spread rapidly into the backcountry of South Carolina and Georgia. Sons of slave owners and enslaved boys experienced this early-nineteenth-century westward movement, as families rushed into Alabama, Mississippi, Louisiana, and Texas in search of profit from the cotton boom. Privileged white boys on large plantations enjoyed the freedom of country life, often chafing at their parents’ efforts to curtail and educate them, whereas sons of yeomen on
smaller farms worked the fields under the direction of a patriarchal father. Yet all of them grew up living in close proximity to enslaved boys, with whom they played in plantation yards, roamed the woods and streams, and labored in tobacco and cotton fields.
A white boy plays with slave children on a southern plantation. (Library of Congress)
In the mid-seventeenth century, when yeoman farmers constituted about half the population in Maryland and Virginia, a plantation was essentially a farm of 200–300 acres, of which no land was fully cleared and no more than 50 acres would be cultivated with corn and tobacco,
planted in small hills among stumps of girdled trees. Because of the prevalence of malaria, dysentery, and typhoid fever, boys grew up in severely disrupted families in which infant and child mortality was high and a majority of children could expect to lose one or both parents. The result was often blended families, as spouses remarried and children of one partner encountered new siblings in the children of the other. Because planters invested scarce resources in land and imported servants, these families lived in one- or two-room impermanent wooden structures, built directly on the swampy ground without brick or stone foundations. Family members cooked, ate, worked, and slept in the hall, or one room with a large hearth, and older boys and servants slept in a loft that was also used for storage. Boys were raised on dried Indian corn made into bread or boiled and supplemented with pork, beef, wild game, dairy products, poultry, eggs, and fruit from a planted orchard. Although boys received some education and were taught to read and perhaps to write and keep accounts, all family members engaged in fieldwork. While younger boys pounded corn, fetched wood and water, and rounded up livestock, older boys participated in the endless round of activity—setting out plants, hoeing weeds, picking leaves and hanging them up to dry, and packing dried leaves in hogsheads—that tobacco cultivation required. Frequently these boys worked with native-born or British indentured servants who were no more than boys themselves, purchased as half hands at the age of twelve or full hands at age sixteen. Servants were under the direction of masters for their four- to seven-year terms but eventually would be free to profit from the skills they had acquired. Because fathers often died when their sons were
young, some boys became masters themselves. If the plantation lacked sufficient income to support them, orphans were bound out to other families by the county court, but many fathers provided in their wills for guardians to manage property until sons reached age eighteen and could receive the inheritance they began to manage by themselves. Although most seventeenth-century yeomen were British immigrants, many of whom had arrived as indentured servants, an occasional African, such as Anthony Johnson on Virginia’s Eastern Shore, owned a plantation. In the early years the status of Africans was rather fluid: some may have been indentured servants, and a man like Johnson, who had been a slave, could work to buy his freedom. His son Richard would marry a white woman, sire four children, and inherit property (Breen and Innes 1980). The status of Africans deteriorated, however, by the 1680s as the price of tobacco plummeted, indentured servants became more difficult to obtain, and planters with capital began to purchase slaves. Over 50,000 slaves were brought to the Chesapeake area in the first forty years of the eighteenth century (Kulikoff 1986, 320); because merchants preferred cargoes that were “Boys & Girls of about 15 or 16 years of Age, of which 2/3 Boys & 1/3 Girls,” many of them must have been male teenagers. When the price of tobacco revived in the 1710s, planters with an adequate labor force were able to profit and to buy more land and slaves. The result was the growth of a class-conscious native-born gentry, able to import British consumer goods and to live a leisured, genteel lifestyle. Although yeoman and tenant families still worked the fields much as they had in the seventeenth century, boys on plantations
could be sons of wealthy planters, born to privilege and expected to assume social and political leadership. By 1732 Robert “King” Carter owned 333,000 acres of land, divided on his death among five surviving sons, four of whom he had sent to England to be educated. His grandson, Robert III, whose father had died when he was four years old, grew up in the large household assembled when his mother remarried a widower with several children. But his uncles held his inherited lands in trust and managed them until he reached the age of twenty-one. Five years later Robert Carter married sixteen-year-old Frances Anne Tasker of a wealthy Baltimore family and settled down at Nomini Hall and Williamsburg to manage his 70,000 acres of land and 500 slaves. Of seventeen children born to the Carters, twelve survived infancy, four sons and eight daughters who grew up in a Georgian brick house covered with white lime, surrounded by thirty-two dependent buildings. In 1773, when Benjamin was eighteen, Robert, sixteen, and John, four, Carter hired a Princeton graduate, Philip Fithian, to be their tutor and live in the schoolhouse with the older boys. Fithian found the boys “in perfect subjection to their parents,” and “kind and complaisant to the servants who constantly attend them” (Farish 1957, 26). Like other children of the Virginia gentry, they were trained by a dancing master who gathered neighboring children for two-day lessons at a different plantation each time he visited. Parents often joined these occasions, and after watching minuets precisely executed by the children, the entire group delighted in high-spirited reels and country dances. But the Carter boys reveled in the freedom of plantation life, bursting forth when freed from studying Latin,
Greek, and math to engage in their favorite pursuits—Ben to ride his horse, Bob to hunt or fish by the river, and their cousin Harry, who joined them in school, to hang around skilled slaves at the blacksmith and carpenter shops. Fithian was somewhat surprised at the pleasure Bob found in the company of slaves and sons of yeomen, “persons much below his Family and Estate” (Farish 1957, 48). And both Bob and Ben, who could quake in the presence of their patriarchal father, were quick to respond with fisticuffs. Ben told his tutor that two persons in a dispute should fight it out manfully in order to be friends again, whereas Bob was “volatile & unsettled in his temper” and often in trouble for coming to blows, even with his sister Nancy (Farish 1957, 48). Neither boy, in fact, fulfilled his father’s hopes for them. Ben died, perhaps from tuberculosis, at the age of twenty-two, and Bob, on a sojourn to England, was killed in a brawl outside a London coffeehouse.
Thomas Jefferson, of the same generation as Ben and Bob Carter, grew up to exemplify the genteel sensibility and political leadership expected of sons on Virginia plantations. Mindful of John Locke’s belief that personality was formed by environmental influences, he came to fear that boys in a slave society would not acquire the self-discipline he so admired. “There must doubtless be an unhappy influence on the manners of our people produced by the existence of slavery among us,” he wrote in Notes on Virginia in 1781.
The whole commerce between master and slave is a perpetual exercise of the most boisterous passions, the most unremitting despotism on the one part, and degrading submissions on
the other. Our children see this, and learn to imitate it; for man is an imitative animal. This quality is the germ of all education in him. From his cradle to his grave he is learning to do what he sees others do. . . . The parent storms, the child looks on, catches the lineaments of wrath, puts on the same airs in the circle of smaller slaves, gives a loose to the worst of passions, and thus nursed, educated, and daily exercised in tyranny, cannot but be stamped by it with odious peculiarities. The man must be a prodigy who can retain his manners and morals undepraved by such circumstances. (Jefferson 1944, 278) Robert Carter III was renowned as a humane master and was wealthy enough to free his 509 slaves. But other parents of the Virginia gentry, including Jefferson himself, would be trapped by debt and the institution of slavery, as their efforts to instill genteel restraint in their children were undermined not only by their own indulgence but also by plantation life. After the American Revolution, cotton joined rice as the staple crop on plantations in the Low Country of South Carolina and Georgia. In an area where slaves could outnumber whites by 9 to 1 and a labor force number 100 to 1,000 slaves, plantation boys grew up deeply influenced by the Creole language and culture known as Gullah. It was said of young Benjamin Allston of All Saints Parish, between the Waccamaw River and the sea, that he spoke like a slave, “not only in pronunciation, but even in tone” (Joyner 1984, 208). White and black families shared the deeply rural rhythms of plantation life in cycles of birth, growth, sickness, and death. Boys on Low Country plantations suffered not only from the childhood diseases—measles,
mumps, whooping cough, scarlet fever, croup, and colds—but also from the fevers that prevailed in late summer and early fall—malaria and sometimes yellow fever. Families who could afford it fled to Charleston, Savannah, or smaller towns or to encampments of rustic log or clapboard houses in the pine barrens. A planter such as Thomas Chaplin, who owned 376 acres and about thirty slaves on St. Helena Island in the 1840s, struggled to raise the needed funds to house his four surviving children in St. Helena Village for the summer. Public education did not take hold in the rural South, and he also found it staggering to send his sons, Ernest, Daniel, and Eugene, to private boarding schools. The Chaplin boys lived in a two-story, six-room clapboard house, and grew up riding horseback, hunting, boating, and fishing with young slaves as their companions. Ernest was not sent to school until he was eleven and a year later still could not write. When Daniel, Eugene, and their sister Virginia also went off to school, their father was hard-pressed for cash, which he and other planters tried to raise by occasionally selling off a valuable slave child (Rosengarten 1986). By the 1820s and boom years of the 1830s, South Carolina planters looked to fertile western lands to improve their fortunes, and the state joined the Chesapeake region in exporting slaves. After Indian lands were surrendered through conquest or treaty, white families moved into Alabama, Mississippi, Louisiana, and Texas. Migration removed boys from grandparents and the thick web of uncles, aunts, and cousins who had contributed to their socialization on the eastern seaboard. Members of nuclear families became more isolated and dependent on each other. Men embraced
individualistic, competitive, and risk-taking behavior, and boys lacked the social graces, deference to authority, and perhaps emotional stability they may have gained in the East. Even a privileged boy like twelve-year-old Nicholas (Azby) Destrehan, who grew up on a sugar plantation near New Orleans in 1845, persisted in his own pursuits and resisted his father’s authority. Azby spent his days hanging around the sugar house, fascinated by the machinery. He also cultivated his garden and enjoyed his horse, dog, gun, ducks and geese, and fishing tackle and little boat. When his father tried to instruct him “in reading and figuring,” he admitted later, “I felt miserable at being called from my little amusements, I would sometimes get right mad and often damned my father . . . but this fit of madness only lasted during the time I was occupied at study; for as soon as I left the room I was as happy as ever” (Destrehan 1850). Although in the 1850s the plantation economy boomed from the Low Country to Texas, many of these southern boys would soon find themselves serving in the Confederate Army, an event that would destroy their unique boyhood forever.
Jacqueline S. Reinier
See also Indentured Servants; Slave Trade; Slavery
References and further reading
Breen, T. H., and Stephen Innes. 1980. “Myne Owne Ground”: Race and Freedom on Virginia’s Eastern Shore, 1640–1676. New York: Oxford University Press.
Carr, Lois Green, Russell R. Menard, and Lorena S. Walsh. 1991. Robert Cole’s World: Agriculture and Society in Early Maryland. Chapel Hill: University of North Carolina Press.
Cashin, Joan E. 1991. A Family Venture: Men and Women on the Southern Frontier. New York: Oxford University Press.
Destrehan, Nicholas A. 1850. “Memoirs” in “Letter Book.” Historic New Orleans Collection, New Orleans, LA.
Farish, Hunter Dickinson, ed. 1957. Journal and Letters of Philip Vickers Fithian, 1773–1774: A Plantation Tutor of the Old Dominion. Williamsburg, VA: Colonial Williamsburg.
Jefferson, Thomas. 1944. Notes on Virginia. First published in 1784. In The Life and Selected Writings of Thomas Jefferson. Edited by Adrienne Koch and William Peden. New York: Modern Library.
Joyner, Charles. 1984. Down by the Riverside: A South Carolina Slave Community. Urbana: University of Illinois Press.
Kulikoff, Allan. 1986. Tobacco and Slaves: The Development of Southern Cultures in the Chesapeake, 1680–1800. Chapel Hill: University of North Carolina Press.
Reinier, Jacqueline. 1996. From Virtue to Character: American Childhood, 1775–1850. New York: Twayne Publishers.
Rosengarten, Theodore, ed. 1986. Tombee: Portrait of a Cotton Planter. New York: Quill Press.
Play
See Games; Toys
Pokémon
See Gambling; Toys
Poliomyelitis
Poliomyelitis (usually referred to as polio) is a disease of the central nervous system caused by the poliomyelitis virus. It results in inflammation of the gray matter of the spinal cord, which in turn may lead to paralysis or wasting of muscles. Until the appearance of the first polio vaccine in the mid-1950s, polio was a serious health problem. Although it
was not one of the major killers, treatments had only limited effect, and those who survived the acute phase of illness could be left with serious lifelong disability. For American boys, polio once represented a threat not only to life but also to ambition and livelihood. However, it is also a disease that can be prevented by vaccination. In the United States, the nationwide vaccination program has been so successful that polio has been completely eliminated.
A young polio victim reads a comic book attached to the rim of his iron lung, ca. 1955. (Hulton Deutsch Collection/Corbis)
Polio has been known for thousands of years. Hippocrates and Galen both described the effects of a disease that was probably polio, and the first modern description of polio was given by an Italian surgeon, Monteggia, in 1813. Polio was
endemic in pre–Industrial Revolution societies, a common infection of childhood but one that was rarely serious and that infrequently resulted in paralysis. Poor sanitation and hygiene meant that children were exposed to the poliovirus in infancy by the oral-fecal route, developed mild and often asymptomatic infection, and acquired immunity as a result. By the late nineteenth century, improved sanitation meant that this early exposure and immunity were lost. The age at which people became infected rose, and the effects of the disease worsened. In addition, polio changed from an endemic infection to an epidemic disease, often occurring in the summer months. In the United States statistical evidence about the national incidence of polio, or “infantile paralysis” as it was also known, was not collected separately from evidence about other diseases until 1909. In that year polio was responsible for 2 deaths per 100,000 of the population. In comparison, pertussis (whooping cough) killed 22 per 100,000, and tuberculosis killed around 190 per 100,000. Despite its relatively low death rate, polio was much feared during the first half of the twentieth century. This fear may have been due in part to the outward signs of polio evident in many children and adults, such as leg braces or calipers, which served as reminders of polio’s lifelong disabling effects and severely restricted boys’ activities. It may also have resulted from an awareness that polio was a disease of the affluent as well as the poor. Physical disabilities prevented boys from entering active adult jobs in agricultural or industrial settings, and boys with ambitions to enter professions such as law could find that long periods of hospitalization and rehabilitation would prevent them from gaining the education necessary to enter such professions.
The early decades of the twentieth century saw regular and frequent polio epidemics, including the major epidemic of 1916 centered in New York and the northeastern United States, which led to a sharp increase in polio mortality. Although mortality from many other infectious diseases fell markedly during the first half of the twentieth century, that for polio fell only marginally, and indeed the death rate began to rise again from the early 1940s. In the years from 1951 to 1954 (immediately prior to the first polio vaccine), an average of 16,316 cases of polio were reported annually in the United States, with an average of 1,879 deaths. By contrast, more than 500,000 cases of measles were reported annually in the years from 1958 to 1962 (immediately prior to the introduction of the measles vaccine), but the average number of deaths was only 432. Before the development of an effective polio vaccine, other forms of prevention were tried. Summer epidemics meant that public facilities such as swimming pools were closed during the summer months, and schools would stay shut until the epidemic had passed. Boys did not have to become infected to find their lives affected by polio: its presence in a locality was sufficient to threaten their sports and education opportunities. Some towns excluded travelers from epidemic areas. Parents were advised to keep children at home, with doors and windows tightly shut, even though this in itself created unhealthy conditions. Future president Franklin Delano Roosevelt was affected by polio in the 1921 epidemic. In 1926 he bought a hotel in Warm Springs, Georgia, where he had benefited from water therapy, and established a charitable foundation there with his colleague Basil O’Connor. In 1938
O’Connor became president of the National Foundation for Infantile Paralysis, which began the fund-raising campaign “March of Dimes” and became central to the American attempt to overcome polio. Many forms of treatment were introduced in an attempt to limit the longterm effects of polio and to reduce the numbers of deaths. Those people who suffered respiratory failure because of the virus had no effective treatment available to them until the invention of the Drinker Collins respirator, commonly known as the “iron lung,” in the early 1930s. Treatment for most people with serious limb problems centered on immobilizing affected limbs with splints and calipers. Such treatments limited boys’ opportunities for sports and their prospects for manual work in adulthood. They could also have a negative impact on self-image, especially in boys who saw themselves as active, athletic individuals. Controversy about treatment was common, and alternative methods were often proposed with varying degrees of success. Sister Elizabeth Kenny, an Australian nurse who came to the United States in 1940, developed one alternative therapy. Kenny’s therapy, based on heat treatment and exercise, ran counter to the prevailing ideas about treatment. Her work was controversial, but she established her own clinics and foundation and gained great public support. In 1955 the first successful vaccine, produced by Jonas Salk, was introduced on a large scale. This vaccine, given by injection, used a killed form of the poliovirus to confer immunity. Its immediate success led to federal funding of vaccination programs. The oral vaccine, using the attenuated virus, was developed in the 1950s by Albert Sabin and used in mass vaccination from 1962, the
year when the number of reported cases fell below 1,000 for the first time. In 1979 the last known case of indigenous transmission of the wild poliovirus occurred in the United States. From 1980 to 1997 a total of 142 cases of polio were reported in the United States: 140 of these were linked to vaccine transmission, with 2 being recorded as “indeterminate.” In 1993 the wild poliovirus was declared to be eradicated from the Western Hemisphere. In 1998, with full vaccine coverage extending to 90 percent of three-year-olds, no cases of polio were reported in the United States.
Bruce Lindsay
See also Disease and Death
References and further reading
Daniel, Thomas M., and Frederick C. Robbins, eds. 1997. Polio. Rochester: University of Rochester Press.
Rogers, Naomi. 1992. Dirt and Disease: Polio before FDR. New Brunswick, NJ: Rutgers University Press.
Pornography
Pornography, or porn, involves the written or visual depiction of sexual organs and practices with the aim of causing sexual arousal. It has served several purposes throughout its modern history—as a vehicle to criticize religious and political authorities, as a category to be censored for its alleged immorality, and as a means to provide soldiers and working men with tantalizing images of nude women. Much pornography targets a heterosexual male audience, and it often provides male youth with sex education. The word pornography derives from the ancient Greek word pornographos, which the Greeks used to refer to writing about prostitutes. The sixteenth-century
Italian writer Pietro Aretino (1492–1556) is considered the father of modern Western pornography because of his sonnets and prose depicting explicit dialogues and scenes between lovers. He thus broadened the meaning of pornography to include realistic genital or sexual behavior, which deliberately violates widely accepted moral and social taboos. In the seventeenth and eighteenth centuries in Europe, pornography was restricted to an educated elite male audience that was largely urban, aristocratic, and libertine. While providing titillation, it also served as a forum for criticizing religious and political authorities. For example, from 1740 to 1790 in France, pornography became increasingly political in its criticism of the monarchy, attacking the clergy, the court, and King Louis XV himself through depictions of their orgies. In England, the word pornography first appeared in the Oxford English Dictionary in 1857, and most of its variations, such as pornographer and pornographic, date from the middle to the end of the nineteenth century. The entry of the term into popular jargon coincides with its increasing availability to the masses. John Cleland’s Fanny Hill, or the Memoirs of a Woman of Pleasure, the first adult novel, was published in England in 1748. Arguably the most popular pornographic novel of all time, it was translated into numerous European languages during the nineteenth century. Pornography has had a shorter but more explosive history in the United States. Although little such writing was available in the colonial period, no less a citizen than Benjamin Franklin wrote pieces such as “Advice to a Young Man on Choosing a Mistress,” which remained unpublished in North America
until decades after his death. More than seventy years after Fanny Hill was first published in England, three publishers in Massachusetts were fined or jailed or both for publishing that novel in 1820 and 1821. In the early nineteenth century, as Americans became more literate, had smaller families, and focused on health (including sexual health), demand increased for literature providing advice. Such works included Charles Knowlton’s The Fruits of Philosophy; or the Private Companion of Young Married People (1832), which contained contraceptive information, and William Alcott’s Young Man’s Guide (1833), addressed to male youth. As Catholic immigration increased, Protestants read works such as Maria Monk’s detailed exposure of sexual activities in a Montreal convent. Before the publication of Harriet Beecher Stowe’s Uncle Tom’s Cabin (1851–1852), Monk’s Awful Disclosures of the Hotel Dieu Nunnery (1836) was the best-selling novel in the United States (Gardella 1985, 25). The consumption of pornography, however, was limited to the elite, who could afford expensive imported books and prints (Kendrick 1996, 77). Pornography began to be more affordable in 1846 when an Irish immigrant, William Haynes, published a version of Fanny Hill in New York, followed by 300 other titles over the next thirty years. During the Civil War, both commercial sex and publication of explicit literature expanded. After the war, cheaply produced pulp novels—dime novels for adults and half-dime or story papers for boys—could be mailed at new second-class postal rates. Single men who lived away from home in the expanding cities, as well as married men, became consumers of commercial sex, and sexually explicit literature began to be known as pornography.
The relatively sudden rise in publishing pornography caught many American civic leaders by surprise, especially because many states, including New York, did not have laws specifically forbidding the publication and sale of explicit sexual materials. Concerned by the potential impact on society in general and on male youth in particular, the Young Men’s Christian Association (YMCA) of New York launched an antiobscenity campaign and successfully lobbied for the passage of a state statute barring the sale of explicit materials. When the state made little effort to enforce the law, however, the YMCA realized that an incentive was needed to prosecute the publishers and producers of pornography. In 1871, Anthony Comstock, who became one of the most powerful moral censors in the history of the United States, contacted the YMCA, which gave him financial support and backed him as secretary of the New York Society for the Suppression of Vice. Comstock also was able to influence Congress to broaden and strengthen an 1865 law banning the use of the postal services to ship pornographic books and pictures and was appointed a special agent of the U.S. Post Office.
But the spread of pornography was difficult to control. Transportation improvements provided a network for distribution of materials. Population growth from roughly 100 million to more than 152 million people between 1915 and 1953 increased the potential size of the pornography market by 50 percent (Lane 2000, 19). Young people settled in cities and could partake in such leisure activities as attending dance halls, amusement parks, and movie theaters. Sexual mores loosened in the 1920s. Movies became more titillating, including flashes of nudity. Provocative books, such as Mademoiselle de Maupin (1835) by Théophile Gautier and The Well of Loneliness (1928) by Radclyffe Hall, were legalized. During World War II sexual mores were further liberalized, as Life magazine introduced pinups for “the boys at the front,” and Hollywood sent sexy performers to entertain the troops. Pornographers took advantage of this captive male audience, introducing “girlie” magazines for the troops and also providing them for sale at the nation’s newsstands.
As these social changes liberalized sex in American society, Hugh Hefner, the founder of Playboy, helped to establish the pornography industry. In 1953, he offered a men’s magazine featuring full-color photos of nude women and portraying a sophisticated, urban lifestyle. Hefner’s first issue, with Marilyn Monroe on the cover, sold more than 50,000 copies in a little over a month; within two years he was selling more than 600,000 copies per month (Lane 2000, xvi). His success illustrated the demand for sexually explicit material, and competing magazines began to portray even more explicit photographs. Penthouse was the first men’s magazine to show pubic hair and to portray its models more realistically than the stylized portraits of Playboy, and Hustler strove to appeal to the average working-class man.
After World War II, concern for the family and about male youth caused pornography to be associated with juvenile delinquency. In the 1960s, however, Supreme Court decisions in various obscenity cases further liberalized American sexual culture, establishing that sex and obscenity (defined as anything offensive to public morals) are not synonymous and that only materials with no redeeming social value whatsoever can be
proscribed. These decisions effectively laid to rest moral objections to pornography, and as pornographic films, books, and magazines proliferated, mainstream media became more sexually explicit.
Today, pornography is a very profitable industry in the United States; over the last quarter century it has grown from $2 billion to at least $10 billion in annual revenue, about the amount Americans pay for sporting events and live music performances combined. Pornography also flourishes on the Internet, contributing an estimated $1 billion to $2 billion per year to the industry total, or 5 to 10 percent of money spent online in 1998 (Lane 2000, xiv). Americans buy millions of copies of Playboy, Penthouse, and Hustler each month, and more than 400 million X-rated videos are rented every year (Chancer 1998, 64). Not only is pornography produced and distributed for the purpose of making profits, but also appealing to sexual interest has become a marketing strategy for selling almost any product.
Pornography is easily accessible to youth and is very popular among boys in the United States. In testimony to the Attorney General’s Commission on Pornography in 1985, Jennings Bryant reported that 100 percent of high school–age males surveyed had read or looked at Playboy or a similar men’s adult magazine. The average age for first viewing was eleven, and boys in high school saw an average of 16.1 issues (Bryant 1985, 128–157). A study of high school adolescents in Canada found that one-third of the boys but only 2 percent of the girls watched pornography at least once a month. Further, although girls use other sources of information to learn about sex, including teachers, parents, peers, books, and magazines, 29 percent of the boys surveyed indicated that pornography was their
most significant source for sex information (Check 1995, 90). Feminists argue that rather than presenting sexual relations based on intimacy and equality, such magazines as Playboy cater to the objectification of women and the exclusive genital satisfaction of men. The Canadian study found that 43 percent of boys and 16 percent of girls surveyed either thought holding a girl down and forcing her to have intercourse if a boy was sexually excited was okay, or they were not sure if it was okay. Boys who answered in the affirmative overwhelmingly were those who read and watched pornography (Check 1995, 90–91). Many adult magazines eroticize the vulnerability and innocence of youth. Playboy glamorizes the adolescent student as a sexual target, and the genre of Catholic schoolgirls is standard fare in both criminal child pornography and pseudo-child adult bookstore porn. Following the pattern of adult magazines, mainstream child pornography features young girls. But boys also are targets in child pornography, a situation that is valorized by organizations such as the North American Man-Boy Love Association (NAMBLA). Child pornography, or kiddie porn, appeals not only to pedophiles (persons who desire prepubescent youth) but also to any person who looks at magazines and films that eroticize youth. In the United States any sexual depiction of a person under eighteen is considered child pornography, and any distribution of such material is a federal crime. Child pornography is the most actively prosecuted form of sexually explicit material, but it is increasingly difficult to monitor the age of those depicted in pornography on the Internet, which allows considerable anonymity for those running sites.
Current debate on censorship of pornography focuses on three aspects. Political and religious conservatives consider pornography a moral issue. Feminists view it as a political problem of power and lack of power. Both groups contest the objectification of women and children for male pleasure and believe that women and children have the right to be agents in their own sexual lives.
Mia M. Spangenberg
See also Prostitution; Sexuality; Young Men’s Christian Association
References and further reading
Bryant, Jennings. 1985. Testimony to the Attorney General’s Commission on Pornography Hearings. Houston, Texas.
Chancer, Lynn. 1998. Reconcilable Differences: Confronting Beauty, Pornography, and the Future of Feminism. Berkeley: University of California Press.
Check, James. 1995. “Teenage Training: The Effects of Pornography on Adolescent Males.” Pp. 89–91 in The Price We Pay: The Case against Racist Speech, Hate Propaganda, and Pornography. Edited by Laura J. Lederer and Richard Delgado. New York: Hill and Wang.
D’Emilio, John, and Estelle B. Freedman. 1988. Intimate Matters: A History of Sexuality in America. New York: Harper and Row.
Gardella, Peter. 1985. Innocent Ecstasy: How Christianity Gave America an Ethic of Sexual Pleasure. New York: Oxford University Press.
Hunt, Lynn, ed. 1993. The Invention of Pornography: Obscenity and the Origins of Modernity, 1500–1800. New York: Zone Books.
Kendrick, Walter. 1996. The Secret Museum: Pornography in Modern Culture. 2d ed. Los Angeles: University of California Press.
Lane, Frederick S. III. 2000. Obscene Profits: The Entrepreneurs of Pornography in the Cyber Age. New York: Routledge.
Portraiture

From the early colonial period until the invention and popularization of photography in the nineteenth century, the primary means of recording and remembering a young son's, brother's, or nephew's appearance was a painted portrait. During this period, Americans commissioned portraits depicting boys in a variety of formats and compositional arrangements from painters of varying artistic accomplishment. Despite their heterogeneity, portraits of male children executed at certain historical moments nevertheless share conventions of dress, pose, accouterments, and setting. These pictorial codes were gender- and age-specific and enabled contemporary viewers, who were more familiar with portrait conventions than are people today, to position the portrait subject within family and social hierarchies. Moreover, these conventions were meaningful because patrons, artists, and society held similar assumptions about the status of young boys. Thus, portrait paintings provide valuable insight into the historical construction of boyhood.

Furthermore, when studied in chronological order, portraits provide evidence of changes in the status of boys within the family and society. Artists working in different centuries portrayed the male child, his dress, his pose, the objects he holds, and his environment according to different conceptions of boyhood. And although some of the patterns and changes represented in portrait painting often parallel themes articulated in textual sources, portraits also convey information about the experience of male children not described in texts, such as how clothing, material culture, and artistic representation reinforce gender roles and gender relations.
Seven-year-old David Mason is dressed and poses as an adult man in this portrait with his sisters. Attributed to the Freake-Gibbs Painter, The Mason Children: David, Joanna and Abigail, 1670. (Museum of Fine Arts, San Francisco)
It should be noted that American portrait painting presents a biased image of American boyhood. First, it was the father who most often commissioned and paid for a portrait depicting a young son, which when finished would hang in the semipublic spaces of the family home, where it declared parental values toward children, family refinement, and social
position. Thus the painted portrait typically conveys a positive image of childhood, one designed to meet the approval of other family members and visitors to the home. Second, the father-patron expected the artist not only to paint a faithful record of his son’s unique features but also to artistically arrange his clothing, his comportment, and the setting. The
finished portrait is therefore neither an image of a sitter exactly as he appeared before the artist nor an idiosyncratic interpretation by the painter. Finally, the portraitist's skill came at a price, often a substantial one, and thus only financially secure parents could afford to commission a portrait of a junior family member. As a result, the experience of boyhood represented by American portrait painting is skewed toward young male children from elite and middle-class families. Commissioned portrait paintings of impoverished immigrant, African American, or Native American boys or others who lived beyond the routes of itinerant painters are rare in the history of American art. Despite these significant limitations, however, American portrait painting provides insight into the status of children from families who were in the mainstream and at times in the vanguard of social attitudes toward young boys.

Few portraits of children were painted in the early colonial period, but those that have survived portray male children between infancy and manhood in one of two ways. A boy no longer considered an infant but younger than seven is typically dressed in long, floor-length petticoats, a pinafore or apron, and hanging sleeves (e.g., Anonymous, Robert Gibbs, 1670, Worcester Museum of Art, illustrated in Calvert 1992). These same articles of clothing were also worn by girls and women and can be seen in contemporary female portraiture. However, a boy older than seven but not yet considered an adult appears dressed in the breeches and hose worn by adult men (e.g., David Mason in Attributed to the Freake-Gibbs Painter, The Mason Children: David, Joanna and Abigail, 1670, Museum of Fine Arts, San Francisco, illustrated in Simpson 1994). In both cases, conventions borrowed from
adult male portraiture explicitly convey the young subject's gender. Each holds objects associated with men, most significantly gloves and, in the case of the older child, a walking stick. More noticeably, both stand head square on the shoulders, one arm akimbo with the hand on the hip, commanding the surrounding space in emulation of conventions seen in contemporary portraits of elite men intended to express the subject's masculinity. Even though both of these young children are marked as male, the social position of each is very different. Male children under the age of seven appear more like women and therefore subordinate to those older males who wear breeches. A boy recently breeched, however, has made his initial entry into manhood and now shares with other men a personal mobility—facilitated by breeches—and the corresponding independence, authority, and domination that accompany unrestricted movement. He does not, however, hold objects affiliated with adult authority, such as a baton of command, underscoring the point that he has not yet fully entered into manhood. Regardless of differences in their ages, each will grow to be a man, but by portraying each with the material culture and conventions also used to represent adult women and men, the artist reveals boyhood—defined as a period in a male child's development that is distinct from adult men's and women's positions in the social hierarchy—to be nonexistent. Indeed, colonial society understood childhood not as a unique stage of development in a person's life but as something one passed through quickly in order to enjoy the privileges of manhood and old age.

Portraits painted during the first three-quarters of the eighteenth century continue earlier conventions, albeit with modifications in clothing and iconography. From the 1730s to the 1760s, young males, no longer considered infants but not yet breeched, appear dressed in a long robe (e.g., Samuel Gore in John Singleton Copley, The Gore Children, ca. 1755, Henry Francis du Pont Winterthur Museum, Winterthur, Delaware, illustrated in Rebora et al. 1995). Like the petticoats and pinafores worn in the previous century, these long skirtlike garments signify a lack of mobility and, consequently, a subordination to those who wear breeches. Painters also continued to portray these young males standing in characteristically male poses, with one hand on the hip and the other pointing into the distance. Finally, a boy's future dominant position in society is often encoded by the inclusion of a pet bird, squirrel, or dog he has trained and learned to control. Male children over seven years old are again portrayed breeched and posed like adult men (e.g., John Gore in John Singleton Copley, The Gore Children, ca. 1755). At the same time, details such as natural hair coifed in emulation of a wig or a black ribbon worn around the neck in place of a neckcloth signal they have not yet entered full manhood. Their relative immaturity is further underscored by the continued presence of small animals and the absence of items associated with professional pursuits, such as a telescope or a quill. As in the previous century, boys still appear as either subordinate like women or dominant like men, depending on whether they wear breeches, and in neither case are they portrayed according to conventions entirely unique to their age and sex.

During the last quarter of the eighteenth century, a dramatic shift in the portrayal of boys between the ages of four and twelve occurs in American portrait
painting. First, male children are dressed in trousers, a shirt open at the neck with a ruffled collar, and a short jacket, an outfit eventually called a skeleton suit. Their natural hair is cut short with bangs. This clothing and hairstyle is age- and gender-specific and separates them from breeched males who wear wigs and from women and girls dressed in petticoats. Second, gender-specific toys are included for the first time in portraits, for instance, the toy drum. Third, books begin to appear in the hands of young boys and on nearby tables. Fourth, boys sit and stand in more relaxed poses. Thus, masculinity is no longer exclusively conveyed by male-oriented accouterments or the rigid rules of comportment reserved for adult men. Indeed, the introduction of informal poses for boys enabled artists to convey bonds of intimacy by picturing male children touching and leaning on siblings or parents (e.g., Daniel and Noah, Jr., in Ralph Earl, Mrs. Noah Smith and Her Children, 1798, Metropolitan Museum of Art, New York, illustrated in Kornhauser 1991). Taken together, these pictorial codes serve to identify young male children as now occupying a separate stage in life between infancy and manhood, one apart from the social position of adults.

These changes parallel, reflect, and confirm parental interest in Enlightenment theories of childhood development. The influential ideas of John Locke and Jean-Jacques Rousseau led parents to seek ways to stimulate the natural development of male children by promoting physical and mental activity. The skeleton suit facilitates greater mobility than both petticoats and breeches, and the relaxed and artful pose of boys in portraits often expresses a potential for movement less evident in earlier portraiture. The inclusion
By the mid-nineteenth century, portraits depicted children androgynously. This boy and girl are dressed alike, but the boy's masculinity is signaled by his riding the hobby horse. Unknown, The Hobby Horse, ca. 1850. (National Gallery of Art, Washington, DC)
of toys and books conveys the belief that physical development should complement intellectual growth. Although boys remain subordinate to adult men, boyhood now appears a distinct period of time in a male child’s life, no longer represented according to the conventions of adult portraiture.
Beginning in the 1830s, American portrait painting features yet another conception of boyhood. At this time both boys and girls dress in similarly styled pantaloons and petticoats with their hair cut alike (e.g., Unknown, The Hobby Horse, ca. 1850, National Gallery of Art, Washington, DC, illustrated in Chotner
1992). Boys, however, are portrayed with an increasing variety of toys, including drums, whips, pull-toys, rocking horses, and small pets that distinguish them from girls, who most often hold dolls. Children of both sexes stand or sit calmly in domestic or nondescript interiors or out of doors, their young bodies filling the entire frame of the portrait. Unlike portraits of their fathers or grandfathers as children, young boys of this period appear androgynous and consciously separated from the world of adults.

Once again the change in portrait conventions parallels a shift in how parents understood the experience of their sons. In the nineteenth century, adults subscribed to romantic notions of boyhood as a period of innocence to be cherished and preserved. Many parents believed a child was born sinless, closer to angels than men, and when he entered into manhood he experienced the Fall. Portrait paintings from this time express the adult desire to represent a child as unaware of sexual difference by depicting boys and girls in similar clothing and hairstyles. But even when a male child appears unsexed, the depiction of gender-specific toys nearby indicates he is developing a natural masculinity that will blossom into manhood. Furthermore, artists conveyed a boy's separation from the outside world by portraying young subjects in settings where adult activities are absent and by filling the space of the composition with a boy's body. This focus on a child's body encodes the parents' desire to instill in their sons a sense of self-restraint and control, especially when in the company of adults. Indeed, boys are portrayed in comparatively static poses, unlike the artfully relaxed positions that conveyed the potential for movement found in earlier portraits. By
the mid-nineteenth century, the representation of boyhood in American portraits indicates it is a distinct period in a boy's life, one cut off from the world of adults and very much unlike the experience of his colonial ancestors.

Kevin Muller

See also Clothing

References and further reading
Calvert, Karin. 1992. Children in the House: The Material Culture of Early Childhood, 1600–1900. Boston: Northeastern University Press.
Catalogue of American Portraits, National Portrait Gallery, Smithsonian Institution. 2001. http://www.npg.si.edu/inf/ceros.htm (accessed March 24, 2001).
Chotner, Deborah. 1992. American Naive Paintings. Washington, DC: National Gallery of Art.
Kornhauser, Elizabeth Mankin. 1991. Ralph Earl: The Face of the Young Republic. Hartford, CT: Wadsworth Atheneum.
Lovell, Margaretta. 1988. "Reading Eighteenth-Century American Family Portraits: Social Images and Self Images." Winterthur Portfolio 22, no. 4 (Winter): 243–264.
Rebora, Carrie, Paul Staiti, Erica E. Hirshler, Theodore E. Stebbins Jr., and Carol Troyen. 1995. John Singleton Copley in America. New York: Metropolitan Museum of Art.
Simpson, Marc. 1994. The Rockefeller Collection of American Art at the Fine Arts Museums of San Francisco. San Francisco: Fine Arts Museums of San Francisco.
Poverty

Poverty is an economic condition experienced by boys whose families are unable to provide them with adequate food, shelter, clothing, and education. Impoverished boys are especially likely to suffer health problems, live in physically dangerous areas, work outside the home at a young age, and leave school at an early age. Over
Impoverished boy and his family living in Elm Grove, Oklahoma, 1936 (Library of Congress)
time, the methods of assisting boys from poor families have changed considerably. Nonetheless, no method has served to eradicate poverty among children, and by the late twentieth century, the largest group of persons living in poverty was boys and girls under the age of eighteen. In the seventeenth and eighteenth centuries when America was first settled,
most persons made their living by farming. A boy who lost his father, or both of his parents, was very likely to become poor. Mothers of young children found it difficult to maintain a farm on their own. If a boy were old enough to help his widowed mother farm, he would soon find himself working long hours in the fields and barn and would be unable
to attend school regularly, if at all. If a boy and his siblings were too young to help their mother on the farm, local poor relief officials would probably remove the children and place them with other farm families. Boys who were indentured or apprenticed to other local farmers were expected to work for them in return for some education and preparation for future employment. In some cases, boys received what they were promised, but many times they were denied schooling, sometimes physically abused, and often required to perform menial chores that did not prepare them adequately for self-sufficiency as adults.

By the nineteenth century, with the growth of cities and immigration, poverty among boys became a more serious social problem. Large numbers of Irish and German immigrants came to the United States in the 1830s, 1840s, and 1850s, and later, after the Civil War, millions of persons from eastern and southern Europe flocked to the United States in search of opportunity. Families often arrived with little savings and had to take up residence in crowded and unsanitary housing. Many people shared the same toilets, garbage disposal was erratic, and illnesses spread rapidly. Boys were very likely to contract diseases such as measles, diphtheria, and tuberculosis. If they remained healthy, fathers and sons and daughters of recent immigrants or migrants to the city had to find work immediately to avoid destitution. Most immigrants had few skills, and consequently, the jobs they obtained were usually low-paying. Textile mills in New England and in the South welcomed whole families, and mill owners put all members, including children, to work. However, wages were very low, and boys and girls rarely received much education while employed in textile
mills. Factories other than textile mills employed mostly adult men, and their sons then sought work in street trades, selling newspapers or other small items or collecting junk and selling it to junk dealers. Boys from poor families who had to spend much of their days earning money in the streets lacked close parental supervision. Many attended school only sporadically between the ages of eight and twelve, when they quit to find full-time employment. Some slept out in city streets to avoid crowded and sometimes troubled homes; some engaged in illegal activities such as stealing; others joined gangs; and many frequented pool halls, theaters, movie houses, and gambling parlors. Impoverished parents who felt their sons were not contributing enough to the family income, or whose sons were getting into trouble with the law too often, sometimes placed the boys in juvenile reformatories.

In the nineteenth century, as in earlier centuries, boys were especially likely to be poor if their fathers died or deserted their families. Mothers with young children had a difficult time finding work to support them. Domestic service and sewing jobs were almost the only jobs available to urban women at the time, and neither paid enough to support a family adequately. Sometimes impoverished mothers could get by with a little food and clothing supplied by local welfare officials, but often such aid was insufficient, and mothers had to take their children with them to the local almshouse. There poor families received shelter, food, and clothing temporarily. Local officials also separated families and indentured boys in almshouses to local farmers or other citizens who put the youngsters to work. Contracts for this type of boy labor usually lasted until the
youth was twenty-one years old. As with indenturing and apprenticeship in earlier times, sometimes boys were treated well and got the education and training promised them, but at other times they were physically and emotionally abused and poorly prepared to support themselves in later life.

By the late nineteenth century, many citizens had come to suspect that placing children in almshouses alongside adult poor persons for any length of time was doing the youngsters more harm than good. As a result, most states passed laws requiring that children be removed from almshouses and placed in other institutions such as orphanages or placed out with families. In the absence of a welfare system that provided much aid for impoverished mothers or an economic system that provided them adequate income, they continued to seek places to put their sons and daughters, either temporarily or permanently. White boys might remain in an orphanage for a year or so and then be returned to their mothers if the women were able to support the children, or if the boys themselves were old enough to work and help support their mothers. Younger boys whose mothers could not support them might leave an orphanage after a year or so to be indentured out to work for other families. In either case, family poverty usually led to family separation for boys and girls.

The experience of poverty among African American boys in the eighteenth and nineteenth centuries was somewhat different from that of white youngsters. Boys who grew up in slavery lived in poverty. Their masters usually provided them minimal food and clothing and, by the 1830s, most southern states prohibited them from receiving any formal education. Older slaves cared for young slave boys, while their parents worked in the fields or plantation house. As soon as they were physically able, slave boys were expected to work—either around the slave cabins and plantation house or in the fields. After the Civil War, most freed slaves continued to farm in the South. Few were able to accumulate much money, and most labored as sharecroppers, farming the land of a white owner and sharing the crop they harvested with him. Boys on sharecropping farms worked alongside their fathers in the fields. In the off-seasons they accompanied their fathers to find day-labor jobs in cities to help support their families. Some were able to attend school, but because their labor was so necessary on the farm, few African American farm boys in the South attended school regularly.

African American families that moved north in the late nineteenth and early twentieth centuries often lived on the edge of poverty, although they were very likely to keep their sons and daughters in school, often through high school. Black parents had great faith in education, and mothers labored at domestic service jobs and fathers at day-labor jobs in order to earn enough to keep their sons and daughters in school. However, despite their education, boys and girls rarely found well-paying jobs after graduation. Racist employment policies kept boys out of higher-paying factory jobs before World War I. African American fathers, even if they had fairly good jobs, were usually not able to secure comparable employment for their sons. When African American boys were orphaned, there were few welfare programs to assist them. Most orphanages did not accept black children, and even though African Americans themselves founded a
few orphan asylums after the Civil War, they were small and could not accommodate many youngsters. Consequently, African American boys whose parents were dead or unable to care for them were usually taken in by other relatives or friends.

Native American boys in the nineteenth century experienced poverty in still different ways than did other groups of boys. The U.S. government became determined to remove Native Americans from areas of the country where whites wanted to settle. As a consequence, Indians from the Southeast and the Great Lakes area were forced out of their ancestral homes and made to travel long distances, in what came to be called the "Trail of Tears," to lands in Oklahoma. On the way, boys and their families experienced extreme poverty. Other tribes were also forced onto reservations on land that was not particularly productive. In the late nineteenth century, the U.S. government also removed Indian boys and girls from their homes and educated them in military-style boarding schools in an effort to break their connections to their tribes and reeducate them to an "American" way of life.

By the early twentieth century, many of the problems of the previous century continued to plague impoverished boys and their families, but the welfare system changed somewhat in ways that probably advantaged some boys and their parents. Overcrowded housing and poor sanitation still took their toll on young lives. Needy mothers had little access to medical care, and so their sons and daughters were often born physically and mentally impaired. Childhood diseases spread rapidly in the neighborhoods of the poor, and when their high fevers went untreated, some boys suffered blindness and brain
damage. In 1921, the passage of the Sheppard-Towner Maternity and Infancy Protection Act helped to improve children's health somewhat. The legislation provided matching grants to the states to extend prenatal and postnatal care to any mother who sought it. Clinics staffed by nurses appeared throughout the country to provide medical advice and assistance to mothers and their children. The law was quite effective, but it ended in 1929 thanks to opposition from legislators who feared the spread of socialism and doctors who feared the loss of business as mothers turned to public clinics for medical aid. Eventually, provisions of the Social Security Act of 1935 resurrected many of the features of the Sheppard-Towner Act.

Mothers of impoverished boys gained some help in providing them medical care in the early twentieth century, and they also gained some aid from states and localities to provide their sons with the basic necessities of life. As before, boys were most likely to be poor if their mothers were divorced or deserted. Such women still had a difficult time finding well-paying jobs and, when they became desperate, still relied on orphanages and foster care programs to take in their sons and daughters. Beginning in 1911 in Illinois, states began to pass "mothers' pension" laws that gave impoverished single mothers money to help them support their families. Before the Depression, all states but two had passed such laws, although the money they provided mothers was minimal, and very little of it went to African American mothers, among the poorest in the nation. In 1935, with the passage of the Social Security Act, mothers' pensions (funded by states) were taken over by the federal government program called Aid to Dependent Children (later renamed Aid to Families
with Dependent Children). ADC provided needy mothers with money to help them buy the food, clothing, and housing that they and their sons and daughters needed. Although ADC was not overly generous, it did provide more aid than mothers' pensions had, and that aid went to more mothers (including African Americans).

The Great Depression of the 1930s was a particularly traumatic time for impoverished boys. The number of youngsters who lived in poverty grew as the unemployment rate skyrocketed. Parents found it hard to feed and clothe their children, and those who could not pay their rent were evicted and often ended up living in shanties they constructed by hand. Boys who were old enough to find work often quit school, hopped on railroad freight cars, and crisscrossed the country searching for employment. They camped out with other young and old tramps in vacant lots. During the 1930s, there was also a massive drought that produced soil erosion and drove thousands of southwestern farmers off their land. Many fled the Dust Bowl lands for California, where families with sons and daughters found it easier to find work picking fruit and vegetables than did single men. Wages were extraordinarily low for such families, many lived in shanties, and their children were very likely to become sick and quite unlikely to attend school regularly.

During World War II, economic conditions improved, and the country almost achieved full employment. In the years after the war, both sanitary conditions and medical care improved, helping all children, including impoverished boys and girls. Even so, certain groups of boys continued to suffer from the consequences of poverty, including poor housing and ill health. Rural children (many of whom were African American) who lived far from medical facilities were especially at risk; they often did not receive vaccinations for diphtheria, whooping cough, tetanus, and smallpox. Native American boys and girls living on reservations where medical facilities were minimal were especially likely to suffer from pneumonia, influenza, typhoid, and dysentery.

As economic conditions improved in the 1950s and 1960s, the remaining poverty in the country seemed an anomaly. To end that poverty, President Lyndon Johnson launched a War on Poverty in the 1960s. The goal of the program was to prevent poverty, and so many of its programs were directed at young people. Head Start offered free preschool care to boys and girls, Upward Bound helped boys and girls prepare for college, and the Job Corps sought to train boys for jobs that paid a decent wage. Johnson lost interest in the War on Poverty as the war in Vietnam heated up, and his successor, President Richard Nixon, dismantled much of the program. Nonetheless, Head Start and Upward Bound continued, as did Medicaid and food stamps (created in 1965 and 1964, respectively, and not officially parts of the War on Poverty). Medicaid paid the doctor and hospital bills of poor children and adults, and food stamps made possible a healthier diet for impoverished families.

Despite improved welfare programs, in the late twentieth century poverty among boys and girls remained a serious social problem. In the late 1970s and 1980s the distribution of wealth in the country became more inequitable. The wealthiest Americans gained a greater share of the national income (the top 20 percent received 46.9 percent of aggregate income in 1992), and poorer Americans lost ground
(in 1992 the poorest 20 percent of the population received just 3.8 percent of the nation's income) (Jones and Weinberg 2000, 4). In 1992, about one child in five lived below the poverty line. The majority of them were white, and most of their parents had jobs. They lived scattered throughout the country in communities large and small, although poverty of children in urban areas was most obvious. As in the past, boys living in families headed by a single mother were most likely to experience poverty. Jobs for such mothers still paid little, child care was expensive, and welfare payments were not adequate to lift these families out of poverty (Sherman 1994, 4–8).

Also as in the past, impoverished boys of today face serious health risks. They are more likely than nonpoor children to be small for their age and to suffer from deafness, blindness, or physical or mental disabilities. They are also more likely than their nonpoor contemporaries to score lower on intelligence quotient (IQ) tests, have learning disabilities, fall behind grade level in school, and drop out of high school.

Impoverished boys living in inner cities, especially those who are African American, have few decent-paying job opportunities available to them. Most urban factories have closed down, and jobs in the growing service economy are located mainly in suburbs. The ubiquitous advertising industry continues to create wants among youth, including those who are poor. Needy boys and girls also watch television and want the same clothes, shoes, and music that more well-off children seek. One way poor boys obtain these objects is through the underground economy of drug dealing. Violence and the sale of drugs appear to be inseparable. Boys who live in impoverished urban areas where drug dealing is common are very
likely to be victims of violent assault or to witness violence and murder in their communities.

Priscilla Ferguson Clement

See also Foster Care; Gangs; Guns; Illegal Substances; Juvenile Courts; Juvenile Delinquency; Orphanages; Placing Out; Reformatories, Nineteenth-Century; Reformatories, Twentieth-Century; Runaway Boys

References and further reading
Ashby, Leroy. 1997. Endangered Children: Dependency, Neglect, and Abuse in American History. New York: Twayne Publishers.
Jones, Arthur F., Jr., and Daniel H. Weinberg. 2000. Current Population Reports: The Changing Shape of the Nation's Income Distribution, 1947–1998. Washington, DC: U.S. Census Bureau.
Nightingale, Carl Husemoller. 1993. On the Edge: A History of Poor Black Children and Their American Dreams. New York: Basic Books.
Riley, Patricia, ed. 1993. Growing Up Native American. New York: Avon Books.
Sherman, Arloc. 1994. Wasting America's Future: The Children's Defense Fund Report on the Costs of Child Poverty. Boston: Beacon Press.
West, Elliott. 1996. Growing Up in Twentieth Century America: A History and Reference Guide. Westport, CT: Greenwood Press.
Preachers in the Early Republic

Around the year 1800 at a Kentucky camp meeting, a ten-year-old boy began to exhort a crowd of the pious and the curious. Held aloft by two men so that he could be seen by all, he dropped a handkerchief and cried: "Thus, O sinner, will you drop into hell unless you forsake your sins and turn to God." This display of eloquence, self-possession, and piety in so young a lad is said to have stunned
his audience and moved many to tears of repentance. Such are the wonders on which the lore of early southern evangelicalism lingers, and there is no mystery about the reason. Young, single white men who embraced religion in their teens and early twenties made up the ranks from which evangelicals in the South recruited their ministers for many decades after the American Revolution. Those promising male converts came to be called "young gifts," a term connoting the consensus that they possessed unique spiritual talents and had been bestowed upon the churches by an approving deity. In both senses, they embodied the glorious future of evangelicalism.

Presbyterians profited least from the young gifts raised up in their midst. Most church leaders insisted that all those called by God to preach required the weightier imprimatur of a classical education at Princeton or at least a few years in a backwoods academy dispensing rudimentary knowledge of Latin and Greek. That requirement discouraged many young men, who had neither the money nor, they feared, the ability to master the mysteries of ancient languages and academic life. Their reluctance made it impossible for the Presbyterians to train and field a large number of ministers quickly and to fill pulpits as the U.S. population moved southward and westward. As a result, Presbyterians in the South generally confined their postwar evangelism to enclaves of receptive Scots-Irish settlers and fell far behind in the competition for new members.

By contrast, the Methodists and Baptists emerged in the decades after the American Revolution as the South's strongest evangelical churches, in part because both groups dispensed with a formally educated clergy. They regarded inner claims to divine appointment as sufficient authorization, the truth of which would be tested when young men apprenticed as itinerant preachers. No time was lost when a young gift rose up in their midst. Among the Baptists, that promising young man was first encouraged to open public religious meetings with a prayer or to close them by delivering an exhortation or leading the congregation in a hymn. He might also be urged to expound on passages from the Bible at household gatherings of family and neighbors. If he completed those exercises satisfactorily, he was then licensed to preach and, if he proved his mettle in the pulpit, could expect to be ordained as an "elder" (as Baptists styled their ministers) within a few years. The Methodists cultivated an eager recruit by licensing him to exhort or appointing him to serve as a class leader and then urging him to accompany an itinerant on his rounds for a few months. If a young man showed the makings of a minister, he was assigned to a circuit and received "on trial" into the "traveling connection" of itinerant preachers. If he proved his worth, after two years he was admitted to their itinerancy in "full connection."

Not only were the Baptists and Methodists able to marshal more preachers than the Presbyterians, but they were also prepared to use them more effectively. Most Presbyterian clergy were settled ministers serving a particular congregation (or two or three neighboring churches), whereas many young Methodists and Baptists began their ministerial careers as itinerants, traveling and preaching to both established congregations and gatherings of the unchurched. That mode of deploying their clergy enabled the Baptists and Methodists to reach an increasingly dispersed
population—the tens of thousands of southern families who, during the decades after the American Revolution, filtered southward into the Georgia frontier and swarmed westward into Kentucky, Tennessee, and southern Ohio. Drawing mainly on young, single men as itinerants also ensured an inexhaustible supply of cheap and enthusiastic young evangelists, a group attuned to the concerns of the lay faithful, especially younger men and women, from whose ranks they had recently been plucked.

Fresh-faced youths also drew the merely curious to religious meetings. The spectacle of a "boy preacher" caused as much of a sensation at the turn of the century as it had in the 1740s, when George Whitefield first claimed that title. Popular acclaim for John Crane, who from the age of nine attracted audiences in middle Tennessee, won him his first circuit at sixteen. Jacob Young, at the comparatively ripe age of twenty-six, swelled with pride when another Methodist minister rode away after hearing his sermon, shouting: "Young Whitefield! Young Whitefield!" At the same age, Jeremiah Norman noted that his sermons brought out "perhaps more than would have been if they had not the expectation of hearing the young performer." Prodigies bowled over the Baptists, too. After twenty-one-year-old Wilson Thompson wowed a Kentucky congregation in 1810, a senior minister dubbed him "the beardless boy," a name by which, as Thompson said, "I was spoken of for some years." In later life Jeremiah Jeter recalled that when he was traveling with another young Baptist preacher in western Virginia in the 1820s, "it was represented that two Bedford plowboys had suddenly entered the ministry and were turning the world upside
down, exciting almost as much interest as a dancing bear" (Heyrman 1997).

In itself, the process of culling novice preachers from the ranks of young male converts stirred local excitement. William Watters, who joined the Methodists in 1771 and soon thereafter entered the itinerancy, reported that in his Maryland neighborhood, "my conversion was . . . much talked of, as also my praying in a short time after without a book, which, to some, appeared a proof that there was a notable miracle wrought on me indeed." Decades later, the appearance of likely prospects still aroused the laity's interest: in 1810, Martha Bonner Pelham in southern Ohio gossiped in a letter to her sister-in-law in Virginia that a "smart revival" among the Methodists had yielded one young male convert "who is expected to make a preacher." When novices proved their powers to evoke strong emotions, congregations rejoiced, like the Methodists of one Kentucky society who came away from Jacob Young's first sermon "bathed in tears," so gratified that "they clustered round me, shook my hand." And when his contemporary Thomas Cleland showed talent as an exhorter at a local religious gathering, "it was noised abroad that 'little Tommy Cleland' . . . had commenced preaching," and his neighbors early sought him out to speak and pray in private homes and to offer spiritual counsel to troubled souls (Heyrman 1997).

Most men singled out as young gifts at first professed their unworthiness. Undeniably sincere in his humility was the fledgling Methodist itinerant who, as he confided his fears to Francis Asbury after retiring for the night, trembled so much that "the bed shook under him." Among the Baptists, Wilson Thompson hesitated even to enter the pulpit, fearing that "it was too sacred a place for me," and
quailed at "the very thought of attempting to preach before the old and wise men of the Church, and before preachers." Such worries prompted the young Methodist Philip Gatch to test his preaching skills in Pennsylvania rather than his native Maryland neighborhood, feeling that "it would be less embarrassing to me." But after displaying due modesty about assuming so great a calling, most novices threw themselves into the Lord's work with the untiring energy and unflagging zeal of all youthful aspirants. Often within a matter of months, the same young men who had agonized over entering the clergy were casting themselves as latter-day apostles, boasting of their heroic sufferings for the faith and their skill in winning new converts (Heyrman 1997).

However well they may have succeeded, southern Baptists and Methodists did not set out to create a cult of youth. Even though congregations prayed for young gifts to be raised up from their ranks, even though multitudes thronged to the sermons of "boy preachers," evangelicals never took the position that religious virtuosity resided exclusively or even mainly in the young. But their clergy and pious laypeople did create a climate within the churches that celebrated youthful adepts. The working of wonders among the young at once attested to divine approval of evangelical aims while also advertising their affinities with the primitive Christian church. And in practical terms, postwar Baptists and Methodists had set themselves an ambitious agenda of proselytizing a predominantly youthful population spread over a vast territory. Such a goal dictated their reliance on a traveling clergy, which meant that much of the energy fueling the engines of evangelism would come
from men who had spiritual conviction, physical stamina, and, in some cases, financial support from their families. In practice, then, young preachers were endowed with extraordinary authority as spiritual models and religious leaders.

Christine Leigh Heyrman

See also Early Republic

References and further reading
Andrews, Dee E. 2000. The Methodists and Revolutionary America. Princeton, NJ: Princeton University Press.
Boles, John B. 1972. The Great Revival, 1787–1805: The Origins of the Southern Evangelical Mind. Lexington: University of Kentucky Press.
Cartwright, Peter. 1856. The Autobiography of Peter Cartwright, the Backwoods Preacher. Edited by W. P. Strickland. Cincinnati: L. Swormstedt and A. Poe.
Hatch, Nathan. 1989. The Democratization of American Christianity. New Haven, CT: Yale University Press.
Heyrman, Christine Leigh. 1997. Southern Cross: The Beginnings of the Bible Belt. New York: Alfred A. Knopf.
Young, Jacob. 1857. Autobiography of a Pioneer. Cincinnati: Jennings and Pye; New York: Eaton and Mains.
Prostitution

Prostitution is the exchange of sex for money or one or more of the necessities of life and is also commonly referred to as "sex work," "commercial sex," "sex trading," "survival sex," and "hustling." Alternate terms for prostitution have evolved to deflect stigma from the individual and to emphasize various nonstigmatizing aspects of the sex-for-necessities exchange. For example, "hustling" is also used as a more general term to describe an assortment of illegal or quasi-legal activities in which a young man participates to earn income. "Sex work"
is another more general term that technically includes a range of activities in the sex industry, such as pornography and stripping. This particular term evolved in part to associate the activity with work, career, and entrepreneurship (Browne and Minichiello 1996). For the most part, these terms are used interchangeably to refer to prostitution.

The growth of commercial sex in the United States occurred in the early nineteenth century in the context of capitalism and urbanization. Poor boys lived in urban neighborhoods where prostitution of females proliferated. By the 1850s, girls as young as eleven or twelve years old were apprehended by local authorities for prostituting themselves in cellars and doorways. There is some evidence that boys engaged in this kind of activity as well, especially in such neighborhoods as the notorious Five Points district of New York City. The city's courts issued indictments for sodomy only in cases in which force was used or there was considerable disparity in age, and some arrests involved sodomy with boys as young as eleven. In the 1860s the poet Walt Whitman wrote of bringing home young men he met in the streets of New York City, Brooklyn, or Washington, D.C. The author Horatio Alger, who was dismissed from his Massachusetts pulpit in 1866 for the "revolting crime of unnatural familiarity with boys," could avoid censure in New York, where he wrote about the street boys he so admired. There also is some evidence that on the western frontier, cowboys hired younger males to spend the night with them, and that soldiers in the army also sought and probably paid for young male company (D'Emilio and Freedman 1988, 123, 124). Although male sex work is not a new occurrence, it only recently has been the subject
of systematic investigation, in part because it is covert and highly stigmatized. Although there are no precise estimates, there are believed to be at least 100,000 and as many as 300,000 young men involved in prostitution in the United States (Cohen 1987; Deisher, Robinson, and Boyer 1982). Male sex workers are typically in their teens and are hired by older men (Coleman 1989). Heterosexual prostitution among youth is rare. Adolescent sex workers typically do not have pimps, partly because as men they do not feel as vulnerable as women and perhaps because as males they have been socialized to seek and expect independence. Although most sex workers in the United States are white, ethnic minority youth are overrepresented in the population (Fisher, Weisberg, and Marotta 1982).

There are two main categories of male sex workers. One subgroup, called "street sex workers," solicits clients in bars and clubs, on the streets, and in bus and train stations. By and large, these young men come from lower-class backgrounds and are more likely to use hard drugs, to have no other occupation, and to have more clients but fewer steady ones (de Graaf et al. 1994; Waldorf 1994). A second subgroup includes young men who solicit through escort agencies, phone chat lines, and the Internet. These young men, called "call boys," tend to come from middle-class backgrounds, have more education, and have stable living arrangements. With very few exceptions, research studies have focused on "street" sex workers. Thus the information about adolescent male prostitutes reviewed below describes the segment of the sex-working population at highest risk.

A boy's first hustling experience typically occurs at age fourteen, with more active
sex work involvement starting around age fifteen or sixteen (El-Bassel et al. 2000). However, the average age of initiation appears to be dropping, and it is becoming increasingly common to see boys as young as twelve engaging in sex work (Deisher, Robinson, and Boyer 1982). Some young men stay involved in prostitution for extended periods, whereas others move on to other means of supporting themselves, both legal and illegal. Because younger men are favored by clients and because the lifestyle is dangerous and taxing, few youth stay involved in sex work well past adolescence. Unlike female prostitutes, who may continue on far into adulthood, boys tend to age out of prostitution earlier, usually in their midteens and early twenties (Sponsler 1993).

Although there is variability, the pathways leading to male sex work are inextricably linked to family stress, parental alcohol or drug problems, and abuse and neglect, quite frequently compounded by poverty. Abuse is a major precipitant of running away from home. Other young men are "thrown away," that is, forced to leave, commonly because of their sexual orientation. Homeless youth who cannot or do not return home must learn strategies for survival. Although some obtain jobs in the formal economy, most lack the skills and resources to do so and must quickly turn to illegal or quasi-legal means of financial support. Indeed, homelessness is the single greatest risk factor leading to involvement in sex work. Although estimates vary widely, it appears that most young men involved in street-based prostitution are or have been homeless (Weisberg 1985). Young men are typically unaware of male prostitution before leaving home. In contrast to females, who are introduced to sex work by pimps, young men usually learn about it from
other youth, who may themselves be involved in it.

Boys support themselves through prostitution when they experience insurmountable barriers to work in the legal economy. For example, some are too young to get a work permit, and others do not have and cannot get the proper identification necessary for a job application. Others are concerned that working in the formal economy would allow them to be traced to their parents, from whom they have run away. And on average, sex workers have no more than a tenth-grade education (Fisher, Weisberg, and Marotta 1982), further limiting their occupational prospects. Nor are they aware of services or programs that might prepare them for the workplace. A substantial minority have children (El-Bassel et al. 2000), which creates additional financial pressures. Some young men turn to prostitution because they believe it is superior to other criminal survival acts such as selling drugs. In fact, young men are less likely to be arrested for prostitution than for other illegal activities (Coleman 1989).

Sex workers also experience psychological and social barriers to working in the formal economy. Early negative family experiences, including physical and sexual abuse, and leaving home at a young age contribute to developmental and skill deficits as well as mental health problems that interfere with traditional work. Sex work may be a way, and in many cases the only way, for a young man to have power and control over his life. Boys report that they gain various benefits from sex work. These include freedom, entertainment, excitement, sex, time to socialize, peer group support, and a favorable amount of income for nominal effort. In addition, a need for adult male attention and affection or a desire
to control adult males may be underlying motivations for some young men.

Substance use is a critical factor in the lives of many young men involved in sex work. Although boys may use drugs prior to becoming involved in sex work, typically substance use increases drastically after they begin hustling (Coleman 1989). Alcohol and marijuana are the substances most commonly used, followed by crack cocaine (El-Bassel et al. 2000). For some, substance use is a means of self-medicating for psychological distress and depression. Injection drug use is a serious concern for many and is linked to the transmission of HIV and hepatitis B and C. In two studies done on the West Coast, as many as 50–70 percent of adult street hustlers and almost 40–50 percent of nonstreet hustlers had injected a substance (Waldorf 1994). Rates of injection among adolescent sex workers are believed to be lower on the East Coast. The most commonly injected substance is heroin, with cocaine, amphetamine, crack cocaine, and ketamine also reported. Youth heavily involved in drug use are compelled to get income to support their habits; thus serious drug use is often tied to heavy involvement in sex work. Sex workers have great difficulty accessing drug treatment programs and often have trouble completing them, sometimes because they find the programs restrictive or the staff insensitive and intolerant.

Contrary to the stereotype, males involved in sex work do not necessarily view themselves as gay or bisexual. Indeed, as many as 25–40 percent view themselves as heterosexual and have female partners in addition to their male clients (El-Bassel et al. 2000; Pleak and Meyer-Bahlburg 1990). Although sex work may not be their preferred means of
supporting themselves compared to other illegal activities (e.g., robbery, burglary, selling drugs, and pimping young females), they turn to it when other avenues are closed to them. For many who identify as straight, having sex with men can trigger confusion, shame, and distress. Straight-identified sex workers are difficult for service providers to contact and assist. Gay and bisexually identified youth involved in sex work experience multiple sources of stress and isolation, often as a result of their stigmatized sexual identity as well as their involvement in prostitution. They are significantly more likely to run away from or be thrown out of their homes than their straight peers. And homelessness, as has been discussed, is a major risk factor for involvement in sex work. These boys are frequently marginalized and shunned by families and traditional services and systems.

A substantial minority of boys involved in sex work are "transgendered": that is, they experience their gender as female and dress and behave accordingly. They work in both female and transgendered "stroll" areas (sex work venues) and, less frequently, in male venues. Transgendered youth face special risks and are frequently the target of harassment and violence, both by clients and by other men. Anecdotal reports even suggest that they are murdered at higher rates.

Sex workers are infected with HIV, syphilis, hepatitis B and C viruses, and other sexually transmitted infections at high rates. Although there are few studies that focus exclusively on hustling youth, existing data indicate that 25–50 percent of young men involved in sex work are HIV-positive (Elifson, Boles, and Sweat 1993; Waldorf 1994), and even more are infected with syphilis and hepatitis B and
C viruses. Young men who inject drugs in addition to engaging in sex work are even more likely to be HIV-positive, although sexual risk is primary. More gay-identified youth are infected than their bisexual and heterosexual counterparts. However, most of the time young men contract infections through their romantic or unpaid partners, not through their sex work clients.

Contrary to the stereotype, sex workers do not customarily have unprotected sex with their paying partners. In fact, the majority of the time sex workers use condoms during anal intercourse, a potentially risky activity associated with the transmission of pathogens (Pleak and Meyer-Bahlburg 1990; Waldorf 1994). In addition to condom use, sex workers have a range of other strategies to reduce their risk of exposure to disease and of exposing their clients, including limiting the activities they agree to engage in (Browne and Minichiello 1995). It is typical for a youth to receive only oral sex, a lower-risk activity, and many will even use a condom while doing so. However, sex work does have its threats to sexual safety. Sex workers, regardless of sexual identity, are more likely to engage in risky behavior with steady clients, with those to whom they are sexually attracted, or when they are in dire need of drugs (de Graaf et al. 1994). Furthermore, congruent with their sexual orientation and desires, gay youth are more likely than straight or bisexually identified young men to receive anal intercourse, an activity associated with higher risk for exposure to sexually transmitted infections. (This does not mean, however, that straight-identified boys do not engage in the act, only that they do so less frequently. Sexual identity is a thorny issue, particularly for young people, and identity is, at best, an imperfect
predictor of behavior.) Based on these well-established behavior patterns and other epidemiological data, there is no good evidence that prostitutes are vectors of HIV transmission to their clients. Indeed, as noted above, boys are much more likely to be infected with a sexually transmitted infection, including HIV, by a romantic or nonpaying partner than by a paid client. Sex workers rarely use condoms with their male romantic or unpaid partners or with female partners. Because their working lives are associated with negative social judgments, it is important for young men to create a separate "sphere" in personal relationships, where condoms are not used (Joffe and Dockrell 1995).

However, sex work still has significant hazards. Violence is endemic, and young men are often victimized. The mortality rate for male sex workers is high; they overdose on drugs, contract fatal illnesses, and die from violence at high rates (Sponsler 1993). Sex workers, particularly transgendered youth, are targeted by police. Incarceration related to sex work, drugs, or other survival strategies is very common. The majority of adolescent male prostitutes have been arrested at least once (Weisberg 1985).

Prostitution can be psychologically and socially destructive. A young sex worker is at a critical stage of development, yet typically is not in school, lacks legitimate employment, and lacks access to positive and nonexploitative role models. As a result, he risks missing out on developing the personal, social, occupational, and educational skills necessary for success in the adult world. The stigma of prostitution is often internalized, and sex workers experience low self-esteem. Working at night and sleeping during the day further break down
Prostitution contact between the youth and the rest of the world. Heavy involvement in sex work can have a negative impact on relationships. Sex workers report that they have trouble forming stable relationships; intimacy is thwarted by fears of closeness or affection, the partner’s reaction to his past, and a dislike for control or restriction in relationships (Fisher, Weisberg, and Marotta 1982). Their need for services is great, including health care, mental health care, and drug treatments, as well as HIV prevention intervention. Yet there are numerous serious barriers to their receiving services. First, it is difficult for service providers to identify and reach the youth, given their unpredictable lifestyles. Outreach efforts in which the staff of community-based organizations go to the areas where youth are working are key and effective. However, not surprisingly given their difficult backgrounds, sex workers are generally disinclined to get involved with traditional systems, and many do not use services at all or are hesitant to do so. There also may be mismatches between sex workers’ needs and service structures and insufficient funding for appropriate services. Most traditional facilities are ill-equipped to meet their needs. The process of leaving sex work can be difficult and protracted. Sex workers get their social support on the street and experience stigma and ostracism from their old communities. Their financial needs and limited educational and occupational experiences do not dissipate over time. The longer they are involved in sex work, the more pronounced the mismatches become between their developmental competencies and what is expected of them by society, and the harder it is to join the “straight” world. Yet numerous
young men do leave prostitution by choice or circumstance. Physicians and social service organizations play an important role in this transition. Many take a harm-reduction approach, helping youth to stay safe if and when they engage in sex work and decreasing their reliance on sex work for survival, often by helping them obtain housing, addressing mental health concerns, and helping them develop skills for work in the formal economy. Others exit sex work on their own because of circumstances related to failing health, drug treatment, or incarceration.

Marya Viorst Gwadz

See also Same-Sex Relationships; Sexuality; Sexually Transmitted Diseases

References and further reading
Browne, J., and V. Minichiello. 1995. "The Social Meanings behind Male Sex Work: Implications for Sexual Interactions." British Journal of Sociology 46, no. 4: 598–622.
———. 1996. "The Social and Work Context of Commercial Sex between Men: A Research Note." Australian and New Zealand Journal of Sociology 32, no. 1: 86–92.
Cohen, M. 1987. Juvenile Prostitution. Washington, DC: National Association of Counties Research.
Coleman, E. 1989. "The Development of Male Prostitution Activity among Gay and Bisexual Adolescents." Journal of Homosexuality 17, no. 2: 131–149.
de Graaf, R., et al. 1994. "Male Prostitutes and Safe Sex: Different Settings, Different Risks." AIDS Care 6, no. 3: 277–288.
Deisher, R., G. Robinson, and D. Boyer. 1982. "The Adolescent Female and Male Prostitute." Pediatric Annals 11, no. 10: 819–825.
D'Emilio, John, and Estelle Freedman. 1988. Intimate Matters: A History of Sexuality in America. New York: Harper and Row.
El-Bassel, N., R. F. Schilling, L. Gilbert, S. Faruque, K. L. Irwin, and B. R. Edlin. 2000. "Sex Trading and Psychological Distress in a Street-based Sample of Low Income Urban Men." Journal of Psychoactive Drugs 32, no. 2: 259–267.
Elifson, K. W., J. Boles, and M. Sweat. 1993. "Risk Factors Associated with HIV Infection among Male Prostitutes." American Journal of Public Health 83, no. 1: 79–83.
Fisher, B., D. K. Weisberg, and T. Marotta. 1982. Report on Adolescent Male Prostitution. San Francisco: Urban and Rural Systems Associates.
Joffe, H., and J. E. Dockrell. 1995. "Safer Sex: Lessons from the Male Sex Industry." Journal of Community and Applied Social Psychology 5, no. 5: 333–346.
Maloney, P. 1980. "Street Hustling: Growing Up Gay." Unpublished manuscript.
Pleak, R. R., and H. F. Meyer-Bahlburg. 1990. "Sexual Behavior and AIDS Knowledge of Young Male Prostitutes in Manhattan." Journal of Sex Research 27, no. 4: 557–587.
Sponsler, C. 1993. "Juvenile Prostitution Prevention Project." WHISPER 13, no. 2: 3–4.
Waldorf, D. 1994. "Drug Use and HIV Risk among Male Sex Workers: Results of Two Samples in San Francisco." Pp. 114–131 in The Context of HIV Risk among Drug Users and Their Sexual Partners. Edited by R. J. Battjes, Z. Sloboda, and W. C. Grace. NIDA Research Monograph. Rockville, MD: National Institute on Drug Abuse.
Weisberg, D. K. 1985. Children of the Night: A Study of Adolescent Prostitution. Lexington: D. C. Heath.
R

Radio
See Great Depression; Toys; World War II
Rap
See African American Boys
Reading
See Books and Reading, 1600s and 1700s; Books and Reading, 1800s; Books and Reading, 1900–1960; Books since 1960
Reformatories, Nineteenth-Century
Reformatories were institutions created to punish and reform boys who were poor and homeless or who had committed crimes. Constructed first in northeastern cities and named "houses of refuge," reformatories for juveniles spread to other cities as the nation expanded westward. Most served only white boys, although some admitted black youths but kept them segregated from whites. Both the courts and parents of misbehaving boys committed youngsters to reformatories. These asylums closely resembled prisons, although they did provide boys with basic language, math, and vocational skills. Upon release boys either went to work for artisans, shopkeepers, and farmers or they returned to live with their natural families.

Prosperous, reform-minded New York City men founded the nation's first reformatory for boys in 1825. Concerned about the growth of youthful crime in the city and about the failure of adult prisons to reform young criminals, well-off New Yorkers formed the Society for the Reformation of Juvenile Delinquents in 1823 and completed construction of the New York House of Refuge two years later. Bostonians established a similar institution in 1826, as did Philadelphians in 1828. All three cities were in the earliest stages of industrialization, and their new factories attracted teenage farm boys seeking employment and excitement in the city. Such boys usually came to the city alone and unsupervised by parents. When not working, boys hung out on street corners, gambled, drank, and sometimes yelled obscenities at passersby.

Immigrants from Ireland and Germany also migrated to northeastern cities. Weakened by the long sea voyage, some died and left their children to manage on their own. Others made it to the United States, but fathers had to work long hours for low pay, and mothers had to labor so long and hard to provide basic food and housing for themselves and their families that children of immigrants often went unsupervised. They wandered the city streets, played pranks on one another, swam naked
Inmates march to their dorms at the Approved School, Jeffersonville, Indiana, 1938. (Bettmann/Corbis)
off the wharves in the summer, built huge bonfires for warmth and entertainment in the winter, stole apples from the stalls of peddlers, and in general made older, more established city residents uneasy.

Before the construction of reformatories, boys arrested for committing crimes entered city jails and, if convicted, adult prisons. By 1816, jails and prisons in New York, Boston, and Philadelphia were so overcrowded that boys were not always separated from adult criminals. Reformers feared both the rise of the juvenile crime rate and the probability of boys learning to commit ever more serious crimes in prison.

As cities appeared in the Midwest and Far West, they copied the institutions, like juvenile reformatories, already established in eastern cities. In both parts of the country, founders of reformatories were chiefly concerned with saving white males from lives of poverty and crime. Although the vast majority of children in reformatories were white boys, officials also admitted some black youths and some girls of both races. When reformatories admitted blacks, they usually kept them segregated from whites. Girls were also housed separately from boys. In the nineteenth century, few reformatories were built in the South, where most black boys and girls lived, perhaps because it was such an agrarian area and reformatories were chiefly an urban phenomenon.

By and large, Protestant men founded reformatories. They expected to teach the boys in their charge Protestant religious and moral values. Reformers were distrustful of immigrants, many of whom were Catholic, and were quite willing to try to convert the Catholic boys in their charge. Alarmed by the Protestantism preached in reformatories, Catholic parents and religious leaders in some cities eventually formed their own reformatories to preserve the faith and values of Catholic boys.

When a house of refuge or reformatory first opened its doors, the boys who entered came directly from prisons and jails. Courts soon committed others. Nonetheless, throughout the nineteenth century, most boys in reformatories had not committed crimes but were simply poor, homeless, or vagrant. The state had the right to intervene in the lives of poor youngsters thanks to the doctrine of parens patriae. Articulated first by a Pennsylvania court in 1838, the doctrine allows state governments to provide for a boy or girl whose natural parents do not properly care for them.

Eventually, not only courts but also impoverished parents placed boys in reformatories. Working-class parents expected their sons to find jobs as soon as they were physically able and to contribute a good part of their earnings to the family. Throughout the nineteenth century, when most men could not earn enough at unskilled jobs in factories to support their families, the labor of children, especially boys, was essential to the well-being of many families. When boys kept their earnings for themselves or refused to work or even to attend school, they angered their parents. Needy families were rarely willing or able to support a boy who neither earned his own keep nor helped his family financially. Parents sometimes committed such boys to reformatories for disciplinary purposes.

The first reformatories constructed were large, walled institutions. Boys were
housed in cells furnished only with a small cot, table, and chair. Windows had bars on them. Boys wore uniforms and marched from their cells to the washroom, the dining hall, the schoolroom, and back. Often officials expected boys to remain silent except during occasional periods of recreation. In the second half of the nineteenth century, some cottage-style reformatories were constructed in which boys of the same age lived in more homelike buildings, often supervised by a couple who served as parental figures.

In all reformatories boys spent most of their time at school and at work. Officials believed that if boys were to be saved from lives of poverty and crime, they had to acquire basic literacy skills as well as learn proper habits of work. Reformatories contained classrooms where boys spent about four hours a day learning reading, writing, and arithmetic from male and female teachers. Since most boys in these asylums were thirteen to fourteen years old and illiterate, they were a difficult bunch to teach. Many were foreign-born, had found English-speaking public schools unwelcoming, and so had attended them rarely. Others faced no language barrier but often had to stay out of school to help their parents and so fell behind in school and dropped out. Still others may have wanted to go to school, but until the 1880s in many cities there were not enough public schools to accommodate all who wanted to attend. Since most boys remained in reformatories for just one to three years, often in that time they could not catch up on all the education they had missed when younger. Nonetheless, teachers in reformatories estimated that by the time they were released, most boys could read the Bible, and many could also write a letter to their friends.
Just as important as formal education was vocational training. Reformatory officials believed boys had to be prepared for employment if they were going to be properly reformed. At first, reformatories made connections with contractors who paid asylum officials for the labor of boys in asylum workshops. There boys learned to bind books, cane seats in chairs, and make baskets and umbrellas. Most boys disliked the boring, repetitive labor in contract shops and rebelled against it, usually by not working very hard but occasionally by burning down asylum workshops. During economic depressions, contractors sometimes closed the shops entirely, leaving the boys with no work. By 1884, free adult laborers objected so strongly to contract labor in institutions that many states banned it. Thereafter, reformatory officials established their own workshops in which they tried to teach boys how to use tools and to make and maintain objects needed in the institution.

Boys did not always accept reformatory values and discipline passively. They created their own subculture within institutions. Older boys sometimes sexually exploited younger boys. Many boys made fun of youths who cooperated with officials and tried to follow the rules and reform. At the Western House of Refuge in Pennsylvania, boys who reported violators of the rules to officials were ridiculed by fellow inmates as "softies" or as "lungers" because they would "lunge" forward willingly to gain favor with officials. Catholic boys who resented Protestant efforts to convert them would sing out of turn and yell obscenities during Sunday religious services in asylums. The ultimate rebellion against reformatories was running away, and a minority of boys were always willing to do just that, even at the risk of physically injuring themselves. Boys stole ladders and scaled the walls of reformatories, only to fall and break arms or legs in the process.

Officials did not treat such infractions of the rules lightly. Punishments of boys in reformatories were often quite harsh. At the very least, a boy would be whipped. He might also be placed in a solitary, darkened cell, sometimes without clothes, and forced to eat bread and water for days at a time. At the Westborough Reform School in Massachusetts in the 1870s, officials locked disobedient boys into a "sweatbox" that was 10 inches deep by 14 inches wide, with three 1-inch slits for air holes. Boys who spent a week crammed in such a box often found it difficult to stand up and walk after being released (Pisciotta 1982, 415).

Officials expected that with proper discipline, education, and vocational training, boys would be reformed within one to three years after entering a reformatory. At that point officials preferred to indenture the boys out to local artisans or farmers, who would agree to feed and clothe the boys until they were eighteen years of age in return for their labor. In this fashion boys could be separated from their parents, whom asylum officials disliked and distrusted, and their reformation completed by hardworking citizens.

Until the Civil War, most boys left reformatories to be indentured. However, during the war the number of boys incarcerated in reformatories surged, probably because their fathers were at war and their mothers, working outside the home, were unable to keep them under control. Reformatory officials found it difficult to find enough persons willing to indenture boys. Many potential employers of boys were themselves preparing to fight in the war and unable to take on and train boy laborers. After the war, as mechanization of farming proceeded rapidly, many farmers, always the most likely to indenture boys, replaced boy labor with machine labor.

When indenturing became less popular, reformatory officials, unwilling to expend more dollars to retain boys until they were eighteen and old enough to live and work on their own, returned most boys to their families. Asylum officials never liked having to return boys to mothers and fathers who had presumably not cared for them adequately in the first place, but both boys and their parents probably preferred this arrangement. Boys did not have to live with strangers, often far away from their families, friends, and city homes. Parents did not lose their children permanently but regained their company and their labor fairly promptly.

Priscilla Ferguson Clement

See also Apprenticeship; Foster Care; Indentured Servants; Orphanages

References and further reading
Hawes, Joseph M. 1971. Children in Urban Society: Juvenile Delinquency in Nineteenth-Century America. New York: Oxford University Press.
Hess, Albert G., and Priscilla F. Clement, eds. 1993. History of Juvenile Delinquency: A Collection of Essays on Crime Committed by Young Offenders, in History and in Selected Countries. Vol. 2. Aalen, Germany: Scientia Verlag.
Mennel, Robert M. 1973. Thorns and Thistles: Juvenile Delinquency in the United States, 1825–1940. Hanover, NH: University Press of New England.
Pisciotta, Alexander W. 1982. "Saving the Children: The Promise and Practice of Parens Patriae, 1838–1898." Crime and Delinquency 28, no. 3 (July): 410–425.
Schneider, Eric C. 1992. In the Web of Class: Delinquents and Reformers in Boston, 1810s–1930s. New York: New York University Press.
Reformatories, Twentieth-Century
In 1900, the United States had fewer than 100 reformatories, almost all of them in the Northeast or the upper midwestern states. The average age of boys committed to them was fourteen, although some took boys as young as ten. The average stay lasted slightly less than two years. Most of the boys had immigrant parents, but approximately 15 percent of them were African Americans (Schlossman 1995, 375).

Over the course of the twentieth century, the administrators of the juvenile justice system became increasingly disenchanted with reformatories as a method for the rehabilitation of youth. Early in the century, reformers attempted to transform these institutions into surrogate homes and schools for poor boys and young criminal offenders. When this process failed, subsequent generations worked to remove most boys from institutional settings and place them into alternative programs for rehabilitation and training. Although these efforts did result in the removal of most noncriminal boys from reformatories, the implementation of these proposals never matched the rhetoric. Widespread social fears about increasing crime consistently forced the courts to make the removal of young criminal offenders from society their first priority.

A late-nineteenth-century movement to substitute family care for institutional care sparked some of the earliest outcries against reformatories. Armed with studies performed by experts in the newly emerging sciences of psychology and sociology, Progressive reformers protested the monotony and lack of individual attention for children in institutional care. Convinced that delinquency resulted from environmental factors rather than from the immorality of the individual
child, child welfare advocates argued that the efforts of reformatories to break down the will of a boy and compel his blind obedience to authority only increased the likelihood of his becoming a repeat offender. Instead, psychiatrists argued that children needed advisers to assist them in developing confidence in their own abilities. These reformers envisioned the newly created juvenile court system, first implemented in Chicago in 1899, as the type of system that could remain flexible and focus on the individual child (Rothman 1980, 215; Mennel 1973, 131–132).

Juvenile court judges retained broad discretionary powers in determining placements for the youths who came into their courts. They examined the totality of the juvenile's environment rather than simply the individual offense. Although their discretionary authority resulted in differing philosophies among these courts, most judges viewed reformatories as a last resort utilized only for serious or repeat offenders. In most juvenile courts, probation became the first option because it allowed children to remain with their families while receiving guidance and discipline from probation officers. If they deemed the home environment inappropriate for children (as they often did, especially in immigrant families), judges sought foster care options. Like Homer Folks, a leading reformer and champion of probation, many judges viewed incarceration of children as "opiates for the community" that "turn its mind away from its own serious problems" (Rothman 1980, 219–220).

Yet in certain cases, even the reformers acknowledged the necessity of institutionalization. As a result, many judges and reformers advocated the creation of new state schools for boys that emphasized not only basic educational skills but also vocational training. Judges generally committed older boys to these institutions (the average age of commitment rose slowly until by the 1930s it was sixteen), and they shortened their stays in these schools from an average of almost two years to approximately one (Schlossman 1995, 373).

Existing reformatories had to change both their methods and their public images in order to compete with these new schools. They changed their names from "houses of refuge" and "reformatories" to "training schools," "industrial schools," or simply "boys' schools." They attempted to shift from a military model to a campus one, utilizing the latest scientific techniques to train their boys to survive in an industrial society. Administrators extended school hours and moved away from cell-block or dormitory living arrangements and toward the cottage design, with a small number of children supervised by an adult with the highest educational and moral background. Ideally, children would have their own rooms and feel they lived in a normal, familial community. One prominent example of this model was the George Junior Republic, opened in Freeville, New York, in 1895. At this school children voted, held political office, and had jobs as bankers, judges, police officers, and store owners, among others. The children made money and were allowed to retain it in a savings account. Those citizens who broke the rules went to jail, where they wore striped suits and worked on rock piles. By 1920 a combination of politics and internal dissension caused the disintegration of this school, but other schools, most famously Father Edward Flanagan's Boys Town outside Omaha, Nebraska, followed its model in a modified form.
The Whittier State Reform School in California also implemented a radically different approach to juvenile rehabilitation. In 1912, its new director, Fred Nelles, emphasized the need for individual treatment programs based on psychological testing and counseling of inmates. With the assistance of the psychology departments at Stanford and the University of California at Los Angeles, Nelles recommended specific placements in cottages, classes, work assignments, and even recreational programs. He attempted to control the population of his school, focusing on younger, noncriminal boys who scored well on intelligence tests. He kept the boys in school for five and a half hours a day (an extraordinary amount of time for reform schools during this period), and beyond academics he emphasized character development through intramural sports and an active Boy Scouts program.

As a reform school system grew in the southern states, another new model arose in 1897 with the founding of the Virginia Manual Labor School for black youths. Most southern institutions admitted whites only, although a few had admitted both races while retaining strictly separate housing and curricula for whites and blacks. The Virginia Manual Labor School, however, hoped to provide agricultural and vocational training for blacks based on Booker T. Washington's philosophy of racial uplift through labor.

Even though the ideology of reformers and the juvenile court judges promoted dramatic change within the reformatory system, in reality the lasting changes did not match their ideals. Most historians agree that the number of incarcerated children did not decrease as a consequence of the strategies of the juvenile courts. A lack of funding by state legislatures meant an inability to garner equipment
for vocational training, to pay enough teachers to educate the inmates, and to hire psychiatrists to attend to therapeutic needs. Overcrowding destroyed the ideal of the cottage system. The undesirability of jobs at juvenile institutions meant that few employees had the required skills to assist the boys intellectually or emotionally. Military rules and language began to slowly return. Discipline and institutional maintenance, rather than rehabilitation, became the administrators' chief priorities. Training schools typically isolated their students from the outside world, allowing them to write one letter and receive one visitor a month. Corporal punishment remained standard, and students often attacked each other. A national survey of boys discharged from training schools indicated that only 55 of the 751 boys had records free of disciplinary action (Mennel 1973, 277). Sexual assaults among inmates regularly appeared as one of the leading causes of such actions.

The rise of scientific approaches to treatment also had disturbing consequences for juvenile offenders. Although many scientists believed criminal behavior derived from the child's environment, another group argued that such activities resulted from genetic abnormalities. These scientists attempted to identify physical traits that suggested a propensity toward criminal behavior in youths. They measured heads and the strength of children's grasps, among other physical characteristics, in an effort to identify children they labeled "moral imbeciles" or "defective delinquents." Subsequently, they attempted to establish separate institutions for these children and occasionally even sterilized them.

In the post–World War II period, the number of reform schools and inmates
increased substantially. Between 1950 and 1970, the inmate population increased over 75 percent, from 35,000 to 62,000. The number of publicly operated reform schools grew to almost 200, not including several hundred camps, group homes, and private institutions. Yet crime rates continued to increase, and the length of stay in these schools continued to decline to an average of less than one year by 1970 (Schlossman 1995, 383–384).

Sensitized by the civil rights movement and a growing distrust of law enforcement, critics began to question the methods of social control employed by both the juvenile court systems and reformatories. Lawyers, who had effectively been removed from the juvenile court process, argued that the system failed to protect the constitutional rights of the children to due process of law. Social scientists pointed to increasing crime rates as evidence that the juvenile justice system did not work. They viewed the expectations that the system could simultaneously prevent delinquency and solve nearly every youth problem as unrealistic. Furthermore, they theorized that entry into the juvenile justice system, and in particular periods of incarceration, stigmatized children and isolated them even further from the rest of society. Both groups emphasized the injustice of entrapping noncriminal youths in the system because of their poverty or unstable homes.

A final effort to achieve juvenile rehabilitation through reformatories resulted in the creation of youth authority agencies that centralized control over placement and treatment of delinquent youths. Advocates believed that rehabilitation problems would lessen once experts controlled the entire process. As in earlier efforts, a lack of funding and qualified personnel undermined these attempts. Ironically, however, this process led to the emergence of the deinstitutionalization and community treatment movements, which sought to promote nonresidential (or at least noninstitutional) methods of juvenile rehabilitation.

The most powerful youth authority developed in California, where in 1961 it led the movement to revise the state juvenile code. The new code distinguished more clearly between criminal and noncriminal juveniles within the system and increased the due process rights of youths. It also launched the Community Treatment Project in parts of Stockton and Sacramento. This social experiment grouped children according to their maturity level and assigned them randomly to either a reformatory or a community treatment program. A select group of youth authority agents, given both small caseloads and additional resources, supervised these youths. The results from this program allegedly demonstrated the superiority of community programs, in terms of both rehabilitation and financial savings to the state. Despite arguments that the scientific justification for the experiment was shallow and misleading, these results caused momentum in favor of community programs to grow during the 1960s and 1970s.

Most dramatically, the commissioner of the Massachusetts Department of Youth Services, after determining that he could not keep the schools "caring and decent," decided to close all reform schools in the state in 1972 and replace them with a network of small, generally nonsecure group homes. In 1974, Congress further encouraged this approach by passing the Juvenile Justice and Delinquency Prevention Act, which provided grant funds for the development of state and local delinquency prevention programs emphasizing diversion and deinstitutionalization.

Once again, however, these reforms did not effect widespread change. Incarceration rates dropped during the 1970s, but juveniles who would have gone to public institutions may instead have been sent to group homes and private custodial centers. By the 1980s, the rate of juvenile incarceration reached an all-time high of more than 200 juveniles per 100,000. More than 50 percent of this population was African American. Ironically, more recent developments in juvenile justice seem to advocate the sentencing of young offenders to penitentiaries rather than reformatories; the number of youths under age eighteen held in adult prisons rose from 3,400 in 1985 to 7,400 in 1997 (Talbot 2000).

Paul Ringel

See also Boys Town; Foster Care; Juvenile Courts

References and further reading
Mennel, Robert M. 1973. Thorns and Thistles: Juvenile Delinquency in the United States, 1825–1940. Hanover, NH: University Press of New England.
Miller, Jerome G. 1991. Last One over the Wall: The Massachusetts Experiment in Closing Reform Schools. Columbus: Ohio State University Press.
Rothman, David J. 1980. Conscience and Convenience: The Asylum and Its Alternatives in Progressive America. Boston: Little, Brown.
Schlossman, Steven. 1995. "Delinquent Children: The Juvenile Reform School." In The Oxford History of the Prison. Edited by Norval Morris and David J. Rothman. New York: Oxford University Press.
Talbot, Margaret. 2000. "The Maximum Security Adolescent." New York Times Magazine, September 10.
Religion
See Bar Mitzvah; Muscular Christianity; Parachurch Ministry; Preachers in the Early Republic; Sunday Schools
Revolutionary War
Boys, some as young as ten or twelve years old, played an important part in the American Revolution. In creating the Continental Army with George Washington as its commander in chief, the new Congress of the United States suggested a minimum age limit of sixteen for its servicemen. However, many recruiters did not ask too many questions about the age of volunteers when they had trouble finding enough men to meet their quotas. Most boys served the Patriot cause by enlisting in their state militia or by joining the Continental Army. Almost all served as ordinary soldiers, but a few became officers even though they were still teenagers. The youngest served as drummer boys. Some worked as spies behind enemy lines. Others joined the Continental Navy or served on privateers—pirate ships that preyed on enemy shipping. Those who served for the duration of the war usually had a combination of these experiences because they were moved between different kinds of activities as the war dragged on.

Complete statistics on the service of boys in the Revolution, like data for all other servicemen, are patchy. Records survive for only a few regiments and then only for short periods of time. One such record for a four-year period from New Jersey shows that approximately 10 percent of soldiers were under eighteen. Another from Maryland for the year 1782 shows 25 percent between fourteen and nineteen, and if calculated only
on native-born rather than foreign-born soldiers, the number becomes 37 percent. A Virginia study shows that about 5 percent of that state's troops were fourteen or fifteen years old. The preponderance of boys and young men in the army was a constant, but it seems likely that the number of younger soldiers increased as the war progressed and the army became more desperate for manpower.

Boys had a variety of reasons for enlisting. Some joined from support for the cause of independence from Britain, from a desire to be part of a big adventure, from peer pressure, or from a combination of these. A few had less conventional reasons. One young man, Eli Jacobs, served in the Massachusetts Continentals and the state militia for a variety of short terms from the age of fourteen but only signed on for a longer term following an argument with his stepmother (Dann 1980, 59).

Others were drawn into service by financial need. All during the war, states offered a variety of bounties, usually cash or clothing, to attract poor boys or men into the service. This was a particularly good way for boys to contribute to their families' welfare. Research indicates that for many poor boys, the bounty money was probably a significant inducement. In Maryland, the bounty was equal to one-quarter of the taxable property owned by the family of the average recruit. Enlistment brought needed ready cash into a family. Also, if the recruit was a dependent rather than a contributing family member, signing up meant he would now be fed and clothed by the army and no longer be an expense to his family.

Other boys helped their families by serving as substitutes for older family members. As the war progressed, states introduced a draft for eligible men.
However, the man drafted did not need to go if he could pay someone to go in his place or find a relative to swap places with him. For some poor families who could not afford to lose the labor or wages of a father or older brother, a younger son might be sent instead to meet the obligation. Yet if his father did go off to serve, a son's labor became even more essential to the family. It meant young boys had to assume greater responsibilities to their families at an earlier age.

One fifteen-year-old who wanted to be part of the big adventure was Joseph Plumb Martin of Massachusetts. He was living and working on his grandfather's farm when war broke out in 1775, and even though he did not really understand the Patriot cause, he very much wanted to be called "a defender of my country," and he was sure the Americans were "invincible." He was unsure of his ability to endure the hardships of a soldier's life, but finally he let his friends talk him into signing on. They bantered with him, saying, "If you will enlist I will." And so young Martin, at sixteen, found himself a soldier signing up for six months. He reenlisted when his term was up and continued to serve until the end of the war in 1783, during which time he endured much hardship (Martin 1993, 11–13).

Service offered some boys the opportunity for social advancement. Jeremiah Greenman was seventeen and not yet trained in any skill when he enlisted in the army in 1775. No information is available about his political knowledge at the time, although his diary indicates that he acquired some as the war progressed. His diary also shows us that he went from having basic literacy skills to being a sophisticated writer and a thoughtful reader. By the war's end, he had embarked on a program of self-education, choosing his
books carefully from the recommendations of others or popular advice books. The war also tested his resolve. In the first year, on the Quebec campaign he had endured hunger, cold, and a nine-month period as a British prisoner of war—all before his eighteenth birthday. After a time at home following that experience, the young Rhode Islander, now age nineteen, signed up again, this time serving as a sergeant. He seems to have thrived under his new responsibilities and by age twenty-one was an ensign in the Second Rhode Island Regiment (Greenman 1978).

Although many boys who served ended the war as poor as they began, that was not true for all of them. Some boys signed on to go to sea on American privateers in the hope of making their fortunes. A privateer was a privately owned vessel that operated under a special commission granted by Congress and was essentially engaged in piracy. Its mandate was to run down and seize enemy merchant ships. The goods seized in the attack were later sold and the proceeds split among the owners, officers, and crew. The potential rewards attracted men of all ages despite the considerable risks of life at sea and the danger of being seized by the British and either held as a prisoner or impressed to serve in the British Navy.

Joshua Davis and Ebenezer Fox were two boys who decided to try their luck on the high seas, but neither had the opportunity to make his fortune. Davis had served with the Massachusetts militia since he was fifteen but at nineteen decided to go to sea. Unfortunately, his ship was captured by a British frigate, and he was forced to serve in the British Navy for nine months, then was held in a prison at Plymouth, England, before again having to serve at sea for the British. The young man was
twenty-seven before he was able to escape and return to Boston, a free man, in 1787, four years after the war's end.

Fox was no more fortunate. He had run away to sea at the age of twelve in 1775 and worked as a cabin boy on an American merchant ship traveling to the West Indies. After war was declared, he narrowly avoided capture by the British and decided to go home to Rhode Island to pursue a barber's trade. Boredom prompted him to try his luck at sea again in 1779, and at sixteen he signed on to join the navy. Now he experienced battle against the British. He was part of a gun crew, responsible for swabbing out the cannon after firing and then ramming in the next shot of powder and ball. A few months later, the British captured the ship, and Fox was taken prisoner.

The unfortunate seaman was held on the notorious prison ship the Jersey, moored off Long Island. In order to leave that awful prison, he agreed to join the British Army and soon found himself with a British regiment in Jamaica. From there, after a few months, he made his escape by fleeing to Cuba where, since Spain was an American ally, he hoped to find a ship bound for America. He was not disappointed. He signed on the American frigate Flora, but before it sailed, he was impressed onto a French naval vessel. The military alliance between France and the United States did not prevent the French captain from seizing any potential crew members he might need, no matter what their country of birth. Making his escape from the French warship, Fox fled to the Flora again and eventually returned home after the war's end, glad to be back to the quiet trade of barbering at the old age of twenty (Fox 1838).

Boys like Greenman, Fox, and Davis were lucky to survive their prison experiences. Prisons were notorious for the
diseases and the consequent high mortality rates among the prisoners. When Fox was held on the Jersey, 1,000 other prisoners were already there, short of food, held in poorly ventilated space, with disease spreading rapidly among them. Of the three, Greenman had a somewhat better experience. Taken prisoner at Quebec in 1776, he and his fellow prisoners received adequate food. In winter, smallpox had been rampant among the prisoners, but he had not caught it. In the spring, Greenman noted that he and his fellows were able to keep themselves healthy and strong by playing ball in the yard. They were fortunate in that they were exchanged and released nine months after their capture. Although barefoot on his release, he was otherwise well.

The adventure of serving often led boys to use a wide variety of skills. Thomas Marble of Connecticut, for example, unintentionally served on land and sea. Enlisted in Connecticut regiments from the age of sixteen onward, Marble was with a detachment assigned to capture a British schooner and three sloops. To do this, he and his fellows were required to man whaleboats and sail them behind enemy lines to carry out the attack, which they did successfully. Before they could bring back their prisoners, they had to repair one of the sloops, which required all hands to turn their skills to ship repair too (Dann 1980, 327–330).

Younger boys usually saw less direct action in the war. Many of them served with their fathers in some capacity. Despite their young age, they usually received separate enlistment papers. They carried out a variety of duties, sometimes working as their fathers' servants, as messengers, or as drummers to the regiment. Israel Trask was ten years old when he went with his father, a lieutenant in the Massachusetts line of the Continental Army, to camp in 1775. The young Trask served as a cook and messenger, though he noted that his father collected his pay and rations. The son of Major Putnam, a boy of about thirteen, was also serving with this regiment as a drummer boy. That boy had the task, as many drummers did, of applying the lash as punishment to another soldier. The drummer hoped he could use his father's influence to get out of the unpleasant duty, but he was required to do it (Dann 1980, 406–414).

Sometimes boys could move more inconspicuously than men behind enemy lines and so were occasionally used as spies. One such was William Johnson, who at the age of sixteen was serving in the New Jersey state militia. At twenty he became a spy for George Washington, entering the city of New York as a black-market merchant. His task was to make careful observations of military activity in the city and report it back to Washington. In addition, the commander in chief used Johnson to spread untrue reports in New York of American military plans in order to mislead the British (Dann 1980, 353–357).

Caroline Cox

References and further reading
Coggins, Jack. 1967. Boys in the Revolution: Young Americans Tell Their Part in the War for Independence. Harrisburg, PA: Stackpole Books.
Dann, John, ed. 1980. The Revolution Remembered: Eyewitness Accounts of the War for Independence. Chicago: University of Chicago Press.
Davis, Joshua. 1819. Joshua Davis' Report. Collections of the New England Historical and Genealogical Society.
Fox, Ebenezer. 1838. The Revolutionary Adventures of Ebenezer Fox. Boston: Monroe and Francis.
Greenman, Jeremiah. 1978. Diary of a Common Soldier in the American Revolution, 1775–1783: An Annotated Edition of the Military Journal of Jeremiah Greenman. Edited by Robert C. Bray and Paul E. Bushnell. Dekalb: Northern Illinois University Press.
Lender, Mark Edward. 1980. "The Social Structure of the New Jersey Brigade." In The Military in America from the Colonial Era to the Present. Edited by Peter Karsten. New York: Free Press.
Martin, Joseph Plumb. 1993. Ordinary Courage: The Revolutionary War Adventures of Joseph Plumb Martin. New York: Brandywine Press.
Papenfuse, Edward C., and Gregory A. Stiverson. 1973. "General Smallwood's Recruits: The Peacetime Career of the Revolutionary War Private." William and Mary Quarterly 30: 117–132.
Sellers, John R. 1974. "The Common Soldier in the American Revolution." In Military History of the American Revolution. Edited by Betsy C. Kysley. Washington, DC: USAF Academy.
Rock Bands
Rock bands have played a significant role in the identity formation of adolescent and preadolescent boys since the early 1960s. Whether as fans or as participants in their own bands, boys have formed strong group identities around the existence of rock bands and the cultural messages they present, especially those messages concerned with authority figures, sexuality, and self-worth.

By the early 1960s, the rock band had become the dominant unit of popular music, replacing the traditional format of a bandleader, single musician, or vocal group working with an orchestra or session musicians. Following the formula of guitar, bass, and drums established by the blues bands of the early 1950s and taking their inspiration from early rock and roll musicians such as Chuck Berry and Buddy Holly, rock bands also changed the
sound of popular music by making the guitar the central instrument. Beginning with the California surf bands of the early 1960s, such as the Champs, the Ventures, the Surfaris, and, most important, the Beach Boys, these four- or five-piece bands began to work together as self-contained units. In many cases, they made their own arrangements and composed their own material, jobs that traditionally had been performed in the music world by individuals separate from the musicians themselves. The small size and singleness of purpose of rock bands led to their becoming close-knit entities, a feature enhanced by their homogeneous makeup. For at the same time that rock bands were developing, rock music began to split from its rhythm and blues roots; as it did so, followers of popular music generally split along race lines, with rock becoming a medium dominated mostly by white men. With a few notable exceptions—Jimi Hendrix, Janis Joplin, and more recently the Bangles, Los Lobos, Living Colour, and the riot grrrl bands—participation in rock bands, too, became almost exclusively white and male.

The great catalyst for most American boys' interest in rock bands was the arrival of the Beatles in the United States in February 1964. Their impact on popular music was instant and dramatic and paved the way for many other British groups as well as scores of imitators. Not only did they provide a blueprint for all rock bands that followed, but like the various forms of popular music that preceded them, they inspired fashions and sensibilities that were immediately adopted by teenage boys and girls. Their collar-length pageboy cuts and Cuban-heeled boots replaced the popular look of the 1950s—the leather motorcycle jacket
and slicked-back haircut. This was the first change in a cycle of fashions inspired by subsequent bands and musical genres. The Beatles look, in turn, gave way to the bell-bottom jeans, T-shirts, and long hair of the late 1960s hippies; this look dominated the 1970s until the spiked hair and thin black ties of the new wave/punk movement became prevalent at the end of the decade. During the 1980s, jeans and black T-shirts with band logos worn by fans of the heavy metal "hair" bands became the fashion. In the 1990s, the baggy clothing and unkempt appearance of the alternative and grunge scene and the black clothing, makeup, body piercing, and tattoos of the Goth movement supplemented these rock fashion styles. Although each trend sought to replace a previous one, many looked back to and coexisted with other trends. Both boys and girls identified with a specific look; the look and the music that accompanied it became a significant part of their evolving identity.

But rock fashions have not been the only force to shape adolescent male identities. For some fans, identities have been forged through the hobby of collecting the recorded music and memorabilia generated by a given band or musical genre. For others, a fascination with the lifestyles of the bands begun in the early teenage years manifests itself in later life in the desire to follow the band on tours, as demonstrated by the Deadheads, followers of the Grateful Dead, and fans of the band Phish. For these fans, the experience of seeing a particular band live, often on numerous occasions, creates a powerful group identity.

An equally powerful shared experience for a teenage boy comes from joining a rock band, and many boys have been inspired to participate in a "garage band,"
named after the traditional rehearsal space, where they would attempt to master a repertoire of easy-to-play songs. Again, the Beatles and the so-called British invasion bands that followed them proved to be an early catalyst for garage bands. The success of the Beatles created a widespread demand for inexpensive instruments, and manufacturers of cheap equipment enjoyed unprecedented sales following the Beatles' arrival in the United States. By the mid-1960s, one manufacturer of cheap guitars, Kay, was producing 1,500 guitars a day (Wheeler 1990, 240). Demand, however, was so great that the domestic market could not keep pace, and more than 500,000 guitars were imported into the United States in 1966 (Wheeler 1990, 234).

Although some boys were formally trained, the majority were self-taught. The rough and unschooled music produced by many of these bands in their attempts to imitate the style of rock music currently in fashion inspired several genres of rock music, including punk in the 1970s and grunge and alternative in the 1990s.

No one can say how many bands played only in the garage in which they practiced. Such is the volatile nature of teenage relationships that many probably lasted only a few rehearsals, and others played only one or two performances at private parties or teen dances in front of their friends. Even so, participating in a band gave and continues to give adolescent boys a sense of belonging to a small clique, with its accompanying status in the larger peer group. As such, membership in a band closely resembles membership in a gang, with shifting allegiances, fighting, bonding, and shared identity all a part of the collective experience. But the lure of playing in a band is so great for
those who experience it that many who participate in rock bands as adolescents continue well into adulthood.

Rock music's traditional messages—sex, rebellion, hedonism, and adolescent identity—both influence and are influenced by teenagers as they journey through puberty. Musician John Mellencamp's story, for example, reflects the situation of many; for him, the decision to become involved with music was clearly a rebellion against the family and community values of his childhood in Indiana. Like many rock icons such as Jim Morrison of the Doors and Kurt Cobain of Nirvana, Mellencamp recalled a problematic childhood associated with domestic violence. For him, rock music was a rejection of his domestic life; it "was something my friends and I calculated the right people would hate" (White 1990, 602). Such rebellion also finds voice in the lyrics of many rock songs exhorting listeners to party and indulge in drugs and alcohol, for many teens an ultimate rejection of family and community values.

For others, such as singer Jon Bon Jovi, involvement with music was more about sex and self-esteem. He recalled in an interview that when, at the age of fourteen, his first guitar teacher asked him why he wanted to play, he replied, "To get chicks, what else?" and added, "I was never very good at picking up girls, and I'm still not. . . . I had no lines, so music was my method" (White 1990, 763). Fellow New Jersey musician Bruce Springsteen also recalled the powerful effect the discovery of rock music had on the formation of his identity when he said, "I was dead until I was thirteen. I didn't have any way of getting my feelings out. . . . So I bought a guitar. The first day I can remember looking in a mirror and being able to stand
what I was seeing was the day I had a guitar in my hand" (Marsh 1996, 15–16).

Although many critics have heralded the death of rock music ever since its inception in the 1950s, the recent popularity of hard rock bands such as Korn and Limp Bizkit has shown that guitar-centered rock music still holds sway over a significant number of young white boys. It is a powerful tool in forming their identity at a crucial point in their development. Fashions in music, clothing, and appearance will undoubtedly change, but rock music in some form or another will probably exist for much longer, providing boys with a focal point for coming to terms with their personal and social identities.

Bruce Pegg

See also Music

References and further reading
Marsh, Dave. 1996. The Bruce Springsteen Story. Vol. 1, Born to Run. New York: Thunder's Mouth Press.
Wheeler, Tom. 1990. American Guitars: An Illustrated History. New York: Harper.
White, Timothy. 1990. Rock Lives: Profiles and Interviews. New York: Holt.
Roosevelt, Theodore
The statesman, environmentalist, and domestic reformer who served from 1901 to 1909 as the twenty-sixth president of the United States, Theodore Roosevelt also distinguished himself as a naturalist, outdoorsman, historian, soldier, explorer, commentator on contemporary affairs, and husband and father of six children.

Roosevelt was born into wealth and comfort in New York City on October 27, 1858, the second of four children and the older of two sons of Theodore Roosevelt, Sr., and Martha Bulloch Roosevelt. His
privileged boyhood was sheltered in certain respects but expansive and stimulating in others. It was also heavily burdened by the very severe asthmatic condition that plagued him until his college years. One can locate in Theodore's boyhood many roots of his adult character and of the morally upright, energetic, high-achieving man he was to become. The influence of his father was to prove especially profound.

Known as "Teedie" among his intimates, young Theodore lived for his first fourteen years in a five-story brownstone located at 28 East 20th Street, which was a wedding gift to his parents from his millionaire paternal grandfather, Cornelius Van Schaak Roosevelt, a plate-glass importer and real estate investor. Theodore's mother, Martha ("Mittie"), born in 1835, had grown up in a relatively prosperous but sometimes struggling slaveholding plantation family in Roswell, Georgia, from where she moved to New York upon her marriage in 1853. Theodore's father, Theodore, Sr., born in 1831, worked, along with an older brother, in grandfather Cornelius's firm. Theodore's three siblings were Anna ("Bamie"), born in 1855; Elliott ("Ellie"), born in 1860; and Corinne ("Conie"), born in 1861. The two other members of the household (aside from servants) were Martha's mother "Grandmamma" (until her death in 1864) and, as governess, Martha's older sister Anna (until her marriage in 1866). This core group of eight was frequently augmented by visiting relatives and family friends; Edith Carow ("Edie"), Corinne's favorite playmate and Theodore's childhood pal and future wife, was prominent among the latter.

For the young people who lived there, the Roosevelt home was a happy and protective place. Both Martha and Theodore,
Sr., were warm and loving and extremely attentive to the needs of their children. Of the two, Martha was the more vivacious, eccentric, and literary, traits that apparently were transmitted in abundance to her older son. The father, however, was the more dominant presence and the more influential figure. Decades later, in an autobiography written after his presidency, Theodore worshipfully remembered his father as

the best man I ever knew. He combined strength and courage with gentleness, tenderness, and great unselfishness. He would not tolerate in us children selfishness or cruelty, idleness, cowardice, or untruthfulness. As we grew older he made us understand that the same standard of clean living was demanded for the boys as for the girls. . . . With great love and patience, . . . he combined insistence on discipline. . . . He was entirely just, and we children adored him. (Roosevelt 1985, 7–8)

The sheltered upper-class life of the Roosevelt children extended to their education, which was entrusted to private tutors, the first of whom was their aunt Anna. With the exception of a very brief and unsuccessful trial enrollment at a small private school near his home, Theodore's first experience as a student in an educational institution came at Harvard University in 1876. Because his parents clearly recognized his academic aptitude and interests, Theodore probably would otherwise have been sent to a top-flight boarding school before beginning his higher education. They were deterred from taking this path by the persistence of their older son's struggle with bronchial asthma, which
had begun around the age of three. Theodore’s asthma attacks were recurrent and extremely debilitating, at times bringing him to the verge of suffocation. Theodore, Sr., the boy’s principal source of comfort and rescue when an asthma attack struck, spared no effort or expense as he and the young victim jointly battled the terrifying disease. All sorts of remedies were attempted, ranging from such dubious ones as consuming black coffee, swallowing ipecac, and smoking cigars to such potentially more constructive approaches as carriage rides for fresh air, summers in the countryside, and a regimen of rigorous physical exercise. Nevertheless, the condition was not finally conquered until Theodore was in college. Between the ages of two and six, Theodore lived in a home in which the Civil War was an ever-present reality, particularly in light of his sectionally mixed parentage. Young Theodore came to side wholeheartedly with his fervently Unionist father, whereas the sympathies of his mother, aunt, and grandmother were solidly with the Confederacy. In deference to the sensibilities of his wife, whose brothers and other relatives were fighting for the South, Theodore, Sr., did not take up arms for the Union, although he hired a substitute for $1,000 after the draft was instituted in 1863. But he did work tirelessly, meeting with President Abraham Lincoln in Washington, D.C., to win legislation for and then to help to organize the voluntary allotment to their families of a portion of the soldiers’ pay. On East 20th Street, his son Theodore was increasingly demonstrative about his pro-Union position. But whether at home or away, Theodore, Sr., was never sparing in expressions of his great love for his wife, and the marriage and the family’s
unity survived the emotional strain of the war in remarkably good shape. Between 1865 and 1871 Theodore, although asthmatic and quite frail, lived a mostly happy life among his parents and siblings and their small, elite social circle. He was an avid young reader who was particularly attracted to stories of adventure and heroism, and he was full of curiosity about the world. When he was seven or eight, his chance viewing of a dead seal laid out on a Broadway sidewalk generated tremendous excitement and appears to have sparked his lifelong passion for natural science. Between May 1869 and May 1870, the Roosevelts experienced an extensive transatlantic tour through Great Britain, Belgium, the Netherlands, Germany, Switzerland, Italy, Austria, and France. During the trip Theodore, who genuinely enjoyed seeing so many new places in the company of his family, kept a detailed diary marked by language that was rather precocious for a ten- to eleven-year-old. Moreover, the endurance displayed by Theodore as he hiked in the Alps with his father and the apparent benefit of these hikes to the boy’s health led the father to advise the son soon after the journey: “Theodore, you have the mind, but you have not the body, and without the help of the body the mind cannot go as far as it should. You must make your body. It is hard drudgery . . . , but I know you will do it” (quoted in McCullough 1981, 112). The youth vowed to heed his father’s summons, and so began an enduring commitment to strenuous physical activity. The period 1872–1873 constituted a turning point for Theodore. After a humiliating encounter with two stronger boys during the summer of 1872, he began taking boxing lessons. Around the same time he received his first gun and
learned to shoot, and he learned taxidermy. Perhaps most important, he realized that his vision was severely impaired and acquired his first spectacles, which, he said, “literally opened an entirely new world to me” (Roosevelt 1985, 19). In October 1872 the Roosevelts embarked on a second overseas odyssey, this one significantly more dramatic and rewarding for Theodore than the first. The family again visited England and continental Europe but in addition followed a greatly expanded itinerary that included Egypt, Palestine, Syria, Turkey, and Greece. For Theodore the leading highlight was the winter of 1872–1873, which was spent living on a well-staffed dahabeah (houseboat) that carried the group about 1,200 miles up and down the Nile River. Theodore’s preoccupation during these months was collecting bird specimens, an activity abetted by the breech-loading shotgun presented to him by his father as a Christmas present and by his family’s generous tolerance of foul-smelling taxidermal operations on deck. He was also fascinated by Egypt’s ancient treasures. During most of the period from May 1873, when their father returned to the United States, until their own departure several months later, Theodore, Elliott, and Corinne lived as guests of the Minkwitz family in Dresden, Germany, where Theodore applied himself assiduously to the study of German and other subjects. When Martha and her children arrived back in New York in November 1873, they took up residence in an “infinitely more luxurious” new uptown house complete with a well-equipped gymnasium on the top floor at 6 West 57th Street, near Central Park (McCullough 1981, 135). Theodore’s grandfather Cornelius had passed away in 1871, leaving
each of his offspring a veritable fortune, and Theodore, Sr., put the best of everything into the family’s new house, even as he increasingly devoted time and resources to his many and varied philanthropic pursuits. In 1874 Arthur Cutler, a young Harvard graduate, was hired as Theodore’s tutor and immediately began to prepare him for the Harvard entrance examination. Theodore’s education up to that point had been uneven; he was very advanced in natural science, history, geography, German, and French but lagged in mathematics and classical languages. He and Cutler focused on the deficient areas, and the pupil’s diligence bore fruit. He passed the exam during the summer of 1875, after which he continued to study with Cutler in anticipation of his impending enrollment at Harvard. During his precollege years, Theodore blossomed physically and emotionally as well as intellectually. From 1874 on, the Roosevelts spent long summers on the northern shore of Long Island in Oyster Bay, New York, where as an adult Theodore would build Sagamore Hill, his and Edith’s permanent home and the summer White House during his presidency. These youthful summers were a time of rapid personal growth and immense happiness for Theodore, with the wild and beautiful natural environment, the presence of teenage girls, and an abundance of time for reading. During the summer of 1876, Theodore’s sister Anna, who often served as a third parent for the other three children, went up to Cambridge, Massachusetts, and secured a second-floor room in a boardinghouse at 16 Winthrop Street, which would serve as Theodore’s primary residence until his graduation from Harvard in 1880. On his own for the first time, he threw himself
into his studies and his recreational and social activities while maintaining a regular correspondence with his parents and siblings in New York. In one letter to his father, he wrote: “I do not think there is a fellow in college who has a family that love him as much as you all do me, and I am sure that there is no one who has a father who is also his best and most intimate friend, as you are mine” (quoted in McCullough 1981, 165). At Harvard Theodore neither smoked nor gambled, drank only in moderation, and was determined to (and did) remain sexually “pure” until marriage. (He would marry Alice Lee shortly after his graduation, tragically lose her to Bright’s disease in 1884, and marry Edith Carow in 1886.) Even though the orientation of his churchgoing Presbyterian family had always been far more worldly than spiritual, Theodore also taught a Sunday school class in Cambridge. His intelligence, stamina, and diligence enabled him to achieve academic success as a freshman and to begin to excel as a sophomore in German, natural science, history, and rhetoric. His health was unexpectedly good; apparently, his asthma had finally been beaten. And his first published work, “The Summer Birds of the Adirondacks in Franklin County, N.Y.,” a pamphlet coauthored with Harvard classmate Henry Minot, was produced in 1877. For most people, it is nearly impossible to pinpoint the moment of transition from childhood to adulthood. For Theodore Roosevelt, however, it is easy: February 9, 1878, the day his forty-six-year-old father succumbed to stomach cancer after a brief, futile struggle. It was “the blackest day of my life,” the young man soon afterward recorded in his diary. Theodore, Sr., had “shared all my
joys, . . . and soothed all the few sorrows I ever had. . . . The days of unalloyed happiness are now over forever” (quoted in Brands 1997, 85). Notwithstanding his grief, Theodore was fortified by images of his literary heroes and “discerned his duty. He must bear up under this trial and conduct himself as his father would have wished” (Brands 1997, 85). In the wake of his exceedingly painful loss, he matured rapidly, willingly assuming new responsibilities for his siblings and his mother. Having inherited from his father $125,000, a huge sum in 1878, he had the advantage of financial independence as he made the sudden transition from boyhood to manhood. The father would never cease to be the son’s compass; any contemplated course of action could always be evaluated in relation to the father’s exacting moral standards. The father’s devotion to and nurturing love for his wife and children became the model for the son’s similar devotion and nurturing. The father’s aversion to idleness would be honored by the son’s advocacy of and enthusiastic engagement in “the strenuous life” and by his uncommonly diverse and astonishingly productive career. And the father’s patriotism, hostility to corruption, and extensive philanthropic endeavors undoubtedly guided the son toward military service, public service, concern for the downtrodden, and an ambitious agenda of domestic reform. From 1878 until his own death in 1919, the adult Theodore Roosevelt would live a life he deemed worthy of his idealized father and in the process leave an enormous imprint on his country and the world.

William N. Tilchin

See also Muscular Christianity
References and further reading
Brands, H. W. 1997. T. R.: The Last Romantic. New York: Basic Books.
McCullough, David. 1981. Mornings on Horseback. New York: Simon and Schuster.
Morris, Edmund. 1979. The Rise of Theodore Roosevelt. New York: Coward, McCann, and Geoghegan.
Roosevelt, Theodore. 1985. Theodore Roosevelt: An Autobiography. 1913. Reprint, New York: Da Capo Press.
Runaway Boys
Family life has never been uniform or stable in America, and the family evolved as the nation changed from rural and agrarian to urban and industrial and then to the contemporary suburban postindustrial society. Over time, boys have left their homes in response to a variety of personal and social problems. As early as the colonial era, some minors left home without permission, and many were confined with adult paupers and the indigent in municipal workhouses or almshouses until their relatives claimed them or they were placed out as indentured servants or apprentices. Colonial-era selectmen found the runaway apprentice, like the bastard child or juvenile delinquent, a chronic problem. Benjamin Franklin (1706–1790) began his career as a Philadelphia printer in 1723 as a runaway Boston apprentice, and Benedict Arnold (1741–1801) was a Connecticut druggist’s apprentice in 1755 when he ran away to join the army. To some degree, eighteenth-century vagrants were discouraged by local laws against wandering strangers, but by the 1840s the railroad made working-class tramps more mobile. In addition, industrial America needed seasonal workers on farms, in orchards, on ranches, in mines, in lumber camps, and in canner-
ies. Boys who left home, voluntarily or not, often became tramps or hoboes, the migratory workers who hiked or hopped freight trains in search of work. In 1886, Harry Houdini (1874–1926), the magician and escape artist, ran away from home at age twelve to begin his theatrical career. The novelist Jack London (1876–1916) was a teenage hobo in the 1890s, as in more recent times were Supreme Court justice William O. Douglas; attorney Melvin Belli; and songwriters Merle Haggard, Roger Miller, and Bruce “Utah” Phillips. By the 1870s the “tramp menace” had begun to alarm civic leaders, who feared hordes of runaway boys and men descending on towns and cities. Charles Loring Brace, who founded the New York Children’s Aid Society, described this social problem in The Dangerous Classes of New York and Twenty Years Work among Them (1872), as did the detective Allan Pinkerton in Strikers, Communists, Tramps, and Detectives (1878). Partly to deal with homeless youths, Massachusetts founded the nation’s first state reform school in 1846, and antebellum Boston courts held juvenile sessions to separate young lawbreakers (runaways, truants, shoplifters, and petty thieves) from adult offenders. This reform led to the establishment of more formal juvenile courts in Boston, Chicago, and Denver by 1906. The most famous (fictional) runaway American boy was Huckleberry Finn, Aunt Sally’s orphaned foster child, who planned “to light out for the Territory” in the 1850s. The best-selling novelist Horatio Alger, Jr. (1832–1899), centered 100 melodramatic rags-to-riches novels on plucky boys who left home to seek fame and fortune in cities as newsboys, “street Arabs” (Mark the Match Boy, 1869), boot-
The director of a private charity that operates a shelter for runaways comforts a child in the dormitory. (Steve Raymer/Corbis)
blacks (Ragged Dick, 1867), or street musicians (Phil the Fiddler, 1872). Alger found inspiration on his visits to the ragamuffins and guttersnipes at Charles Loring Brace’s Newsboys’ Lodging House, who provided vivid material for his fiction. In the era between the Civil War and World War II, many American teenage boys dreamed of becoming hoboes and living an exciting rambling life, hopping freight trains or riding the rails beneath Pullman cars. The poet Robert Service celebrated this “race of men who won’t fit in . . . for theirs is the curse of gypsy blood,” wanderers who created their own hobo slang, songs, sign language, and codes. Running away to go to sea; to join the army, navy, or the circus; or to become a cowboy has long been an American tradition, and the star-
struck girls who left home since the 1920s hoping to become Hollywood starlets have male counterparts too. For every rebellious flapper in the Roaring Twenties, there was a young sheikh who rejected puritanical values and lived for the present. The growing problem of juvenile delinquents living on the Omaha streets attracted the attention of an Irish Catholic priest, Father Edward J. Flanagan (1886– 1948). In 1917 he founded what would later be called Boys Town to provide a rural home for runaway boys, winning public support for the project and becoming a recognized expert on homeless juveniles. During the Depression an estimated 4 million Americans rode the rails, and at least 250,000 of these transients
were runaway boys. Many large construction projects, and later the Tennessee Valley Authority and the Hoover Dam in the 1930s, attracted young tramps to remote work sites. Boys who rode trains illegally (without paying) were the subject of socially conscious Hollywood movies like Wild Boys of the Road (1933) and Boys Town (1938), as well as a more recent PBS documentary, Riding the Rails (1999). The Depression and World War II disrupted millions of lives, and the postwar era of conformity did not embrace all Americans. Although historical data are not specific, the incidence of runaway boys in the twentieth-century United States may be related to the increased rate of divorce and the fragility of the nuclear family in industrial and postindustrial society. In 1957 the novel On the Road by Jack Kerouac (1922–1969) inspired a new generation of boys to leave home for adventures exploring the country, from New York City’s bohemian Greenwich Village to San Francisco’s North Beach. The Beat generation hitchhiked across the country, indulging in sex, drugs, and alcohol and rejecting mainstream social values. These young people dropped out of society, detaching from family connections and finding freedom in perpetual movement. In the 1960s the Beats influenced the counterculture hippies, attracting many runaway flower children to the Haight Ashbury district in San Francisco. Runaway boys in all eras have faced serious problems. Many who entered the hobo ranks were victimized by predatory homosexual men called “jockers,” who recruited and exploited these boys as their “punks.” A more recent form of this exploitation in big cities concerns men called “chickenhawks” recruiting run-
away boys or girls as prostitutes, often seeking inexperienced youths at urban bus and train stations. These are only some of the dangers runaways have faced. The brutal reality of life on the road was often overlooked or unknown, and even the most popular American illustrator, Norman Rockwell, depicted a runaway boy as a humorous innocent long after the dangers were understood. Most runaways found only loneliness, poverty, hunger, and exploitation in life on the road, quite unlike the romantic myths of the carefree hobo existence. Nonetheless, there is some evidence that boys who run away from home benefit from both positive and negative experiences by learning to be more self-confident and independent. Some boys return home more willing to accept parental guidance and to communicate more with parents. Anger with one parent is frequently cited by runaways as the reason for leaving home, and puberty is a confusing time for parents and their children as sexual maturity and insecurity become important issues. The act of running away may be an impetus to develop new strengths. A boy may run away from home to reaffirm his faith in his own independence and then return home to follow normal routes to socially acceptable goals. Still, for every boy who runs away from home to establish his independence, there are many more who leave home because of poverty. The number of American children living in poverty increased from 3.2 million in 1978 to 5.2 million in 1998, meaning that 22 percent of young children in the United States lived in poverty. Although most of these millions of children are African American (40 percent) or Latino (38 percent), the incidence
of poverty among young white suburban children (16 percent) grew rapidly after 1975. These families include boys most at risk for running away due to poverty, family conflict, child abuse, or school problems. By 1971 runaways were recognized as such a serious problem that the National Runaway Switchboard was established in Chicago for crisis intervention with young people in danger of leaving home or attempting to return home. In cooperation with social service organizations and the Greyhound Bus Company, hotline staff encouraged runaways to phone their parents and offered a free bus ticket to their hometown. The number of runaways to San Francisco in the 1950s and 1960s prompted the creation of Huckleberry House in 1967, which may be the oldest of the modern shelters for runaways who distrust established social service agencies. Covenant House opened in New York City in 1968 as a free shelter for street children. Founded by a Franciscan priest at Fordham University, it has become a well-known resource for street children. In its first twenty years, Covenant House served 100,000 runaways and inspired similar shelters in dozens of other cities. Today there is hardly a city in the United States that does not have a runaway population and nonprofit shelters to provide temporary care. Ironically, as runaways became more common in so many cities, they seemed less noticeable as a social problem. The temporary shelters and services became a permanent feature of modern social services. Rising juvenile delinquency rates in the period 1950–1960 led to an overhaul of the juvenile justice system, culminating in the Juvenile Justice and Delinquency Prevention Act of 1974, a watershed law deinstitutionalizing status
offenders. This law altered state responses to the runaway problem and diverted boys from jails and adult correctional facilities to community-based treatment and rehabilitative shelters. However, the trend toward decriminalization provided federal and state funds for nonsecure shelters, inadvertently increasing the number of runaways on the streets. By 1999 National Runaway Switchboard data demonstrated that 1.3 million of the 63 million children in the United States were homeless or runaway youths or both. About 35 percent of those who seek help from social service agencies are boys, most of them fourteen to seventeen years old (86 percent) and on the road for one week or less (61 percent). About seven out of ten runaway boys return home, but those who do not go home frequently find disease and violence on the streets. They often succumb to drug and alcohol abuse, malnutrition, and high-risk behavior leading to a desperate life of panhandling, prostitution, and short-term, poorly paid jobs.

Peter C. Holloran

See also Alger, Horatio; Boys Town; Franklin, Benjamin; Great Depression; Indentured Servants; Juvenile Delinquency; Orphanages; Poverty; Prostitution

References and further reading
Anderson, Nels, and Raffaele Rauty. 1998. On Hobos and Homelessness. Chicago: University of Chicago Press.
Bock, Richard, and Abigail English. 1973. Got Me on the Run: A Study of Runaways. Boston: Beacon Press.
Brevada, William. 1986. Harry Kemp, the Last Bohemian. Lewisburg, PA: Bucknell University Press.
Flynt, Josiah. 1972. Tramping with Tramps. Montclair, NJ: Patterson Smith.
Holloran, Peter C. 1994. Boston’s Wayward Children: Social Services for Homeless Children, 1830–1930. Boston: Northeastern University Press.
Minehan, Thomas. 1934. Boy and Girl Tramps of America. New York: Farrar and Rinehart.
Nackenoff, Carol. 1994. The Fictional Republic: Horatio Alger and American Political Discourse. New York: Oxford University Press.
Raphael, Maryanne, and Jenifer Wolf. 1974. Runaway: America’s Lost Youth. New York: Drake Publishers.
Schaffner, Laurie. 1999. Teenage Runaways: Broken Hearts and Bad Attitudes. New York: Haworth Press.
Uys, Errol Lincoln. 1999. Riding the Rails: Teenagers on the Move during the Great Depression. New York: TV Books.
Whitbeck, Les B., and Dan R. Hoyt. 1999. Nowhere to Grow: Homeless and Runaway Adolescents and Their Families. New York: Aldine de Gruyter.
S

Same-Sex Relationships
Throughout history and across cultures, some boys within a society have been sexually attracted to other boys. In turn, some cultures at particular historic moments have encouraged, or even required through ritualistic means, boys to engage in same-sex behavior and emotional intimacies. Within these historical and cultural contexts, it is clear that some boys have shown greater interest in these intimacies, preferring over their life course to be primarily involved with other males. This pattern reflects a same-sex sexual orientation, which must be distinguished from same-sex behavior, which in turn must be distinguished from a same-sex identity (a label that is culturally available that one chooses for oneself). Although boys in most cultures and throughout recorded time have engaged in sexual behavior with other boys, some of these youths might very well have had a same-sex sexual orientation; however, only recently has this pattern of same-sex attractions, erotic desires, and behavior been given a name—gay or bisexual. Three distinctions are necessary to understand modern concepts of same-sex relations among prepubescent and adolescent males.

• Boys engage in same-sex behavior for a variety of reasons and for various lengths of time (homosexual behavior).
• Boys can be preferentially attracted to other males for an extended (perhaps lifetime) period (homosexual or bisexual sexual orientation).
• Boys are increasingly accepting this designation and coming to acknowledge to themselves and others their membership in a class of individuals (homosexual sexual identity).

These three may be consistent (a gay-identified young man who is exclusively attracted to and engages in sexual behavior with other males) or inconsistent (a heterosexually identified young man who is attracted to both males and females and exclusively engages in sexual behavior with other males). These definitions are compounded when cultural context is considered. For example, a Brazilian young man is permitted by his culture to engage in same-sex behavior as long as he embraces machismo, or masculinity, by assuming traditionally dominant male roles, such as being the top, or inserter, during anal intercourse. Under these circumstances, he would not be considered gay, bisexual, or inappropriate. It is his receptive male partner who is considered to have had homosexual behavior and thus is labeled
gay. Such cultural variations in the structure and meaning of juvenile male same-sex attractions and behaviors have been observed and recorded by social scientists for decades. Transgenerational same-sex behavior, historically the most common, involves culturally sanctioned or ritualized sexual relations among boys and older male “mentors.” The boy may be as young as seven years, and the mentor may be a late adolescent, a young adult, or an elder in the community. Sexual relations occur once or continue for several years until the boy reaches physical or social maturity. This relationship may be culturally prescribed if deemed necessary for the boy to develop physically and spiritually into a man. Such behavior has been observed in ancient Greece and Rome, pre–Middle Ages Europe, Africa, various Islamic societies, early modern Japan and China, some Native American societies, and Polynesia. For example, in some Melanesian societies the homoerotic is indelibly coupled with the sacred, such as boy-inseminating rites performed in the belief that a prepubescent boy must receive semen through oral or anal sex donated by postpubertal males if he is to grow, masculinize, and become a social (warrior personality) and reproductive member of his society. Institutionalized marriages between boys and adolescent males have been documented in other cultures. Thus, modern attempts to demonize, criminalize, or pathologize male-male sexual relationships are inconsistent with the vast sweep of the historical and cross-cultural record. In nearly two-thirds of nonindustrialized societies, select male-male sexual interactions are both normative and socially acceptable. Although cultures may ritualize same-sex relations among their juvenile males,
they seldom encourage exclusive same-sex relations as a lifetime pattern. The Greeks considered same-sex relationships (in which sexual relations were an essential component) between a youth and a male somewhat his senior to be the highest form of love; however, eventual heterosexual marriage and reproduction were expected of all. The historical record also indicates that even in cultures that disapproved of or condemned male same-sex sexual conduct, more latitude was given to juvenile male-male sexual interactions, perhaps because the behavior was perceived to be less meaningful and less indicative of a lifelong pattern. As one 1900 guide for raising healthy children suggested, the sexual appetite is at its highest level during the most selfish period—adolescence. If girls are not available, then better that boys “experiment” with each other than deflower young girls. This perceived “homosexual phase” is believed to disappear once more “mature” (that is, heterosexual) sexual relations are culturally feasible. Despite these cultural variations, one universal appears: regardless of cultural or historical context or judgment, some boys will be driven by their sexual orientation to preferentially desire, seek, and enact sexual relations with other males. Their proportion within any given population is difficult to determine but is likely to be under 10 percent. Among the Sambians of New Guinea, although all young boys begin their sexual careers at age seven exclusively with other males, ingesting their semen through oral sex, all marry at age sixteen and rarely have sexual relations with other males after age eighteen. However, around 5 percent of males continue to seek same-sex relations throughout their lives. Thus, 100 percent of Sambian boys behave homo-
sexually, but 5 percent appear to be homosexual (Herdt 1987). Social science research has seldom focused on same-sex behavior or orientation but rather on youths who identify as gay or bisexual. Little is known about the meaning and context of same-sex behavior among boys or the prevalence and lifestyles of those who have a gay or bisexual sexual orientation but who do not identify as such. Whether these individuals are typical of other male adolescents who are gay by orientation or behavior but do not identify as gay is unknown. Professional journal articles about gay and bisexual boys began to appear in the 1970s and increased dramatically in the 1990s. The first empirical study on homosexual/bisexual youths was published thirty years ago in a leading medical journal. The protocol focused on sexual experiences, the disclosure of sexual identity to others, participation in the gay subculture, and counseling experiences. Many of these youths were Seattle hustlers; the number of sexual encounters with men ranged from 1 to 3,000, with a median of 50. Suicidal gestures, emotional turmoil, and problematic behavior were common among the sixteen- to twenty-two-year-olds, the result of their self-identification as gay or bisexual at an early age and subsequent marginalization or rejection by parents and others. This study stood as the sole empirical investigation for fifteen years before several medical and clinical investigations, based on interviews or questionnaires given to small samples of troubled teenage boys who sought the services of mental or social support agencies, examined the stressors that placed boys at high risk for physical, mental, and social ill health. Most such youths reported school problems, substance abuse, and emotional
difficulties and had a history of sexually transmitted diseases, running away from home, and criminal behavior. Several had been hospitalized for mental health issues. Among these samples, at least one in three had attempted suicide, and others who had not attempted it said they would consider it in the future. The researchers concluded that the stigma attached to homosexuality in American society made it highly unlikely that a boy could avoid personal and spiritual problems as the net outcome of acquiring a gay or bisexual identity. Many health care professionals were so pleased to have any information on this heretofore invisible population that few raised methodological objections about the population sampled or the instruments used in these studies or protested the conclusions drawn about gay youth from these findings. Advocates cited this research to argue for the inclusion of gay youths in the deliberations of physicians, therapists, and educators and to help youths adapt to their stigmatized identity, which was believed to be inherently problematic. The goals were to alleviate the distress of gay youths, who often faced violence, discrimination, disdain, and ignorance, and to promote interventions and programs for them. Indeed, the most frequent topics in health-related articles on gay youth published since 1970 include suicide, acquired immunodeficiency syndrome (AIDS), victimization, violence, pregnancy, and sexual abuse. Absent from this list are articles on resiliency, coping, and good mental health or articles describing youths with same-sex attractions who live productive, happy lives—or even articles describing normative development for gay youths. Based on the limited social science research to date, it is apparent that gay/bisexual boys:
• Are in many respects similar to all boys, regardless of sexual orientation. They too struggle with age-appropriate developmental tasks, such as negotiating attachment and separation from parents, linking their sexual desires with emotional intimacy, and assessing their standing among peers. Research has demonstrated that many variables, such as self-esteem, age of puberty, age of first sexual experience, and the negative impact of peer teasing, do not vary by sexual orientation.
• Are also distinct from heterosexual boys because of both biological and social factors. Research has shown that sexual orientation is the result to some degree of genetic or early biological origins. Thus it is to be expected that in some domains same-sex-attracted boys diverge from heterosexual youths. One common example is the hormonally induced demasculinization and feminization of gay youths. In addition, given the inevitable heterocentric and homonegative culture in which most adolescents live, it is to be expected that individuals with same-sex desires, even those who do not claim a gay or bisexual identity, are affected. Although all boys are susceptible to being called “faggot” and “gay,” gay youth are ridiculed more frequently, and the name-calling is likely to have more of an impact on their sense of self and safety.
• Vary among themselves, perhaps even to the extent of being more similar in some regards to heterosexual youths than to each other. Although stereotypes emphasize a “gay lifestyle,” it is clear that gay youths do not come in one package. Some are football players, whereas others are dancers; some are insensitive, selfish clods, whereas others are empathic, giving angels; some have had a thousand sex partners, whereas others are virgins. Young gay and bisexual boys as a group are more similar to heterosexual boys than to same-sex-attracted girls in preferring visual sexual stimuli and casual sexual encounters.
• Represent a minority of those with same-sex attractions. High school surveys reveal that 1 percent of young men report that they are gay or bisexual, but five to ten times that number report that they have same-sex desires, attractions, or behavior. Should only the former be the centerpiece of our research and attention? Too often information on gay/bisexual boys is derived from sexual identity categories rather than from their underlying sexual desires and behaviors.

The new generation of same-sex-attracted boys is clearly not as troubled as research would seem to indicate. More often than not, they reject sexual identity labels as a meaningful barometer of their lives. Indeed, such designations have become so plentiful and blurred that they are in danger of becoming irrelevant as predictors of behavior. The focus needs to shift from their troubled lives to their resiliency and abilities. The vast majority cope quite nicely with their sexuality and negotiate a healthy life. As a result of the 1990s visibility revolution, same-sex-attracted youth face the prospects of a far better life than that experienced by any previous generation. American culture
has changed and continues to change in their favor.

Ritch C. Savin-Williams

See also Prostitution; Sexuality; Sexually Transmitted Diseases

References and further reading
Bass, Ellen, and Kate Kaufman. 1996. Free Your Mind: The Book for Gay, Lesbian, and Bisexual Youth—and Their Allies. New York: HarperCollins.
Ford, Clellan S., and Frank A. Beach. 1951. Patterns of Sexual Behavior. New York: Harper and Brothers.
Herdt, Gilbert. 1987. The Sambia: Ritual and Gender in New Guinea. New York: Holt, Rinehart, and Winston.
Herdt, Gilbert, and Andrew Boxer. 1993. Children of Horizons: How Gay and Lesbian Teens Are Leading a New Way out of the Closet. Boston: Beacon.
Nycum, Benjie. 2000. XY Survival Guide: Everything You Need to Know about Being Young and Gay. San Francisco: XY Publishing.
Ryan, Caitlin, and Donna Futterman. 1998. Lesbian and Gay Youth: Care and Counseling. New York: Columbia University Press.
Savin-Williams, Ritch C. 1990. Gay, Lesbian, and Bisexual Youth: Expressions of Identity. Washington, DC: Hemisphere.
———. 1998. “. . . And Then I Became Gay”: Young Men’s Stories. New York: Routledge.
Savin-Williams, Ritch C., and Kenneth M. Cohen. 1996. The Lives of Lesbians, Gays, and Bisexuals: Children to Adults. Fort Worth, TX: Harcourt Brace College Publishing.
Schoolbooks
Schoolbooks define not only the content of the school curriculum but also values, interpretations, and beliefs that adults want children to acquire from schooling. The content is called the “manifest curriculum,” and it derives from a curricular chain that originates with a needed curriculum, which is translated by profes-
sional organizations and state and local educational agencies into a desired curriculum. Textbook publishers then create from the desired curriculum the prescribed curriculum. The second component of schoolbooks, values and beliefs, is the hidden or latent curriculum and appears in the specific roles assigned to different sexes and family members within stories; the characterizations given of different ethnic, racial, and religious groups within geographies and histories; the settings selected for word problems in mathematics; and other treatments that portray or imply what is acceptable, desirable, and valuable in American society. On top of these components but less obvious to the student is a pedagogical apparatus that consists of questions, suggested projects, review, assessment, and even aids to learning and recall. Although the majority of a child’s reading in the elementary school is from school textbooks as just defined, more and more often since 1990, schools have assigned trade books and original stories written for boys as well as girls. These are distinguished from textbooks primarily by their more focused content and lack of a pedagogical apparatus. Over the entire history of schooling in the United States, separate textbooks for boys and girls were not produced; nevertheless, the treatments of boys and of girls in these books have varied. For example, in stories in reading textbooks written prior to the 1960s, boys would most often succeed at their endeavors by themselves, whereas girls, if they did succeed, did so usually with the assistance of an adult. Boys attending school in the seventeenth and eighteenth centuries started reading from a hornbook, which was a small board covered with a single, small sheet of paper imprinted with the alphabet
Learning to read—an illustration in McGuffey’s second reader, ca. 1840 (Library of Congress)
in upper- and lowercase; syllables such as ib, ab, and ob; and the Lord’s Prayer. Next in progression came the New England Primer, perhaps a speller such as Thomas Dilworth’s A New Guide to the English Tongue, and later the Bible and a Psalter. Noah Webster’s Blue-Back Speller, originally published in 1783 with the ponderous title A Grammatical Institute of the English Language, Part I, became a school favorite by 1790 and remained so for at least fifty years. The first major geography book published in the United States, Jedidiah Morse’s Geography Made Easy, appeared in 1784 and, like several other geographies published in the United States in the late eighteenth and early nineteenth
centuries, displayed little tolerance for religions other than Protestantism. What drove both the manifest and the latent curriculum in the early colonial period was religious training. No separation was made then between church and state, and textbooks similarly reflected the importance of Christianity in everyday life. The New England Primer, for example, which remained in use until almost the middle of the nineteenth century, begins with the alphabet, a syllabarium, and graded word lists but moves quickly to alphabet rhymes, “A— In Adam’s Fall, We sinned All,” and so on (Ford 1899, 69). These introductory exercises are followed by a catechism, Lord’s
Prayer, and Apostle’s Creed, all reflecting the Calvinist concern with the child falling into the clutches of the devil. Just after the middle of the eighteenth century, however, more secular versions were imported from England, and after the American Revolution, morality and character building replaced the outright presentation of religion, particularly in reading textbooks. Nevertheless, the Bible continued to appear in the school curriculum in some parts of the country until at least the Civil War. Through the nineteenth century, a schoolboy would have encountered stories that more and more exalted individual achievement and material gain, reflecting changes in Protestant doctrine that occurred through that century. Nationalism was also a central feature of the latent curriculum, especially during the first half of the nineteenth century. With the common school (i.e., free public school) movement, beginning in the period 1830–1850 and with a continuing flood of immigrants entering the United States, the market for schoolbooks grew rapidly. Simultaneously, transportation, particularly by water, improved so that textbook publishers could reach a national market. The first to exploit this opportunity was the Cincinnati-based publishing house Truman and Smith, which developed a series of arithmetics written by a former schoolteacher, Joseph Ray, and a series of readers edited by William McGuffey and his brother Alexander Hamilton McGuffey. Ray’s arithmetics, consisting of four basic texts plus variations on them, sold more than 120 million copies from their first publication in 1834 until well into the beginning of the twentieth century. The McGuffey readers were equally successful, selling slightly more than 122
million copies and making millionaires of a number of their publishers. McGuffey himself received $1,000 in royalties for his efforts; in addition, he and his brother were paid a smaller amount for writing the fifth and sixth readers but were not given any royalties on these volumes. The McGuffey readers were revised frequently through the nineteenth century, the last major revision occurring in 1878–1879, after which about 60 million copies were sold. A boy who read the first editions, beginning in 1836, would have encountered a Calvinist world of sin and damnation with considerable praise for the western frontier, that pristine, uncluttered territory reaching through Ohio to the territories approaching the Mississippi River. A boy reading any of the series after the 1878–1879 revision would have encountered a far more secular world where regional interests were replaced by national ones. Even Abraham Lincoln’s Gettysburg Address was ignored to avoid offending potential markets in the South. After the period 1880–1890, achievement imagery began to decline in U.S. schoolbooks and was replaced by a social ethic. This change reflects a shift in American society from an agrarian life in which individual inventiveness was required to an urban existence in which cooperation within groups or communities was more important than individual action. The Protestant ethic that stressed hard work, competition, and individual thrift was replaced by a social ethic that derived its creativity and self-realization through group membership. The frontier was closed, both physically and mentally. For boys looking toward adulthood, expansion could no longer occur along horizontal dimensions but only along a vertical one, advancing within society rather
A teenage student with a pile of schoolbooks (Skjold Photographs)
than away from it. At the same time, educational psychology and the child development movements began to influence both the form and the content of schoolbooks. Type size was enlarged further, especially at the primary levels, vocabulary and sentence length were regulated, and illustrations became more common. A linear sequencing from simple and concrete to complex and abstract was adopted for most school subjects, leading to more child-centered schoolbooks at the lower grades, with an emphasis on self and family in social studies, nature stories in reading, and simple word problems based on everyday objects in arithmetic. Whereas the schoolbooks found in most schools prior to the end of the nine-
teenth century were selected by the school or school district, between 1890 and 1920 twenty-two states, mostly in the South and Southwest, instituted statewide adoption practices. Although the number of textbooks selected for each subject varied across these states, state adoption began as a means for controlling undesirable or illegal marketing practices by the publishers. In time, statewide adoption also became a mechanism for controlling the content of textbooks, particularly in the South where religious resistance to the teaching of evolution was strong. In 1925 Tennessee passed its famous monkey law, the Butler Bill, making it unlawful to teach any theory that denied the story of the divine creation of man as described in the Bible. Mississippi, Arkansas, and Oklahoma also passed similar laws, although Oklahoma’s was quickly repealed. From the Scopes Trial in 1925 to the 1982 U.S. District Court ruling against the required teaching of creationism to the 1999 decision by the Kansas Board of Education to remove questions on evolution from the statewide science test (reversed on February 14, 2001), the battle over the teaching of evolution has flamed intensely or smoldered but never totally burned out. With regard to schoolbooks, the main effect until recently has been a refusal by publishers to include discussions of evolution in science textbooks for the schools. Since 1990, however, major adoption states such as Texas and California have demanded adequate teaching of evolution, and the major publishers have reversed their earlier positions. Schooling and especially schoolbooks of the nineteenth century have often been attacked for their overemphasis on rote learning. However, many of the nineteenth-century schoolbooks that a boy
would have encountered also encouraged logical reasoning and problem solving. Stoddard’s American Intellectual Arithmetic (Stoddard 1866, iv) stressed the need “to invigorate and develop the reasoning faculties of the mind.” Asa Gray’s Botany for Young People, Part II claimed that it was written “to stimulate both observation and thought” (Gray 1875, vii). Similarly, a textbook on English grammar required written analyses of sentence structure that included what today would be called transformational analysis: for example, changing phrases such as “heavy gold” into sentences (“Gold is heavy”), changing declarative sentences to interrogative ones, and combining multiple short sentences into a single long one. Prior to the Civil War, only a small number of textbook titles were published in the United States in languages other than English. The most common of these were the German-language ones, such as the Hoch-Deutsches Lutherisches ABC und Namen Büchlein für Kinder (1819). Afterward, however, non-English-speaking boys might have encountered bilingual readers in two types of schools. In the large cities of the Midwest that had substantial German-speaking populations (e.g., Milwaukee, Cincinnati, Chicago), German-English readers were often available. In some of these schools, the classroom language for certain days of the week was German and for the other days was English. (A small number of bilingual texts were also published in other languages, such as Spanish.) The other place where bilingual textbooks were encountered was on the Indian reservations, where missionary schools sometimes used bilingual readers. In most such schools, however, only English was allowed. In the last decades of the nineteenth century, several states, including Wiscon-
sin, passed laws requiring that core academic subjects—reading, writing, arithmetic, and American history—be taught only in English. (The Wisconsin law, called the Bennett Law and passed in 1889, was repealed in 1891, but in several states the laws remained.) With the outbreak of World War I, more states banned the teaching of school subjects in languages other than English (e.g., Nebraska). Nevertheless, after World War I, the Department of the Interior developed educational programs for immigrants and used materials in several languages developed by Frances Kellor, director of the New York League for the Protection of Immigrants. As mentioned earlier, racial, religious, and ethnic slurs occurred in U.S. schoolbooks from at least the end of the eighteenth century. This practice continued into the twentieth century until after World War II. The 1902 edition of Alexis E. Frye’s Grammar School Geography, in speaking of the “black or Negro race” (Frye’s emphasis) south of the Sahara desert, asserted: “Such natives are very ignorant. They know nothing of books; in fact, they know little, except how to catch and cook their food” (Frye 1902, 33). Appleton’s Elementary Geography defines Jews almost solely as people who “reject Christ as the Messiah” (1908, 15). The only fact of interest presented about Jerusalem is “Here our Saviour was buried . . .” (89). Some history textbooks from the 1920s and 1930s claimed that immigrants to the United States from eastern and southern Europe were ignorant, lacked respect for law and government, and might even want to see the U.S. government destroyed. Similar bias was expressed into the 1940s and 1950s against blacks and Chinese, with more subtle bias saved for Jews and occasionally
Catholics. Bias appeared both directly and indirectly, through language, stereotypical descriptions, and illustrations. In addition, it appeared negatively through the failure of textbook authors, particularly of history texts, to take a moral stand on issues such as segregation and racial quotas. For fear of offending any particular market segment, textbooks became more and more bland and morally blind during the twentieth century. One researcher, in an extensive analysis of history textbooks, defined the perspective taken by the majority of the authors as the “natural disaster” theory of history (FitzGerald 1979). Events such as poverty, discrimination, and Watergate just happened; no individuals were ever named as responsible. Earlier in the twentieth century, analyses of history texts found a strong tendency toward extreme nationalism, what Arthur Walworth (1938, viii) called the “drum and trumpet” school of history. In 1986, People for the American Way, an organization that has periodically reviewed school textbooks, found the majority of the major U.S. history textbooks to be free of bias and to encourage critical and creative thinking. They also found a reversal of the dumbing down of the texts reported by an earlier review committee (Davis et al. 1986). Civics textbooks, however, were criticized by another review committee for their failure “to encourage young people to uphold their rights and carry out their obligations as citizens” (Carroll et al. 1987, i). On the style of writing, the committee reported, “They are good reference materials, but they read like the Federal Register” (Carroll et al. 1987, i). Those who espouse a revisionist view of schooling in the United States view textbooks as an integral part of a hegemonic system, with those at the top carefully co-
ordinating their actions to ensure that those below them remain in their places. Through manipulation of the latent curriculum, morality, docility, respect for authority, and the other trappings of a compliant, faithful worker are instilled in boys. Even if people and organizations with divergent goals, resources, and morals could coordinate their efforts and somehow gain uncontested control of the latent curriculum, there is little evidence to show that the aspirations, beliefs, and behaviors of boys are much affected by what they see in schoolbooks.

Richard L. Venezky

See also Schools for Boys; Schools, Public

References and further reading
Appleton’s Elementary Geography. 1908. New York: American Book Company.
Carroll, James D., et al. 1987. We the People: A Review of U.S. Government and Civics Textbooks. Washington, DC: People for the American Way.
Davis, O. L., Jr., et al. 1986. Looking at History: A Review of Major U.S. History Textbooks. Washington, DC: People for the American Way.
De Charms, Richard, and Gerald H. Moeller. 1962. “Values Expressed in American Children’s Readers: 1800–1950.” Journal of Abnormal and Social Psychology 64: 136–142.
Elliott, David L., and Arthur Woodward, eds. 1990. Textbooks and Schooling in the United States: Eighty-Ninth Yearbook of the National Society for the Study of Education, Pt. 1. Chicago: National Society for the Study of Education.
Elson, Ruth M. 1964. Guardians of Tradition: American Schoolbooks of the Nineteenth Century. Lincoln: University of Nebraska Press.
FitzGerald, Frances. 1979. America Revised: History Schoolbooks in the Twentieth Century. Boston: Little, Brown.
Ford, Paul L., ed. 1899. The New England Primer. New York: Dodd, Mead.
Frye, Alexis E. 1902. Grammar School Geography. Boston: Ginn.
Gray, Asa. 1875. Botany for Young People, Part II: How Plants Behave. New York: Ivison, Blakeman, and Taylor.
Hoch-Deutsches Lutherisches ABC und Namen Büchlein für Kinder. 1819. Germantown, PA: W. Billmeyer.
Monaghan, E. Jennifer. 1983. A Common Heritage: Noah Webster’s Blue-back Speller. Hamden, CT: Archon Books.
Stoddard, John F. 1866. The American Intellectual Arithmetic. New York: Sheldon.
Venezky, Richard L. 1992. “Textbooks in School and Society.” Pp. 436–461 in Handbook of Research on Curriculum. Edited by Philip W. Jackson. New York: Macmillan.
Walworth, Arthur. 1938. School Histories at War. Cambridge: Harvard University Press.
Schools for Boys
More than two centuries before Horace Mann’s common school movement gave rise to coeducational, state-supported public schools in the 1840s, a lively complex of religious and independent boys’ academies was thriving along the eastern seaboard. New York City’s Collegiate School, which today enrolls 650 boys between kindergarten and grade twelve, was founded in 1628 in conjunction with the Dutch Reformed Church. Because the school was forced to suspend operations during the American Revolution when the British occupied the city, Boston’s Roxbury Latin School, founded in 1645 by the Reverend John Eliot, “Apostle to the Indians” (1604–1690), is able to claim that it is the oldest American school in continuous operation. Today Roxbury Latin enrolls just under 300 boys in grades seven through twelve and boasts one of the highest scholastic profiles in the United States. In the colonial era, young children were schooled in “dame” or “petty” schools in which lessons in reading and writing were
conducted by women and in which both boys and girls might be enrolled. However, more advanced education in early colonial academies was typically available only to boys. These academies were called “Latin” or “free” schools because they culminated in the study of Latin grammar and literature. The schools were open to all boys within a designated township, often without charge. The practice of limiting enrollment in Latin schools to boys in the seventeenth and eighteenth centuries derived from the prevailing assumption that only boys needed an advanced education because universities and learned professions were then open only to men. After the colonial period, as the new nation expanded westward, the “academy” model accommodated boys and girls in school together, although teachers often positioned boys and girls on opposite sides of the classroom and sometimes set them to different tasks. Since their colonial beginnings, there has been a substantial connection between American boys’ schools and their British forerunners. Before the American Revolution, suitably placed colonial boys of means would often be sent to school abroad, a practice greatly reduced but by no means eliminated after the war. Educated colonials who did not send their children abroad for their schooling were likely to send them to nearby Latin or grammar schools, such as Roxbury Latin, modeled directly on classics-oriented English grammar schools. A boy might enter such a school at age twelve or when he had demonstrated sufficient literacy. He would matriculate to university or other training when he was fourteen to seventeen years of age, depending on his facility in Latin. If a boy was unable to master the requirements of the higher
Boys walking down the hallway at Georgetown Preparatory School, ca. 1940 (Library of Congress)
Latin courses, he would depart, as Benjamin Franklin departed the Boston Latin school, and seek employment in farming or in the trades. In the mid-nineteenth century a burst of reform in the English schools had a decisive effect on the shape of American education. Thomas Arnold’s Rugby School, which had so transformed Tom Brown in Thomas Hughes’s classic school saga, Tom Brown’s School Days (1857), was also transforming England’s other great private schools (called “public” in Britain) and grammar schools. These schools were organized into relatively small, relatively autonomous “houses,” each overseen by
a master and matrons but governed largely by older boys who were empowered to deal out physical punishment and to command menial labor. Up through this often elaborate hierarchy, small, disestablished "fags" (temporary servants to older boys) like Tom Brown might rise to become "bloods" or "swells" (school heroes) with governing responsibilities of their own. This organizational scheme was grounded in a new wave of evangelistic piety and, combined with a phenomenal new emphasis on competitive "house" and school sports, came to be known as "muscular Christianity." This Victorian notion of school, extolled with such en-
thusiasm by Hughes in Tom Brown's School Days, by Horace Vachell in The Hill (1905), set at Harrow School, and in thousands of stories in Boys' Own Paper and other publications for youth, would by the turn of the century embed itself firmly in the English national consciousness. This development occurred despite the fact that the schools and the milieu extolled were the exclusive preserve of the middle and upper classes. Although overshadowed nationally by Mann's common school movement, the impact of Arnold's Rugby School in the United States was felt in the reform of a number of private academies and in the founding or revitalizing, mainly in New England and the mid-Atlantic states, of college preparatory schools—for example, St. Paul's in New Hampshire, Groton and St. Mark's in Massachusetts, and Lawrenceville in New Jersey. These new schools characteristically instituted a house system with prefects, stratified students by "forms" instead of classes, and established traditions of dress, custom, and privilege; but not all schools created a cricket program, as did St. Paul's. Typically, these new schools, like Arnold's Rugby, had strong religious foundations. Many of the founding headmasters saw their school mission primarily as a Christian commitment; such was the Reverend Endicott Peabody's orientation to Groton, the Reverend Henry Augustus Coit's to St. Paul's, and Father Frederick Sill's to Kent. Each of these men was personally acquainted with the English school model, as were hundreds of other American schoolmen of remarkably similar vocations. Even Sill, who established Kent School and vowed not to make it resemble an English country house or to train Kent School boys to be squires, had the English model firmly in
mind, incorporated substantial parts of it, and used the rest as a point of departure. Without question the American private schools have looked to England, not only for structure but for a sense of continuity with their classical inheritance. Toward the end of the nineteenth century, the ranks of leading American boys’ schools were swollen by the founding of dozens of new “country day” schools, many of them west of the Appalachians. Cleveland’s University School (1890), Baltimore’s Gilman School (1897), Washington, D.C.’s St. Albans School (1909), and Dallas’s St. Mark’s School (1933) were heavily subscribed from the outset and soon demonstrated their capacity to prepare boys from those cities for leading universities. Although independent boys’ schools thrived throughout most of the twentieth century, they never—even when combined with hundreds of strong parochial boys’ schools—enrolled as much as 5 percent of the American school population. As the post–World War II baby boom generation began reaching school age, many of the defining features of American private schools and of boys’ schools in particular were energetically challenged. In a wave of what some cultural commentators have called the “adolescentization” of the United States, a number of traditional schools were substantially transformed between 1968 and 1975. An affluent generation called into question restrictive dress codes, limited opportunities for unsupervised socializing, required religious observances, and Saturday classes. Market research at the time revealed that the well-to-do were finding private schools, especially boarding schools, resistible. Some prospective students and their families felt single-sex schools to be especially astringent, and in
consequence there was a significant movement toward coeducation led by such premier boys’ boarding schools as the Phillips academies at Exeter and Andover, St. Paul’s, Hotchkiss, and Groton. When the coeducational conversion of former single-sex schools tapered off in the mid-1970s, the gender composition of independent schools had changed significantly. After the relatively late conversion to coeducation of strong boarding schools like Deerfield (1989) and Lawrenceville (1987), only a few boys’ boarding schools remain in New England. However, south of the Mason-Dixon line, a number of exemplary boys’ boarding schools, such as Virginia’s Woodberry Forest School and Tennessee’s McCallie School, continue to thrive. Throughout the wave of conversion to coeducation, most established day schools for boys were able to retain their founding mission. The movement to coeducation challenged many boys’ schools to articulate their mission and educational assumptions more forcefully. Former headmaster (St. Mark’s in Dallas and St. Paul’s in Concord, New Hampshire) and education historian David Hicks published an essay in The American Scholar (1996, 524–525) in which he argued that the modern coeducational boarding school had forgotten its founding purpose, which was not to enhance the opportunities and creature comforts of the privileged but rather to purge such children of unearned entitlements and “softness.” The astringency of independent school life, including its temporary separation of the genders, was intentional, Hicks argued, a part of an overall program of “salutary deprivation.” As boys’ schools reaffirmed their distinctive mission, they began articulating developmental and learning differences in boys and girls. Gender-based varia-
tions in tempo and learning style have been identified from the preschool through the high school years. Primary school girls generally demonstrate reading and writing proficiency earlier than boys do; middle school and high school boys’ mathematical-logical capacities accelerate more rapidly than those of girls. Females reach the peak of their pubertal growth spurt a year or two sooner than boys. Each gender-based physiological difference is accompanied by distinctive psychological and social adjustments. J. M. Tanner (1971) has demonstrated that girls’ skeletons and nervous systems are at birth more fully developed than those of boys and that the maturational gap increases somewhat through early childhood. From their preschool years through their late teens, boys reveal a number of other gender-specific contours in their skeletal, motor, and neurological development. Boys develop language skills, the capacity for quantitative analysis, and large- and small-muscle proficiencies at a developmentally different tempo from girls. The observed differences in boys’ and girls’ learning patterns have encouraged boys’ schools to avoid inappropriately hastening the arrival of fine motor and language skills while seeking more opportunities to incorporate large-muscle motor activities in learning. Boys’ schools from the early grades through high school are increasingly emphasizing physical application of learned principles. An all-boys learning environment may also bear positively on affective learning. Diane Hulse’s 1997 study contrasting the values and attitudes of boys at coeducational and single-sex middle schools found, perhaps counterintuitively, that boys at single-sex schools felt less defensive, were less susceptible
to peer pressures, felt safer in school, reported more comfort in their boy-girl relationships, revealed more egalitarian attitudes about gender roles, and viewed masculinity as inclusive of a wider range of behaviors than did similar boys in a coeducational setting. Proponents of American boys' schools also began to articulate the educational consequences of cross-gender distraction, which had been a long-standing concern among American educators. The century's leading scholar of public schooling, James Coleman, complained in a 1961 book about public coeducational schools, The Adolescent Society, that the social agenda of American schools was threatening the learning agenda. Whether expressed or suppressed, sexual distraction has been seen as an impediment to focused activity and learning. Coleman saw cross-gender distraction most significantly at work from early adolescence through high school. In addition to the obvious distraction that results from heightened attention to grooming, posturing, and flirting, the suppression of erotic interest in the presence of attractive members of the opposite sex also drains energy required to focus effectively on scholastic tasks. Alexander Astin (1977) has attributed the positive effects of single-sex colleges in the 1970s (when there still were single-sex colleges, including most of the leading colleges in the land) to "restricted heterosexual activity." Corroborating this view at the high school level, Anthony Bryk and Valerie Lee's 1986 study comparing single-sex and coeducational parochial schools concluded with an invitation to reconsider schools in which adolescent boys' and girls' "social and learning environments are separated." This study found only positive outcomes for boys and girls in
single-sex schools, as compared to their equivalents in coeducational schools. At the outset of the new millennium, widespread concern about the health, well-being, and scholastic achievement of boys has been expressed in both learned journals and the popular media. Dramatic incidents of schoolyard violence and mayhem at the hands of schoolboys and worrying gaps between boys' and girls' tested abilities have been chronicled in the United States, Great Britain, and Australia. The perceived decline in boys' performance has notably not extended to boys enrolled in boys' schools, a development that has stimulated a renewed interest in boys' schooling and in separate instruction of boys and girls within coeducational schools. School initiatives targeting boys at risk, such as the Nativity Mission schools in Boston and New York, have favored an all-boys structure with encouraging early results. Presently, American independent boys' schools and parochial boys' schools are enjoying healthy enrollments and renewed vigor. In 1995 an International Boys' Schools Coalition was formally incorporated, including 108 schools from the United States as well as 53 schools from Canada, Australia, New Zealand, South Africa, Great Britain, and Japan. The mission of the coalition is to identify and to share best practices for schooling boys.
Richard Hawley
See also Military Schools; Schools, Public
References and further reading
Astin, Alexander W. 1977. Four Critical Years: Effects of College on Beliefs, Attitudes, and Knowledge. San Francisco: Jossey-Bass.
Bryk, Anthony, and Valerie Lee. 1986. "Effects of Single Sex Secondary Schools on Student Achievement and
Attitudes." Journal of Educational Psychology 78.
Coleman, James S. 1961. The Adolescent Society. New York: Free Press.
Hawley, Richard. 1991. "About Boys' Schools: A Progressive Case for an Ancient Form." Teachers College Record 92, no. 3.
Hicks, David. 1996. "The Strange Fate of the American Boarding School." The American Scholar 65, no. 4 (Autumn).
Hulse, Diane. 1997. Brad and Cory: A Study of Middle School Boys. Cleveland: Cleveland's University School Press.
Jarvis, F. W. 1995. Schola Illustris: The Roxbury Latin School. Boston: David Godine.
Newberger, Eli H. 1999. The Men They Will Become. New York: Perseus Books.
Riordan, Cornelius. 1990. Girls and Boys in School: Together or Separate? New York: Teachers College Press.
Tanner, J. M. 1971. "Sequence, Tempo, and Individual Variations in Growth and Development of Boys and Girls Aged Twelve to Sixteen." In Twelve to Sixteen. Edited by Jerome Kagan. New York: Norton.
Schools, Public
Public schools are educational institutions, usually coeducational, that evolved from small, private or home-based programs in the colonial era to large, publicly funded schools in the twentieth and twenty-first centuries. Tax-supported education developed slowly in the eighteenth and early nineteenth centuries. Rural and urban schools evolved in quite different directions in the nineteenth century. Schools in the South were the slowest to develop and the most discriminatory against African American boys and girls until well into the twentieth century. Regardless of where they studied, boys typically learned their lessons from female teachers in an environment that stressed independent learning and competition. By the end of the nine-
teenth century, public schools prepared boys for work in agriculture and industry and girls for homemaking and childrearing. By the late twentieth century, schools sought to prepare boys and girls for specialized technological work roles. In the seventeenth and eighteenth centuries, education took place in family or workplace settings as boys learned tasks, manners, and values in preparation for their adult lives. Children learned to read at home from their parents or perhaps in a dame school kept by a neighbor. Boys from well-to-do families were taught the classical languages—Latin and Greek— from a tutor who boarded in their home, or they were sent to live in the household of a minister or schoolmaster. Those from Virginia or South Carolina were often sent to English schools, leaving home as young as the age of seven or eight and not returning to America until their early twenties. Letters reveal their loneliness and the concern of distant parents, but many such boys had been sent to English relatives who kept in contact with them. In the northern colonies, boys as young as thirteen matriculated in American colleges, such as Harvard, founded in 1636; Yale, established in 1701; and the College of Philadelphia (later the University of Pennsylvania), set up by subscription of citizens in 1755. After 1693, boys in Virginia could receive higher education at the College of William and Mary in Williamsburg. The religious revivals of the Great Awakening in the 1740s injected new vigor into education as churches founded colleges to educate youth and train ministers in their version of Christian faith. New Light Presbyterians established the College of New Jersey (later named Princeton) in 1746, New York Anglicans founded King’s College (later named Columbia) in
1754, Baptists started the College of Rhode Island (later named Brown) in 1764, and the Dutch Reformed Church in New Jersey subsidized Queen's (later named Rutgers) in 1766. The American Revolution stimulated great interest in tax-supported education in order to inculcate republican principles and instill the moral restraint essential for citizenship. Early plans for the establishment of public schools, however, were rejected by state legislatures, largely because the United States was still a rural society, and a uniform system did not mesh with the existing social structure. Farmers and rural craftspeople, who depended on family labor, expected their children to work. Schooling could only be intermittent, fitted into the hours of the day or seasons of the year that the rhythms of an agricultural society allowed. In 1789, Massachusetts required towns of fifty or more families to provide district schools for at least six months of the year and towns of 200 or more families to provide a grammar school. Congressional ordinances governing the Northwest Territory in 1785 and 1787 stipulated that one of thirty-six sections of land in each township be set aside to support schools, although the policy had little effect. The more common pattern was that parents initiated schooling by hiring a teacher and providing a building for a "district" school in states where districts had been established or a "subscription" school where such districts did not exist. Throughout the nineteenth century, publicly supported rural schools became more common, although their growth was slower in the South. Rural schools were in session about six months out of the year at times when children's labor was not in high demand on farms. Although some children at rural schools were very young, most boys and
girls attended school between the ages of ten and fourteen; however, some boys whose labor was required in the fields nine months out of the year found time to go to school only during the three-month winter session. Such boys continued in school until late adolescence, unlike most farm girls, who completed their education by attending six months out of the year until they were fourteen. In rural schools, children as young as two or three mingled with older scholars. The youngest boys and girls sat in the front, nearest to the teacher, and older children sat in the back of the classroom. Henry Ward Beecher, born in 1813, went to Widow Kilbourne’s school in Litchfield, Connecticut, when he was three, bringing along sewing and knitting to keep him occupied. By age eight, he combined schooling with helping his father plant and hoe the garden, caring for the horse and cow, carrying wood, and drawing water from the well on winter mornings. Horace Greeley, born into a New Hampshire farming family in 1811, learned to read at age three from his mother. At age four, he boarded with his grandfather and attended district school with sixty other scholars, several of whom were almost grown. By the age of five, he rose at dawn to ride the horse while his father plowed, rushing to school when the lesson was almost over. When he was eight and nine, he skipped the summer session entirely to help his father clear land. Daniel Drake grew up in Kentucky in the 1790s and started school at the age of five in a log cabin with paper windows and a puncheon floor. He later reported that students of all ages recited the same lesson aloud, gathering energy as they spoke. In rural schools, the teacher typically set the children’s lessons and then examined
them individually or in small groups for ten-minute periods. While one group of children was being examined, the rest listened or studied and prepared their own lessons. In later life, Daniel Drake found he could concentrate in almost any situation and thought it an advantage that he had learned to study in the midst of noise. He also appreciated the natural mingling of boys and girls, who—while they sometimes sat in different sections of the schoolroom—studied the same subjects and joined in running races and playing games. But Drake’s father also depended on the labor of his son, and at the age of nine, Daniel had to leave lessons to help clear land. When he was fifteen years old, an injury to his father ended his intermittent schooling altogether, and Daniel worked the family farm alone. Such would have been his fate, had his illiterate father not been determined to have at least one educated child, sending him at age sixteen to Cincinnati to study medicine. Education of boys in the rural South developed somewhat differently than it did in the rural Northeast, Midwest, and Far West. Before the Civil War, powerful southern white planters found little value in public education. They educated their own children in private schools, found little advantage in providing schooling for poor white children, and found no advantage in educating slave youngsters. During Reconstruction, the federal government and various private philanthropic groups joined with freedpersons to form public schools, which provided education to both white and black children in segregated facilities. Black boys and girls enthusiastically took advantage of these schools, since education had become a sign of freedom for blacks. When Reconstruction ended and whites resumed control of local and state govern-
ments in the South, they kept the segregated public schools that were created during Reconstruction. However, southern whites spent less on their public schools than did rural folk elsewhere in the country, and southerners kept their schools open just three months out of the year. Prosperous southern whites had little investment in public schools since they continued to send their own children to private academies. Other rural southern whites were poorer than farmers elsewhere in the country, and because they required the labor of their sons and daughters virtually year-round, they sent them to school for short terms. African Americans in the South continued to enroll their children in school, but they became discouraged with the quality of the schools, which deteriorated once whites resumed control after Reconstruction. Whites allotted less money to black than to white schools, and black schools became overcrowded. Black parents were particularly likely to withdraw their sons from school once they reached the age of eleven. Boys’ labor was required on black tenant and sharecropping farms, whereas girls could be spared to go to school longer to prepare for the one good profession open to them—teaching. Boys who lived in cities, most of which were in the Northeast, experienced schooling somewhat differently than boys who attended rural schools. Innovation that led to systems of tax-supported education occurred in northeastern cities as commercial relationships eroded apprenticeship and manufacturing concentrated in larger units. In postrevolutionary decades, only Boston supported a system of public education. Responding to the Massachusetts law of 1789, Boston established an annually elected School Committee that supervised grammar schools
for boys and girls aged seven to fourteen and a Latin school for boys over the age of ten. Only children who could already read were admitted to the grammar schools, and girls attended them fewer hours per day than boys and for shorter terms. Only about 12 percent of the city's children attended these public schools, but other well-to-do boys attended academies, and some children of the poor attended charity schools (Schultz 1973, 8–25). In Philadelphia and New York, efforts to develop systems of tax-supported schools occurred through charity schools designed for the urban poor. Reformers in both cities were attracted to the plan devised by Joseph Lancaster, an English Quaker, through which one teacher could instruct large numbers of children with the aid of student monitors. During the urban depression that followed the War of 1812, prominent citizens worried about the prevalence of vice and crime in poor neighborhoods and began to investigate how discipline could be instilled in children of paupers at minimum public expense. In 1817, the Pennsylvania legislature designated the city of Philadelphia the first school district in the state, and children of indigent parents began to attend tax-supported Lancasterian schools. The method also flourished in New York, and in 1825 the eleven free schools and monitorial high school became the basis of that city's public school system. In the 1820s New York teachers were called "operatives," and the machinelike replicability of the Lancasterian method was extolled as a means of expanding educational opportunity throughout the nation. Children—aided by older boys acting as monitors—learned to read, march in drill, and earn tickets to buy prizes in more than 150 Lancasterian schools in locations as distant as Cincinnati, Detroit, and New Orleans.
Although Lancaster's system seemed an educational panacea in the 1820s, it was too mechanical to mesh with the emerging domestic ideology of the new middle class. By the 1830s, school reformers admired the ideas of Johann Heinrich Pestalozzi, who urged creation of an atmosphere of security and affection to allow the unfolding of a child's nature. As capitalism transformed the northeastern economy, middle-class families relied less on the labor of their children, giving them more time to spend in school. In this new economic context, middle-class reformers advocated state-supported and supervised systems of education, with standardized, full-time schools taught by professional teachers. In 1837, Horace Mann, secretary of the new Massachusetts State Board of Education, lobbied the legislature for graded public schools, ten-month terms, and a uniform curriculum. Age grading dovetailed with a standardized curriculum, permitting children of the same age to learn the same subjects from different teachers and hence be properly prepared for the next grade. As public schools expanded in the 1840s, Mann rejected the efforts of the American Sunday School Union to include in the curriculum its line of evangelical children's literature. The message of the schools would be Protestant yet nonsectarian, designed to form character and teach values of republicanism and capitalism. As immigration increased, however, and Irish neighborhoods burgeoned in Boston, Philadelphia, and New York City, Catholic clergymen objected to this Protestant thrust of public schools and responded by forming their own educational institutions. Neither did urban public schools recognize racial diversity. African Americans were often denied public schooling
in northern cities until the 1840s and 1850s, when abolitionists demanded that they receive equal, nonsegregated educational opportunities with whites. After the Civil War, all northern cities opened schools to blacks; however, some cities accommodated them in schools separate from whites. Black parents, many of whom had moved to the North to make it possible for their children to attend school, made huge sacrifices to keep their sons and daughters in school. Mothers and fathers both worked outside the home, usually at low-wage jobs, so that black boys and girls could attend school. Between 1890 and 1940, "older black children were more likely to be in school than foreign-born children" (Walters and O'Connell 1988, 1145). By the 1850s, public schools meshed neatly with domestic ideology. As middle-class mothers assumed responsibility for character formation in early childhood, children under the age of six were excluded from graded public schools. As school systems consolidated, administrators sought to save money by hiring female teachers, and boys spent more of their lives directed by women. In the lower grades of urban public coeducational schools, female teachers drilled all youngsters simultaneously in reading, writing, and arithmetic. Urban high schools, which in some cities were segregated by gender, continued throughout the nineteenth century to be staffed by more male than female teachers. Of course, few youngsters stayed in school long enough to enroll in a high school. Working-class boys were particularly likely to leave school by their early teens to find jobs to help support their families. Middle-class boys typically stayed in school longer to acquire the writing and numerical skills necessary for white-col-
lar jobs, but middle-class girls stayed in school the longest because their labor was least needed by their parents, and school prepared them for the most popular professional job for women—teaching. In the nineteenth century, white native-born and African American children were more likely to attend urban public schools than were foreign-born children. However, boys from certain ethnic groups were especially likely to be enrolled in city schools. Children of Scottish parentage, who shared the language and Protestant faith of common school founders, attended school in large numbers. So, too, did the sons and daughters of eastern European Jews who migrated to the United States after 1880 and settled largely in cities. Jewish parents valued education and learning and pushed their sons to succeed in public schools in order to prepare them for the commercial occupations that most Jewish men favored. Although boys' opportunities for schooling in the nineteenth century differed based on where they lived and on their social class, race, and ethnicity, the educational experience of those who did enroll in school was remarkably similar. Of course, the presence or absence of age grading and the length of the school year differed for farm and city boys, but nearly all of them studied under the direction of female teachers who emphasized rote learning. In the four to five years most attended school, boys learned spelling, reading, writing, and math, along with some geography, literature, and history. Boys and girls learned writing by using a pen and copying words. Rarely did teachers assign compositions. Teachers taught math by drill. Regardless of locale, nineteenth-century teachers also conveyed similar values to boys and girls in their classrooms.
They assigned students to sit at individual desks and to speak only to the teacher. Thus they made clear to boys that they should act as individuals and become self-reliant. Boys should not count on friends to help them with their lessons. Boys also competed constantly in schools: to correctly spell words, to appropriately read passages, and to pass a high school entrance exam. Teachers punished boys severely for stealing or defacing books or desks in order to convey to them the importance of respect for private property. Teachers and parents also expected boys to obey the teacher without question. Such behavior was seen as a way to prepare boys to obey appropriate authorities later in life. In the twentieth century, extended public schooling became more widely accepted and more common for boys of all social classes, races, and ethnicities. However, early in the century during the Progressive era, the boys and girls most likely to take advantage of public education were white and middle-class. Yet by 1918, all states had enacted compulsory attendance laws. Parents in all social classes expected that boys would become breadwinners and therefore needed to learn various skills and attributes to equip them for whatever section of the labor market was their destiny. Conversely, girls needed to acquire knowledge to enable them to be good wives and mothers. The Smith-Hughes Act, passed by the federal government in 1917, provided funding to local schools to reinforce this gendered notion of the appropriate forms of vocational education for boys and girls. This legislation helped dictate separate and distinct programs for boys and girls in public schools for many decades. The basic core curriculum in schools had remained the same since the nine-
teenth century. Reformers in the 1920s believed that schools should provide youth with more than the ability to read, write, and do math. Youngsters also needed to develop common social skills and to improve their physical health. In 1918, the National Education Association outlined seven cardinal principles for both secondary and elementary schools, which included developing good health in youngsters, improving their command of basic skills, encouraging them to be responsible family members, enhancing vocational efficiency, promoting good citizenship, teaching worthy use of leisure time, and demonstrating ethical character. The end result of the 1920s reform movement was the expansion of extracurricular activities in schools, which reformers expected would cultivate appropriate social attitudes in youth. Organized school sport became a means of promoting a positive sense of self-worth for boys. The introduction of athletics promised a fourfold effect: it would reduce the dropout rate, create a masculine environment, give unruly males an outlet for their energy, and enhance the image of schools. Public school officials assumed control of school athletics, and the programs they administered raised new issues for boys—popularity, intensity, and exclusiveness. In the twentieth century, as in the nineteenth, most public school teachers were female. Reformers complained that an all-female teaching force deprived boys of needed male role models. Growing divorce rates and increasing percentages of single-mother families increased concern about the lack of masculine influence available to boys in schools. However, local school boards continued to hire women in large part because they
A teacher assists one boy while the rest work on their assignments, 1938. (Bettmann/Corbis)
were less expensive to employ than were men. In the post–World War II era, the baby boom increased the number of children who entered public schools and necessitated school expansion. Just as important, the launch of Sputnik in 1957 by the Soviet Union created a feeling of self-doubt in the United States and a concern that American education was behind Russian education. Congress enacted the National Defense Education Act, and science societies worked to invigorate American education. Within ten years, the "space race" led to an expanded science and math curriculum, not only in kindergarten through twelfth grade but also in higher education. Building a
strong education for a strong defense meant adopting problem-solving and learner-centered curricula. Reformers realized that American students were accustomed to memorizing facts but often unable to apply those facts. Hence in the new school curriculum, teachers encouraged students to solve problems. Textbook companies developed illustrated science texts with specific scope and sequenced outlines for each grade level. Although presumably all boys and girls were the target of these new educational emphases, the new textbooks in science and literature lacked any minority representation. The emphasis on increasing student achievement highlighted another issue that Americans had long ignored—the
poor quality of education available to most African American children. After the Supreme Court endorsed separate-but-equal facilities for black and white children in 1896, segregated schooling became the norm throughout the South. It was separate but hardly equal since southern states systematically spent less on black than on white schools. In 1954, the Supreme Court reversed itself and declared in favor of desegregated schools in Brown v. Board of Education of Topeka, Kansas. Schools in the South gradually desegregated, but in some cases, states bused black children to white schools and vice versa in order to implement desegregation quickly. In response, some white parents withdrew their children from public schools and enrolled them in all-white private and parochial schools instead. Not only were there disparities between the educational opportunities of black and white children, there were also disparities between the educational achievement of white boys and girls. When math and science issues became preeminent, researchers soon discovered that girls fell behind boys in achievement in both subjects. Boys also lagged behind girls in reading and writing. Boys were more likely to be held back a grade in school, to be enrolled in special education classes, and to drop out of school. Technology was advancing rapidly, yet there seemed to be growing inequities in technological knowledge between boys and girls and between the youth of different social classes. International markets made it necessary to increase understanding of diversity, yet teachers claimed to be inexperienced in how to teach about diversity and how to handle students from diverse backgrounds in the classroom. Congress passed the Gender Equity in Education Act to help promote pro-
grams that emphasized diversity and to train teachers in nonsexist behavior. School reformers demanded that students learn teamwork and problem solving and achieve technological literacy. To enhance equal educational opportunities for all children, Congress passed the Goals 2000: Educate America Act, which President William J. Clinton signed on March 31, 1994. It acknowledged the rights of all children to an opportunity to learn, to well-trained teachers, and to a solid curriculum. In response to the dramatic technological innovations in the early and mid-1990s, U.S. Secretary of Education Richard Riley released the nation's first educational technology plan in 1996, entitled Getting America's Students Ready for the 21st Century: Meeting the Technology Literacy Challenge. The program called for technology to become a part of elementary and secondary school instruction in order to help the next generation of schoolchildren to be better educated and better prepared for new demands in the American economy. The plan was quite successful and was revised in 1999 to expand teacher training. Given the fact that boys have traditionally been more encouraged than girls to take advantage of training in technology, the new emphasis on technology in education is bound to enhance their opportunities. However, those boys from minority and less-affluent homes who do not have ready access to home computers and who attend schools that are poorly equipped may find access to technological training more problematic than do more affluent white boys. There has been a major change in education at the end of the twentieth and beginning of the twenty-first centuries as the federal government has identified major problems in U.S. education and
A teacher assists three boys with their schoolwork, 1999. (Skjold Photographs)
poured money into emergency remedies. States have worked more closely than ever with the federal government to achieve national educational goals. Americans have worked hard, although not altogether successfully, to provide opportunities to all students regardless of gender or socioeconomic status. Some reformers see expanding access to technology as the answer to all educational problems, whereas others believe that education should focus more on teaching basic skills.
Jacqueline Reinier
Priscilla Ferguson Clement
Mabel T. Himel
Lisa Jett
See also Computers; Learning Disabilities; Military Schools; Schoolbooks; Schools for Boys; Sunday Schools; Vocational Education
References and further reading
Anderson, James D. 1988. The Education of Blacks in the South, 1860–1935. Chapel Hill: University of North Carolina Press.
Clement, Priscilla Ferguson. 1997. Growing Pains: Children in the Industrial Age, 1850–1890. New York: Twayne Publishers.
Education Commission of the States Task Force on Education for Economic Growth. 1983. Action for Excellence: A Comprehensive Plan to Improve Our Nation's Schools. Denver: Education Commission of the States.
Faludi, Susan. 1999. "The Betrayal of the American Man." Newsweek (September 13): 49–58.
Finkelstein, Barbara. 1989. Governing the Young: Teacher Behavior in Popular Primary Schools in Nineteenth-Century United States. London: Falmer Press.
Fuller, Wayne E. 1982. The Old Country School: The Story of Rural Education in the Midwest. Chicago: University of Chicago Press.
Kaestle, Carl F. 1973a. The Evolution of an Urban School System: New York City, 1750–1850. Cambridge, MA: Harvard University Press.
———. 1973b. Joseph Lancaster and the Monitorial School Movement. New York: Teachers College Press.
———. 1983. Pillars of the Republic: Common Schools and American Society, 1780–1860. New York: Hill and Wang.
Kalb, Claudia. 2000. "What Boys Really Want." Newsweek (July 10): 52.
Lewis, Theodore. 1997. "Toward a Liberal Vocational Education." Journal of Philosophy of Education 31, no. 3: 477–489.
Link, William A. 1986. A Hard Country and a Lonely Place: Schooling, Society and Reform in Rural Virginia, 1870–1920. Chapel Hill: University of North Carolina Press.
National Commission on Excellence in Education. 1983. A Nation at Risk: The Imperative for Educational Reform. ED 226 006. Washington, DC: Government Printing Office.
National Science Board Commission on Pre-College Education in Mathematics, Science and Technology. 1983. Educating Americans for the 21st Century. A Report to the American People and the National Science Board. ED 223 913. Washington, DC: U.S. Government Printing Office.
Perlmann, Joel. 1988. Ethnic Differences: Schooling and Social Structure among the Irish, Italians, Jews, and Blacks in an American City, 1880–1935. New York: Cambridge University Press.
Reinier, Jacqueline S. 1996. From Virtue to Character: American Childhood, 1775–1850. New York: Twayne Publishers.
Sadker, Myra, and David Sadker. 1994. Failing at Fairness: How Our Schools Cheat Girls. New York: Touchstone.
Schultz, Stanley K. 1973. The Culture Factory: Boston Public Schools,
1789–1860. New York: Oxford University Press.
U.S. Department of Education. 2000. "The Federal Role in Education," http://www.ed.gov/offices/OUS/fedrole.html (accessed March 28).
Walters, Pamela Barnhouse, and Phillip J. O'Connell. 1988. "The Family Economy, Work, and Educational Participation in the United States 1890–1940." American Journal of Sociology 93: 1116–1152.
Scientific Reasoning
There are two broad views of science and scientists used to discuss boys' and girls' ability to reason like scientists. One view suggests that even young boys have the conceptual skills and other qualities of mind, such as curiosity and imagination, to explain phenomena and revise those explanations in just the same manner as scientists do. Albert Einstein (1950) claimed that the thinking of children and scientists differs only in the degree of precision, systematicness, and economy of their ideas. Many developmental psychologists agree with this characterization and liken children to little scientists who revise their false theories of the world (Carey 1985; Meltzoff and Gopnik 1996). The second view is that the scientific thinking of young boys and girls is intuitive and illogical, nothing like the formal, logical reasoning of scientists practicing science. In this view, the ability to reason scientifically can be acquired by children as they grow up and go to school. The view that scientific reasoning is an ability learned over an extended period of time and schooling, which is consistent with more than a century of philosophical, educational, and social science research, notably by the Swiss psychologist Jean Piaget, remains an
Two boys look through a telescope while others look at a globe suspended from the ceiling. (Archive Photos)
important perspective in developmental psychology. Support for each of these views exists, suggesting that young boys and girls have some but perhaps not all of the individual skills necessary to propose and evaluate explanations of the world in a manner like that of scientists. To do so, boys as well as girls must use four component scientific reasoning skills, including the ability to think theoretically, hypothetically, logically, and empirically. These are component scientific reasoning skills in the sense that they are implicated in every characterization of science. Theoretical thinking is the process of forming theories by which to explain and
predict a range of related phenomena. Even young children have been found to use theories to predict and explain how the mind works (psychology), how the body works (biology), and how things work in the physical world (physics) (Wellman and Gelman 1992). Children’s intuitive theories are similar in structure to formal theories of scientists in that they are composed of both an integrated conceptual network of explanatory concepts and assumptions about causal mechanisms. For example, boys and girls, like adults and scientists, will appeal to forces (causal mechanisms) such as “gravity” (explanatory concept) to explain or predict the motion of falling objects. Although similar in structure, the causal mechanisms and explanatory concepts in children’s theories are vague or inadequate compared to scientists’ theories. Again from the domain of physics, boys and girls incorrectly predict the rate and trajectory of falling objects, a rather curious mistake they make about phenomena with which they are very familiar. Ironically, rather than making predictions about how objects fall on the basis of their memory of seeing them fall, they appeal to their (incorrect) theories. Thus, although children may be theoretical, their theories are often wrong, placing them in much the same boat as scientists of long-ago eras who also appealed to incorrect theories to predict and explain phenomena. Hypothetical thinking is the ability to reason about possibilities, which often informs the human view of reality, and is used in science in a variety of ways. Combinatorial possibilities involve judgments enumerating all possible combinations of conditions or factors related to a target outcome, as in determining all combinations of chemicals that may have pro-
duced a particular reaction. Young boys and girls tend to be poor at such reasoning (Inhelder and Piaget 1958), although they perform better with more accessible and simplified versions of the task (Klahr 2000). The term multiple sufficient possibilities refers to recognition of situations in which there are no definite answers to questions about conditions or factors related to a target outcome. For example, any one of four chemicals mixed together might have caused a reaction to occur, resulting in each of the four chemicals being possible causes, even though one chemical is eventually identified as the actual cause. Boys and girls have difficulty remaining indeterminate about the status of each of multiple sufficient possibilities in a situation, often judging each to play a unique role in the outcome (Kuhn, Amsel, and O'Loughlin 1988). Counterfactual possibilities have to do with the construction of alternative accounts of reality that could have occurred but did not. Considered central in science, counterfactual possibilities are often used to express predictions and hypotheses (e.g., if X had been done, Y would have occurred). Young children appear to be quite capable of generating counterfactual possibilities regarding alternatives to well-understood causal event sequences. For example, children told about a little boy who tore his pants when playing outside will generate the counterfactual possibility that if he had not played outside, he would not have torn his pants. However, there remain questions regarding young children's ability to reason about counterfactual possibilities in other ways and in other domains (Amsel and Smalley 1999). Logical thinking is the ability to draw conclusions from premises, irrespective of one's beliefs about the premises being asserted or the conclusions being drawn.
Such reasoning occurs in science when a prediction or hypothesis is derived from a theory (although reasoning about such hypotheses involves hypothetical thinking). In the past, philosophers have argued that logical deductions are the backbone of science because they provide the only basis for rationally justifying the activity (Braithwaite 1953; but see Laudan 1977). Young boys and girls perform well on simple logical reasoning tasks ("All bicycles have two wheels. The object in the box is a bicycle. Does it have two wheels?" Answer: yes). The difficulty for children arises when the nature of the logical deduction changes (e.g., "All bicycles have two wheels. The object in the box does not have two wheels. Is it a bicycle?" Answer: no), or when they do not believe that the premise or conclusion is true ("All bicycles have three wheels. The object in the box is a bicycle. Does it have three wheels?" Answer: yes) (Braine and O'Brien 1997). Fully appreciating the difference between an argument that is logically valid (wherein the conclusions can be drawn from the premise) but not true and one that is true but not logically valid does not appear to be possible until adolescence (Moshman 1999). The term empirical reasoning refers to the processes involved in the evaluation of evidence and revision of theories. Accounts of such processes in science have ranged from strictly formal analysis of the confirmation or falsification of theories (e.g., Popper 1959) to psychological or historical analysis of the plausibility and coherence of explanations of phenomena (Laudan 1977). Children readily evaluate the plausibility and coherence of their explanations of phenomena and act like scientists in their reactions to explanatory failures. For example, young children will slowly but surely revise their theory that
all blocks balance at their geometric center to account for the behavior of a block that has a weight hidden inside and thus balances off-center (Karmiloff-Smith and Inhelder 1974). But when children are asked to formally evaluate theories, they appear to reason less competently than scientists. Not only do children lack scientists' evidence-evaluation strategies to validly assess even simple patterns of evidence in formally appropriate ways, but they also fail to evaluate new evidence bearing on theories independent of their prior theoretical beliefs regarding the theory. More generally, when confronted with new information, young children tend to treat it as an explanatory opportunity (i.e., to provide an account for the new information in terms of their prior theoretical beliefs) instead of as a confirmatory or disconfirmatory opportunity (i.e., to assess the new information as evidence bearing on a theory in a systematic and unbiased manner). These component skills work together to produce coordinated scientific reasoning. By three years of age, boys and girls are able to coordinate the scientific reasoning skills they have available to form, apply, and revise theories about a broad range of phenomena. These theories may be incorrect in terms of modern scientific knowledge, but they have a sophisticated conceptual structure. Moreover, such theories serve as a basis for the children's thoughts about counterfactual possibilities. Finally, the explanatory concepts and causal processes associated with the theory may be updated in light of repeated failure to successfully explain phenomena by reference to the theory. This form of scientific reasoning, sometimes called "inductive" or "abductive" reasoning, is an effective way for children to learn
through trial and error. But when a situation calls for a formal analysis of specific theories, children have much difficulty. This form of scientific reasoning, sometimes called "hypothetico-deductive" reasoning, involves logically inferring hypotheses from theories, systematically collecting and analyzing evidence bearing on those hypotheses, and then confirming or disconfirming the theory. Children do not spontaneously reason according to the scientific method because they lack the skills to logically make predictions, particularly regarding theories they may disbelieve; to entertain theoretical possibilities regarding alternative hypotheses to ones they may believe to be true; and to evaluate evidence bearing on theories they prefer or dislike in a systematic and unbiased manner. Nonetheless, these skills emerge with age and education and can be promoted by science education programs that target children's scientific reasoning skills in addition to their scientific knowledge.
Eric Amsel
References and further reading
Amsel, Eric, and J. David Smalley. 1999. "Beyond Really and Truly: Children's Counterfactual Thinking about Pretend and Possible Worlds." Pp. 99–134 in Children's Reasoning and the Mind. Edited by K. Riggs and P. Mitchell. Brighton, UK: Psychology Press.
Braine, Martin D. S., and David O'Brien. 1997. Mental Logic. Mahwah, NJ: Lawrence Erlbaum.
Braithwaite, Richard Bevan. 1953. Scientific Explanation: A Study of the Function of Theory, Probability and Law in Science. Cambridge, UK: Cambridge University Press.
Carey, Susan. 1985. Conceptual Change in Childhood. Cambridge, MA: MIT Press.
Einstein, Albert. 1950. Out of My Later Years. New York: Philosophical Library.
Inhelder, Barbel, and Jean Piaget. 1958. The Growth of Logical Thinking from
Childhood to Adolescence. New York: Basic Books.
Karmiloff-Smith, Annette, and Barbel Inhelder. 1974. "If You Want to Get Ahead, Get a Theory." Cognition 3: 195–212.
Klahr, David. 2000. Exploring Science: The Cognition and Development of Discovery Processes. Cambridge, MA: MIT Press.
Kuhn, Deanna, Eric Amsel, and Michael O'Loughlin. 1988. The Development of Scientific Thinking Skills. Orlando, FL: Academic Press.
Kuhn, Thomas S. 1962. The Structure of Scientific Revolutions. Chicago: University of Chicago Press.
Laudan, Larry. 1977. Progress and Its Problems. Berkeley: University of California Press.
Meltzoff, Andrew, and Alison Gopnik. 1996. Words, Thoughts, and Theories. Cambridge, MA: MIT Press.
Moshman, David. 1999. Adolescent Psychological Development: Rationality, Morality, and Identity. Mahwah, NJ: Lawrence Erlbaum.
Popper, Karl Raimund. 1959. The Logic of Scientific Discovery. London, UK: Hutchinson.
Wellman, Henry, and Susan Gelman. 1992. "Cognitive Development: Foundational Theories of Core Domains." Annual Review of Psychology 43: 337–376.
Sexual Abuse
See Abuse
Sexuality
All human beings are inherently sexual. Infants, children, adolescents, and adults at different stages experience their sexuality in distinct ways. Sexuality evolves during childhood and adolescence, laying the foundation for adult sexual health and intimacy. Adolescent sexual health encompasses a broad range of knowledge, attitudes, and behaviors and cannot be defined solely on the basis of abstinence or preventive behaviors.
In colonial America, boys who grew up in large families in a mostly rural society were surrounded by sexuality. Children who shared beds with their parents witnessed the sexual act, and procreation by farmyard animals was a natural part of life. Yet societal and religious values linked sexuality to reproduction contained within marriage, and individual behavior was closely regulated by government. Nevertheless, there is plenty of historical evidence that boys experimented with sex. Parents and masters found it difficult to control the youthful behavior of children and servants, teenage college students engaged in wild behavior with young women of the town, and adolescents on southern plantations initiated sexual relationships with female slaves. In Puritan New England, buggery with an animal was a capital crime, and the perpetrator was forced to witness the killing of the animal prior to his own execution. Although such executions were rare, several Massachusetts and Connecticut teenage boys lost their lives for performing sexual acts with a cow, goat, or sheep. By the end of the eighteenth century, individualistic values permeated post-Revolutionary America, and sexuality for pleasure began to be separated from procreation. Boys, who were viewed as inherently sexual, were encouraged to develop internalized restraint. In the increasingly capitalist economy of the nineteenth century, young apprentices and clerks who moved to urban centers were taught that control of appetite would contribute to success. Sexual intercourse should be delayed until marriage, and masturbation loomed as a significant danger. Learning to control his sexual urges was a way that a boy could build character. Gender roles became solidified in the nineteenth century, and
boys and girls of the middle class who were raised separately found it difficult to establish easy relationships prior to marriage. Young males were more likely to find intimacy in rough physical play with other boys or in close same-sex friendships with other young men, some of which had romantic overtones. The commercial economy and urban growth of the nineteenth century increased the availability of commercial sex, and boys and young men joined the clientele of urban prostitutes. Poor boys who lived in neighborhoods where vice prevailed earned extra money pimping for brothels. Teenage working-class boys and girls spent leisure time strolling together and attending dance halls. For treating a girl with his hard-won wages, a young man expected a return of sexual favors, resulting in a form of casual prostitution. Through organizations such as the New York Female Moral Reform Society, middle-class women of the 1830s attacked commercial sex, and mothers took on the responsibility of imploring their sons to be pure. But the influence of white women was less effective in the South, where the freedom of white boys on plantations continued to condone sexual initiation with slaves. Black boys also engaged in sexual experimentation with black girls, although any overture to a white woman or girl was certain to bring harsh consequences, which only increased in severity after emancipation. Throughout American history, fear of social chaos has often focused on controlling the behavior of boys. By the late nineteenth century, advocates of competitive sports, a new muscular version of Christianity, and organizations such as the Young Men’s Christian Association (YMCA) urged manly purity and attempted to elevate boys to the same sex-
sexual standard as girls and women. Opponents of obscenity exhorted parents to monitor children’s reading and to protect boys from the popular dime novels that often contained sexually explicit material. As a free market of sex increasingly permeated society, the ability to control one’s sexuality came to define the boundaries of middle-class respectability. The early twentieth century, however, initiated a new sexual era as an economy of advanced capitalism began to promote an ethic of indulgence. As work became more routine, individuals sought self-expression in their private lives. More boys and girls attended high school and began to live in a youth-centered world in which the sexes mingled. By the 1920s middle-class adolescents participated in a new youth subculture and engaged in the rituals of dating, necking, and then petting in the private space of the car. In white and black working-class communities, young men and women exchanged sexual favors. Homosexuals developed a distinct identity in a new sexual subculture. Although families returned to more traditional behavior during hard times in the 1930s, these trends were accelerated in the period of social change that accompanied World War II. By the 1950s, petting was the most common teenage sexual experience. As the age of marriage dropped, however, sexual intercourse continued to be contained within marriage. Patterns of sexual behavior differed widely among young men and young women, as well as among youth from different backgrounds. By the 1960s sexual liberalism was widespread as the singles life, the student movement, and the hippie counterculture all extolled sex as a source of personal freedom no longer linked to marriage. The business of sex churned out
consumer products, and changing demographics of household size and structure condoned nonmarital sex. In this context, sexual experience began at younger and younger ages, and by the twenty-first century the behavior of males and females had become more alike. In the mid-1950s, just over a quarter of young women under age eighteen were sexually experienced (comparable information for young men is not available for that time period). Data from 1968 to 1970 indicate that 55 percent of young men and 35 percent of young women had intercourse by their eighteenth birthday; by 1977–1979, 64 percent of young men and 47 percent of young women had intercourse by their eighteenth birthday. These percentages continued to increase as the twentieth century progressed: 1986–1988 data show that 73 percent of young men and 56 percent of young women had intercourse by their eighteenth birthday (Alan Guttmacher Institute 1994). Today’s teenagers reach physical maturity earlier and marry later, and almost all of them experiment with some type of sexual behavior. Patterns of sexual activity are now fairly similar among young men and women from different socioeconomic, ethnic, and religious groups. According to 1995 research, 68 percent of young men and 65 percent of young women have had intercourse before their eighteenth birthday (Alan Guttmacher Institute 1999). Historically, much of the research on adolescent sexuality has focused on pregnancy, contraceptive use, and sexually transmitted disease infection, and much of the available data are on adolescent females. However, two recent surveys measure sexual behavior among adolescent males. The Youth Risk Behavior Surveys (YRBS), conducted by the Centers
for Disease Control and Prevention, assess the behaviors deemed most responsible for influencing health among the nation’s high school students. The 1995 National Survey of Adolescent Males (NSAM) was designed primarily to examine the sexual and reproductive behaviors of a nationally representative sample of boys ages fifteen through nineteen. In these studies, the majority of teen males reported having had sexual intercourse by age seventeen (59 percent, Sonenstein et al. 1997) or twelfth grade (64 percent, Kann et al. 2000). The older the teen, the more likely he was to report having had sexual intercourse: 27 percent of fifteen-year-old males had lost their virginity, compared with 85 percent of nineteen-year-old males (Sonenstein et al. 1997); and 45 percent of ninth grade males had had intercourse, compared with 64 percent of twelfth grade males (Kann et al. 2000). Overall, 12 percent of male students in grades nine to twelve reported having initiated sexual intercourse before thirteen years of age. Black male students (30 percent) were significantly more likely than Hispanic and white male students (14 percent and 8 percent, respectively) to have had sexual intercourse before thirteen years of age (Kann et al. 2000). In addition, black teen males initiated intercourse earlier than Hispanic or white males. Half of black teen males reported having had sexual intercourse by age sixteen; half of Hispanic teen males reported having had sexual intercourse by age seventeen; and half of white teen males reported having had sexual intercourse by age eighteen (Sonenstein et al. 1997). According to data from the 1995 National Survey of Adolescent Males, teenage males’ frequency of sexual intercourse was low. In the twelve months
preceding the survey, more than one-half of sexually experienced teen males had intercourse fewer than ten times or not at all. Although young males’ sexual activity tended to be episodic and punctuated by months-long periods of sexual inactivity, teen males had sexual intercourse more frequently as they got older (Sonenstein et al. 1997). In the 1999 Youth Risk Behavior Survey, 30 percent of ninth grade male students, 34 percent of tenth grade male students, 35 percent of eleventh grade male students, and 48 percent of twelfth grade male students reported engaging in sexual intercourse during the three months preceding the survey (Kann et al. 2000). Overall, most teenage males’ sexual relationships were monogamous. Among sexually experienced teen males, 54 percent had one partner or none in one year; 26 percent had two partners in one year; 14 percent had three or four partners in one year; and 6 percent had five or more partners in one year (Sonenstein et al. 1997). Nineteen percent of male students reported having had sexual intercourse with four or more partners during their lifetime; black male students (48 percent) were significantly more likely than Hispanic (23 percent) or white (12 percent) male students to have had sexual intercourse with four or more partners during their lifetime (Kann et al. 2000). Data from the 1995 National Survey of Adolescent Males showed that most sexually experienced teenage boys and men had sexual partners who were close to their own age. The average age difference between sexually experienced males ages fifteen through nineteen and their most recent female partner was less than six months. However, 25 percent of sexually active sixteen-year-old males reported having a female partner who was fourteen
years old or younger, and 11 percent of sexually active nineteen-year-old males reported having a female partner who was fifteen years old or younger (Sonenstein et al. 1997). According to recent data, the number of adolescents using condoms has increased since the early 1980s, but such use remained inconsistent. Sixty-six percent of male students who said they were currently sexually active reported using a condom during their last intercourse (Kann et al. 2000). Ninety percent of teen males ages fifteen through nineteen who reported having sexual intercourse reported having used condoms at some point in the year preceding the survey. However, only 44 percent reported using condoms every time they had intercourse. Forty-seven percent of sexually active black teen males reported consistent condom use, as compared to 46 percent of white and 29 percent of Hispanic sexually active teen males (Sonenstein et al. 1997). Twelve percent of ninth to twelfth grade male students who were currently sexually active reported that their partner used birth control pills before their most recent intercourse (Kann et al. 2000). One-third of sexually active males ages fifteen through nineteen who reported using condoms 100 percent of the time also reported that their partners used the pill (Sonenstein et al. 1997). Although teen males were equally likely to have used an effective method of contraception regardless of age, condom use tended to decline with age (Sonenstein et al. 1997). Among sexually experienced males ages fifteen through nineteen, 14 percent said that they had made a partner pregnant. Twenty-two percent of black, 19 percent of Hispanic, and 10 percent of white sexually experienced males ages fifteen through nineteen reported having made a
partner pregnant. Six percent of all sexually experienced males ages fifteen through nineteen reported having fathered a child. Broken down by race, 10 percent of black, 8 percent of Hispanic, and 5 percent of white sexually experienced males ages fifteen through nineteen reported having fathered a child (Sonenstein et al. 1997). Overall, 5 percent of the ninth through twelfth grade male students in the 1999 Youth Risk Behavior Survey sample reported that they had gotten someone pregnant. Older male students were more likely than younger male students to have gotten someone pregnant, and black male students (13 percent) were more likely than Hispanic (7 percent) or white (3 percent) male students to have gotten someone pregnant (Kann et al. 2000). Among currently sexually active students in grades nine through twelve, 31 percent of males had used alcohol or drugs at the time of their most recent sexual intercourse. Eight percent of males reported being hit, slapped, or physically hurt on purpose by their boyfriend or girlfriend, and 5 percent reported being forced to have sexual intercourse (Kann et al. 2000). Approximately 20 percent of all high school students are enrolled in alternative high schools that serve students who are at risk for failing or dropping out of regular high school or are students who have been removed from their regular high school because of drug use, violence, or other illegal activity or behavioral problems. According to a recent study of youth in alternative high schools, the vast majority of males who attended such schools reported having had sexual intercourse (84 percent in grade nine and 88 percent in grade twelve). Sixty-six percent of males reported having had sexual intercourse in the three months preceding the survey. Thirty percent of male students enrolled
in alternative high schools reported having initiated sexual intercourse before age thirteen. Fifty-seven percent reported having had four or more sexual partners in their lifetime. Fifty-five percent said that they had used a condom when they last had intercourse, compared with 13 percent who indicated that their partner used birth control pills before their most recent intercourse. Ninth grade male students enrolled in alternative high schools were significantly more likely (66 percent) to have reported condom use at last intercourse than males in twelfth grade (49 percent) (Grunbaum et al. 1999). There is a public and professional consensus about what is sexually unhealthy for teenagers. Professionals, politicians, and parents across the political spectrum share a deep concern about unplanned adolescent pregnancy; out-of-wedlock childbearing; sexually transmitted diseases, including human immunodeficiency virus (HIV) and its ultimate result, acquired immunodeficiency syndrome (AIDS); sexual abuse; date rape; and the potential negative emotional consequences of premature sexual behaviors. However, there is little public, professional, or political consensus about what is sexually healthy for teenagers. The public debate about adolescent sexuality has often focused on which sexual behaviors are appropriate for adolescents and ignored the complex dimensions of sexuality. Becoming a sexually healthy adult is a key developmental task of adolescence. Achieving sexual health requires the integration of psychological, physical, societal, cultural, educational, economic, and spiritual factors. Sexual health encompasses sexual development and reproductive health and such characteristics as the ability to develop and maintain meaningful interpersonal relationships; appreciate
one’s own body; interact with both genders in respectful and appropriate ways; and express affection, love, and intimacy in ways consistent with one’s own values. Adults can encourage adolescent sexual health by providing accurate information and education about sexuality, fostering responsible decision-making skills, offering young people support and guidance to explore and affirm their own values, and modeling healthy sexual attitudes and behaviors. Society can enhance adolescent sexual health by providing access to comprehensive sexuality education; affordable, sensitive, and confidential reproductive health care services; and education and employment opportunities. Adolescents should be encouraged to delay sexual behaviors until they are physically, cognitively, and emotionally ready for mature sexual relationships and their consequences. This support should include education about intimacy; sexual limit setting; social, media, peer, and partner pressure; the benefits of abstinence from intercourse; and the prevention of pregnancy and sexually transmitted diseases. Because many adolescents are or will be sexually active, they should receive support and assistance in developing their skills to evaluate their readiness for mature sexual relationships. Responsible adolescent intimate relationships, like those of adults, should be based on shared personal values and, if any type of intercourse occurs, should be consensual, nonexploitive, honest, pleasurable, and protected against unintended pregnancies and sexually transmitted diseases (National Commission on Adolescent Sexual Health 1995).

Monica Rodriguez

See also Fathers, Adolescent; Pornography; Prostitution; Same-Sex Relationships; Sexually Transmitted Diseases
References and further reading
Alan Guttmacher Institute. 1994. Sex and America’s Teenagers. New York: Alan Guttmacher Institute.
———. 1999. Facts in Brief: Teen Sex and Pregnancy. New York: Alan Guttmacher Institute.
D’Emilio, John, and Estelle Freedman. 1988. Intimate Matters: A History of Sexuality in America. New York: Harper and Row.
Grunbaum, Jo Anne, Laura Kann, Steven A. Kinchen, James G. Ross, Vani R. Gowda, Janet L. Collins, and Lloyd J. Kolbe. 1999. “Youth Risk Behavior Surveillance—National Alternative High School Youth Risk Behavior Survey, United States, 1998.” Centers for Disease Control and Prevention: MMWR Surveillance Summaries 48, no. SS-7 (October 29).
Kann, Laura, Steven A. Kinchen, Barbara I. Williams, James G. Ross, Richard Lowry, Jo Anne Grunbaum, Lloyd J. Kolbe, and State and Local YRBSS Coordinators. 2000. “Youth Risk Behavior Surveillance—United States, 1999.” Centers for Disease Control and Prevention: MMWR Surveillance Summaries 49, no. SS-5 (June 9).
National Commission on Adolescent Sexual Health. 1995. Facing Facts: Sexual Health for America’s Adolescents. New York: Sexuality Information and Education Council of the United States.
Sexuality Information and Education Council of the United States (SIECUS). 1995. SIECUS Position Statements on Sexuality Issues 1995. New York: SIECUS.
Sonenstein, Freya L., Kellie Stewart, Laura Duberstein Lindberg, Marta Pernas, and Sean Williams. 1997. Involving Males in Preventing Teen Pregnancy: A Guide for Program Planners. Washington, DC: Urban Institute.
Sexually Transmitted Diseases
Sexually transmitted diseases (STDs) are infectious diseases transmitted primarily through sexual contact, although some may also be acquired through contaminated blood products and from mother to
fetus. Physicians coined the term sexually transmitted diseases in the 1970s to eliminate the judgmental overtones of the older term venereal diseases, which was frequently equated with illicit sexual activity. The major STDs are syphilis, gonorrhea, genital herpes, genital warts, chlamydia, and human immunodeficiency virus (HIV), which causes acquired immunodeficiency syndrome (AIDS). Other diseases that may be sexually transmitted include trichomoniasis, bacterial vaginitis, cytomegalovirus infections, scabies, and pubic lice. STDs are the most common infectious diseases in the world and are most prevalent in teenagers and young adults. Despite advances in the treatment of infectious diseases during the twentieth century, control of STDs has been thwarted by views that depict these diseases as moral retribution for socially unacceptable behavior (Brandt 1985). Until the late nineteenth century, physicians who studied STDs tended to focus on adult sufferers. Sexual activity outside marriage was tightly regulated, first by the church and later by the state, and STDs were seen as just punishment for those who violated moral standards (D’Emilio and Freedman 1988). More important, children and adolescents of both sexes were supposed to be asexual. Medical advice literature of the period warned that signs of sexual feelings in the young were an aberration and that parents should actively discourage such feelings by preventing their children from masturbating or reading romantic novels. Adolescent chastity was particularly important to native-born whites of the emerging middle class, who used sexual restraint, along with temperance and other forms of self-control, to distinguish themselves from laborers, immigrants, and African Americans (Moran 2000).
Medical advice literature also responded to what many saw as a disturbing trend in American family structure: growing numbers of adolescent boys were living away from home. The expansion of urban commerce led boys to leave family farms to take jobs as clerks in the nation’s growing urban centers, all of which contained flourishing sex trades. Physicians and social reformers worried that adolescent boys living alone in the city would be enticed into lives of sexual depravity and wrote advice manuals warning that solicitation of prostitutes led to disease and premature death (Cohen 1999). Concerns about the “victimization” of young men by prostitutes prompted the formation of purity leagues, like the New York Magdalen Society, aimed at eliminating the blight of prostitution on the nation’s cities. Reformers also promoted the creation of organizations such as the Young Men’s Christian Association, which attempted to provide wholesome substitutes for prostitution and other urban vices but also inadvertently caused some young men to develop same-sex relationships (Gustav-Wrathall 1998). During the late nineteenth and early twentieth centuries, medical discoveries shifted the focus of disease prevention efforts. The discovery of the microorganisms that caused many STDs enabled scientists to develop diagnostic tests that detected disease in women, who were less likely than men to show external physical symptoms until the disease was well advanced. These new diagnostic procedures disclosed that STDs were much more prevalent than originally assumed and were often the underlying cause of other medical problems such as impotence, sterility, and insanity. Medical advances also drew increasing attention to
so-called innocent victims of these diseases, namely the wives and children of men who frequented prostitutes. Some physicians even made a distinction between “venereal” disease contracted through illicit sexual conduct and “innocent” forms caused by passage of the disease from husband to wife or through casual forms of contact with sufferers who had open sores on the mouth or skin. The New York City dermatologist L. Duncan Bulkley (1894) warned that nonvenereal transmission of syphilis was common and that the disease could be contracted through kissing, shaking hands, breastfeeding, circumcision, sharing utensils and bed linens with infected persons, and smoking cigarettes and cigars made by syphilitic operatives. Today, medical experts recognize that nonsexual forms of transmission are extremely rare and that fears about “innocent” forms of contagion were shaped largely by class, race, and ethnic prejudice of the period. Rates of immigration were increasing during this time, and the birthrates among the white, native-born middle classes were declining, leading to fears that the latter were committing “race suicide” by failing to reproduce in adequate numbers. Detection of STDs in members of the “respectable” white, native-born middle classes exacerbated these fears of race suicide, since physicians recognized that these diseases could lead to sterility, miscarriage, stillbirths, and birth defects in surviving infants. Control of STDs therefore became linked with the eugenics movement of the early twentieth century, which hoped to improve the quality of human racial stock through selective breeding and reforms in public health (Pernick 1996). Eager to explain infection in “respectable” persons, physicians argued
that immigrant workers, particularly food service workers, domestic servants, and cigar and cigarette makers, should be tested for venereal diseases and, if infected, should be forbidden from working in these occupations (Tomes 1998). Physicians also advised against the practice, common in the South, of using black female servants as wet nurses for white infants, since they alleged that syphilis was endemic in the black population (Jones 1993). At times, anti-Semitism is also apparent in warnings about venereal disease: physicians alleged that the practice of ritual circumcision caused higher rates of syphilis in Jews than in gentiles. Yet physicians recognized that sexual contact remained the primary form of transmission and used this fact to advocate the creation of sex education programs in public schools. At the same time, developments in child psychology, most notably the work of G. Stanley Hall and Sigmund Freud, postulated that awakening sexuality was a normal part of child and adolescent development but warned that youth’s sexual impulses had to be channeled away from premature sexual activity. Controlling adolescent sexuality became more problematic during and after World War I, when standards of sexual morality among young people changed dramatically. Although not as drastic as the events of the 1960s, the “sexual revolution” of the 1920s directly challenged Victorian standards of sexual chastity. Changes in the sexual behavior of young people created new challenges for sex educators, as the goals of educating the young about sex and protecting them from premarital sex frequently came into conflict with each other. Opponents of sex education argued that these programs actually encouraged
sexual activity. Proponents countered that scientific information transmitted by qualified educators was better than the myths and folk beliefs that young people acquired from their peers. Sex education advocates warned that boys were especially likely to acquire misinformation about sex, and these advocates were particularly concerned about the common belief among young men that sexual experience was vital to masculine identity. Yet public opposition to sex education in schools led schools to water down their programs or eliminate them entirely. For this reason, young men received conflicting messages about sexuality. In private, many males shared a sexual double standard that linked sexual activity with masculinity and sexual chastity with virtuous womanhood. Since “respectable” women were forbidden to engage in premarital sexual activity, the only way a young man could gain sexual experience before marriage was to visit a prostitute, thereby exposing himself to infection. Psychologists warned that males who did not exhibit interest in heterosexual sexual activity were at risk of becoming homosexuals. At the same time, social conservatives continued to condemn premarital sexuality and view STDs as punishment for violation of social norms. The prevalence of premarital sexual activity among young people continued to climb during the years following World War II. In addition, the discovery of antibiotics dramatically reduced the incidence of STDs. For example, the number of cases of primary and secondary syphilis fell from 66.4 per 100,000 in 1947 to 3.9 per 100,000 in 1957 (Brandt 1985, 171). Yet pockets of infection remained, particularly among populations who lacked access to adequate health care because of
racism, poverty, or geographic isolation. During the late 1960s, rates of STDs among white middle-class youth started to increase as young people from this group dropped out of mainstream society and embraced an alternative, hippie subculture that rejected middle-class norms of sexual behavior. These youths also tended to avoid “establishment” institutions such as hospitals and clinics, where they frequently encountered adults who disapproved of their lifestyle. Those who were under age twenty-one also feared that health care personnel would inform their parents should they seek medical treatment for STDs and would frequently avoid obtaining care until the disease was far advanced. In order to stem the spread of these diseases, specialists in adolescent medicine pushed for laws that would allow minors to obtain medical treatment without parental consent. Physicians at this time also coined the term sexually transmitted disease to eliminate the social stigma associated with the older term venereal disease (Prescott 1998). The appearance of AIDS in the 1980s added new urgency to STD prevention efforts but also reaffirmed older prejudices about this category of disease. Because AIDS was first identified in gay men and intravenous drug users, public response to the disease was initially apathetic and reflected earlier sentiments that those who suffered from this category of disease somehow deserved their fate. Even those who acquired AIDS through nonsexual means were stigmatized. When Ryan White, a thirteen-year-old hemophiliac, contracted AIDS from tainted blood products in 1984, he and his parents were shunned by neighbors in their hometown of Kokomo, Indiana. Ryan’s classmates called him a “fag,” and school officials barred him from attending the
town’s high school. When ostracism turned to violence, the Whites moved to another town where public school officials took a more rational approach to his disease and were able to calm the fears of town residents. Ryan became a national celebrity and helped raise awareness about those living with AIDS, particularly other children and adolescents (White 1991). Yet White’s story also reinforced distinctions between “innocent” victims of AIDS like himself and those who obtained the disease from less socially acceptable means. This double standard has meant that attempts to control the spread of AIDS and other STDs through safe sex and needle exchange programs continue to be problematic because many individuals still believe that these diseases are divine retribution for bad behavior.

Heather Munro Prescott

See also Disease and Death; Prostitution; Same-Sex Relationships; Sexuality

References and further reading
Brandt, Allan M. 1985. No Magic Bullet: A Social History of Venereal Disease in the United States since 1880. New York: Oxford University Press.
Bristow, Nancy K. 1996. Making Men Moral: Social Engineering during the Great War. New York: New York University Press.
Bulkley, L. Duncan. 1894. Syphilis in the Innocent (Syphilis Insontium) Clinically and Historically Considered with a Plan for the Legal Control of the Disease. New York: Bailey and Fairchild.
Cohen, Patricia Cline. 1999. The Murder of Helen Jewett. New York: Vintage.
D’Emilio, John D., and Estelle B. Freedman. 1988. Intimate Matters: A History of Sexuality in America. New York: Harper and Row.
Gustav-Wrathall, John Donald. 1998. Take the Young Stranger by the Hand: Same-Sex Relations and the YMCA. Chicago: University of Chicago Press.
Jones, James. 1993. Bad Blood: The Tuskegee Syphilis Experiment. Rev. ed. New York: Free Press.
Moran, Jeffrey P. 2000. Teaching Sex: The Shaping of Adolescence in the 20th Century. Cambridge, MA: Harvard University Press.
Pernick, Martin S. 1996. The Black Stork: Eugenics and the Death of “Defective” Babies in American Medicine and Motion Pictures since 1915. New York: Oxford University Press.
Prescott, Heather Munro. 1998. “A Doctor of Their Own”: The History of Adolescent Medicine. Cambridge, MA: Harvard University Press.
Tomes, Nancy. 1998. The Gospel of Germs: Men, Women, and the Microbe in American Life. Cambridge, MA: Harvard University Press.
White, Ryan, and Ann Marie Cunningham. 1991. Ryan White: My Own Story. New York: Dial Press.
Siblings
Western civilization has long used the concept of brotherhood to symbolize any relationship in which a strong bond is or should be present (Handel 1985). Brotherhood and, more recently, sisterhood have come to represent relationships built on a deeper commitment to one another, even in difficult times. These terms have come to symbolize both the great strength that can exist in relationships, such as the brotherhood identified among African American males, and the grievous betrayals that can similarly result, as evidenced in the biblical story of Cain and Abel, John Steinbeck’s novel East of Eden, and William Shakespeare’s play Hamlet. Historically, the relationships between siblings, perhaps the most intense many human beings will experience, have come to represent both the best and the worst in how people treat one another. Although children and adolescents may avoid conflict with friends in order to
preserve those relationships, sibling relationships, which do not operate under the same threat of termination, are likely to experience a heightened level of both warmth and conflict. The significance of the sibling bond during the growing years is becoming ever more evident. Early theories of children’s socioemotional development viewed the importance of sibling relationships as limited to the rivalry thought to define the relationships between siblings and their competition for parents’ attention and other resources. Initial research focused on the structural makeup of children’s sibling relationships, reasoning that it was the child’s position in the family relative to his or her brothers and sisters that defined the importance of the sibling experience. Over time, greater consideration has been given to the quality of the relationship between siblings (Furman and Buhrmester 1985) and the family characteristics that promote or hinder that quality (Brody, Stoneman, and McCoy 1992; Dunn and Kendrick 1981). More recently, research about siblings has moved to the examination of specific sibling unions. For example, research is currently under way that focuses on the importance of children’s sibling relationships when one of the siblings is mentally or physically disabled (Mandleco et al. 1998), when one of the siblings is perpetrating incestuous or other abnormal behaviors (Adler and Schutz 1996), and when children are attempting to cope with the death or long-term illness of a sibling (Gallo and Knafl 1993). As a result, many unique qualities of this shared relationship are now being recognized, including the importance of children’s sibling configurations, the ways in which sibling relationships change across childhood and adolescence, parents’ importance to the quality of siblings’ relationships, and the significance of siblings as role models or as protectors against many of the stressful events occurring both within and without the family. Finally, sibling relationships are becoming increasingly recognized for their potential importance to other areas of children’s and adolescents’ lives.

Two brothers playing piggyback (Photodisc)

In general, four characteristics of siblings’ relationships have received particular attention: birth order, age spacing between siblings, number of siblings in a family, and the gender of siblings. Numerous studies have examined siblings’ birth order as a predictor of children’s successful development. Many have argued that being the oldest sibling is generally advantageous because it provides increased opportunities to function as a teacher and model for younger siblings. In
contrast, others have speculated that being born later is advantageous because older siblings can serve as “pacemakers” of desirable behavior, allowing the younger siblings to better judge what they should realistically be able to achieve. In adulthood, firstborn siblings do appear to take on positions of responsibility and authority more often (Wagner, Schubert, and Schubert 1979, 88), but they also have greater anxiety and self-absorption compared to later-born siblings (Cloninger 2000, 120). Siblings provide children with many of the same reciprocal functions found in friendships, including rough-and-tumble and make-believe play, but the age differences typically found between older and younger siblings allow them to also provide complementary functions for one another. The complementary nature of their relationships is particularly evident in the teaching, modeling, helping, and caregiving often provided to later-born siblings by their older brothers or sisters (Dunn 1983, 805). These interactions provide later-born siblings with opportunities to observe and interact with older children while providing the older siblings with the opportunity to develop greater responsibility and skills in teaching and caregiving. It should be pointed out that these attempts at helping, managing, or teaching are not always well given by older siblings or well received by their later-born counterparts. The importance of birth order appears to vary as a function of the age spacing between siblings (Teti, in press, 13). Children who are close in age to their siblings will likely have more intense relationships, exhibiting greater conflict as well as more shared interests and activities. In contrast, when greater age spacing exists, there are increased benefits regarding
their respective roles as teachers and learners, helpers and those being helped. An issue that has received a great deal of attention in conjunction with birth order is the number of siblings in children’s families, particularly as it relates to children’s level of intelligence. A number of studies have concluded that later-born siblings are likely to have lower intelligence than their older brothers and sisters. The logic behind these studies was that larger families produced children with lower intelligence quotients (IQs); thus, having more siblings present in a home when a child was born was thought to result in children growing up in a family context that was increasingly diluted intellectually. However, Joseph Rodgers and his colleagues (2000, 607) recently concluded that it was not the act of being in a large family that lowered children’s IQs; rather, large families appear to be born to parents with lower IQs. A final characteristic important to children’s sibling experience is their gender. Children’s gender indicates how and how much they are likely to interact with their siblings (Teti, in press, 17). In general, relationships among brothers as compared to sisters appear to be largely the same until late childhood and early adolescence, when brothers begin to perceive their relationships as less supportive, warm, and intimate than those with their sisters (Dunn 1996, 40; Furman and Buhrmester 1992, 110–111). Of greatest importance is not brothers versus sisters but whether the gender makeup of sibling pairs is similar or different. Even in the earliest stages of development, same-sex siblings begin to demonstrate more social and less agonistic behavior toward one another when compared to mixed-sex sibling pairs (Dunn and Kendrick 1981, 1271). In addition, later-born children in
mixed-sex sibling pairs are progressively less likely to imitate the older siblings’ behavior, compared to those in same-sex sibling pairs (Pepler, Abramovitch, and Corter 1981, 1346). Gender is also important to the socializing behavior of siblings born earlier; older sisters are more likely than their male counterparts to take on the role of teacher and manager for younger siblings, irrespective of the later-born siblings’ gender (Stoneman, Brody, and MacKinnon 1986, 508). These gender differences appear to have become increasingly important as the average number of children within families has decreased (Mizell and Steelman 2000); in smaller families the importance of characteristics such as gender becomes increasingly pronounced. Whether families are made up largely of boys or girls also appears to have implications for other aspects of family life. Several studies have found that the presence of sons rather than daughters is related to a lower probability of divorce occurring and lower consideration of divorce as an alternative by mothers (see Mizell and Steelman 2000), a difference thought to result from fathers’ increased participation in the family when sons are present. Conversely, mothers of daughters were found to have higher marital satisfaction compared to mothers of sons (Abbott and Brody 1985, 81), which is thought to result from the greater number of household responsibilities girls generally take on relative to boys. Siblings can have an impact on other children in the family from the moment they come into the world. The initial importance of later-born siblings to their older siblings is largely due to the impact they have on the siblings’ previous interaction patterns with parents. When children first become siblings through the
birth of a younger brother or sister, many are initially excited and intrigued by the novelty of having a new baby in the home, but it can also be a time of anxiety and despair as the older child must learn the realities of sharing the parents’ love and attention with the new infant. Parents can do much to ensure that older children feel secure and to prepare them for the changes that will likely occur when the infant finally arrives. As younger siblings move from infancy to early childhood, older siblings’ interactions with their younger brothers or sisters change as the younger children become better able to move about, communicate, and eventually engage in reciprocal play (Dunn and McGuire 1992). As siblings move into middle and late childhood, several interesting characteristics emerge. First, the warmth and conflict present between siblings are not very strongly linked to one another, indicating that the presence of one is not necessarily indicative of the other’s absence (Furman and Buhrmester 1985, 457). Many sibling relationships appear to evidence high levels of both warmth and conflict. During this period of development, sibling relationships persist in demonstrating asymmetry in both positive qualities such as nurturance and admiration and negative qualities such as power and dominance. Also during this period, same-sex sibling relationships have been found to have more warmth and closeness but less conflict compared to opposite-sex sibling pairs of similar age spacing. As siblings move from late childhood to adolescence, there is increasing egalitarianism between them but also a corresponding decline in the level of intensity in the relationship. Duane Buhrmester and Wyndol Furman (1990, 1396) found
that siblings spend less time together during this period of development but otherwise experience little change in intimacy and affection. Others have found a general reduction in both the positive and negative characteristics of siblings’ relationships (Brody, Stoneman, and McCoy 1994, 278). One explanation for these changes accompanying the onset of adolescence may be siblings’ greater need for individualism and autonomy and their expanded opportunities for establishing new friendships, both of which are likely to reduce the intensity of siblings’ relationships with one another. Parents have an enormous effect on children’s sibling relationships. In particular, they are important to siblings’ relationships through their own relationships with each of their children, their approach to parenting the siblings, and the extent to which they treat each sibling differently. Several studies have found that mothers’ and fathers’ relationships with and behavior toward their children are important to the quality of their children’s sibling relationships. Mothers’ and fathers’ parenting behavior is also important to siblings’ interactions with each other, but in different ways. In general, mothers’ greatest impact on children’s sibling relationships results from their use of discipline. When mothers use more intrusive and controlling (Volling and Belsky 1992, 1219) or more punitive disciplinary techniques (Brody, Stoneman, and MacKinnon 1986, 233), siblings appear to be more aggressive toward one another. In contrast, fathers’ importance to siblings’ relationships is based more on the emotional dimensions of their relationships with their children. When fathers exhibit more positive affection and facilitative behavior in their interactions with their children, children evidence
more positive and less conflicted behavior toward their brothers and sisters (Brody, Stoneman, and McCoy 1994, 283; Volling and Belsky 1992, 1219). Other evidence of the importance of parents’ relationships with their children to the children’s sibling relationships has to do with children’s attachments to their parents. Young children and their infant siblings had more positive and less negative sibling interactions when they were identified as being more securely attached to their mothers, a status that is likely to be found when mothers are responsive to their children’s needs (Teti, in press, 28). Parents can also affect the quality of siblings’ relationships when they treat their children differently. However, parents cannot help but respond to their children somewhat differently. Because siblings generally differ in their age and development and often either are born with different temperaments or develop different personalities, parents must adjust their interactions with their children in order to appropriately respond to children’s individual needs and actions. These differences in parental behavior toward siblings are often accepted and even expected by siblings. For example, older brothers and sisters expect their younger siblings to receive more attention from their parents as a result of their lower ability to function independently and their resulting increased dependence on parents. When parents and siblings have been examined together, parents generally directed greater amounts of both positive and negative behavior toward later-born children than their older siblings (Brody, Stoneman, and McCoy 1992, 649; Stocker, Dunn, and Plomin 1989, 725). In general, when parents’ behavior toward their children moves away from this normative level of differential treatment,
more negative emotionality is evidenced between siblings. When parents treat siblings differently, it is likely to significantly affect how children come to view themselves. What is difficult to determine is whether excessive differential treatment of siblings by parents is a cause of siblings’ negativity toward one another or a response to excessive negativity already present between siblings. An important component in understanding parents’ differential treatment of their children is the siblings’ temperaments. When children have been identified as being more temperamentally difficult than their siblings, differences are likely to exist in their relationships with their parents and how their parents manage their behavior. In their attempt to understand the importance of temperament for children’s sibling relationships, Gene Brody, Zolinda Stoneman, and Kelly McCoy (1992, 649) contend that both individual temperaments and the relative differences between siblings’ temperaments must be considered. Parents’ treatment of their children appears to have more to do with relative differences in siblings’ temperaments than with the difficulty of older or younger siblings’ temperaments considered individually. Because of their frequent exposure to one another, siblings can influence one another for both good and ill. This is particularly true for older siblings, who are more likely to function as models, teachers, and helpers for their later-born siblings. Within sibling relationships, children and adolescents are likely to learn the art of compromise and negotiation. If a boy gets mad at a friend, he can go home or even stop being that person’s friend. However, sibling relationships necessitate that children work through problems and come to some type of resolution. In
addition, because of the caregiving opportunities that naturally arise within the sibling relationship, many children are also likely to develop an early sense of empathy, responsibility, and respect for others through interactions with their siblings (Dunn 1983). Siblings do not always set good examples for one another, however: older siblings’ involvement with illicit drugs, alcohol, deviant behavior, and sexual activity has been found to predict younger siblings’ engagement in similar behavior (Conger and Reuter 1996; Rowe and Gulley 1992; Rodgers and Rowe 1988). In assessing implications of sibling relationships, one challenge is to distinguish siblings’ effects on one another from what are merely results of being reared in the same family environment. Siblings can also act as buffers against many of the challenges faced by children and adolescents growing up. For example, Tracey Kempton and her colleagues (1991, 437) found that having a sibling was helpful to adolescents’ efforts to adjust to their parents’ divorce. In a similar manner, Patricia East and Karen Rook (1992, 170) found that many children with low levels of support from school friends had relatively high levels of support within their favorite sibling relationships. They concluded that children who became isolated at school appear to develop greater dependency on their brothers and sisters. Finally, because relationships with siblings are generally permanent, they may provide children and adolescents with a unique socializing context that may serve to enhance their nonfamilial relationships, particularly with peers. For many children and adolescents, friendships may be difficult to maintain. Sibling relationships provide an opportunity for children and adolescents to learn appropriate
relationship behavior. How this learning process occurs, however, is still not completely clear. Clare Stocker and Judy Dunn (1991, 239) contend that greater positive behavior in friendships appears to result from lessons learned in negative sibling experiences. In contrast, McCoy, Brody, and Stoneman (in press, 406) propose that having a difficult temperament may be a major hindrance to establishing positive friendships. They contend that learning how to establish a positive sibling relationship may offset the challenges with friends that may result from having a more difficult temperament. The relationships boys experience with brothers and sisters have historically been viewed as significant, but until recently little consideration was given to the quality of the relationship itself. Most research focused only on children’s different experiences as a result of their different positions (i.e., first, middle, or last born) within the family. As researchers have begun to study the implications of the relationships between siblings, they have discovered a number of new factors important to children’s individual development and to the sibling relationship itself. They have also come to better understand the benefits sibling relationships can provide children during difficult times. Because of their often intense nature, some of the strongest sibling relationships are not those devoid of conflict but often those in which both warmth and conflict are present (Stormshak, Bellanti, and Bierman 1996). As a result, sibling relationships provide an environment in which boys can explore important social skills such as caregiving, compromise, and negotiation but can also be a context in which to learn about competition, dominance, and aggression. Understanding the strengths and challenges inherent in
sibling relationships can provide families with a more harmonious home environment as well as better prepare children and adolescents for their relationships outside the home.

J. Kelly McCoy

References and further reading
Abbott, Douglas A., and Gene H. Brody. 1985. “The Relation of Child Age, Gender, and Number of Children to the Marital Adjustment of Wives.” Journal of Marriage and the Family 47: 77–84.
Adler, Naomi A., and Joseph Schutz. 1996. “Sibling Incest Offenders.” Child Abuse and Neglect 19: 811–819.
Brody, C. J., and L. C. Steelman. 1985. “Sibling Structure and Parental Sex-Typing of Children’s Household Tasks.” Journal of Marriage and the Family 47: 265–273.
Brody, Gene H., Zolinda Stoneman, and Carol MacKinnon. 1986. “Contributions of Maternal Child-rearing Practices and Interactional Contexts to Sibling Interactions.” Journal of Applied Developmental Psychology 7: 225–236.
Brody, Gene H., Zolinda Stoneman, and J. Kelly McCoy. 1992. “Parental Differential Treatment of Siblings and Sibling Differences in Negative Emotionality.” Journal of Marriage and the Family 54: 643–651.
———. 1994. “Contributions of Family Relationships and Child Temperaments to Longitudinal Variations in Sibling Relationship Quality and Sibling Relationship Styles.” Journal of Family Psychology 8: 274–286.
Buhrmester, Duane, and Wyndol Furman. 1990. “Perceptions of Sibling Relationships during Middle Childhood and Adolescence.” Child Development 61: 1387–1398.
Cloninger, Susan C. 2000. Theories of Personality: Understanding Persons. 3d ed. Upper Saddle River, NJ: Prentice-Hall.
Conger, Rand D., and Martha A. Reuter. 1996. “Siblings, Parents and Peers: A Longitudinal Study of Social Influences in Adolescent Risks for Alcohol Use and Abuse.” Pp. 1–30 in Sibling Relationships: Their Causes and
Consequences. Edited by G. H. Brody. Norwood, NJ: Ablex.
Dunn, Judy. 1983. “Sibling Relationships in Early Childhood.” Child Development 54: 787–811.
———. 1996. “Brothers and Sisters in Middle Childhood and Early Adolescence: Continuity and Change in Individual Differences.” Pp. 31–46 in Sibling Relationships: Their Causes and Consequences. Edited by Gene H. Brody. Norwood, NJ: Ablex.
Dunn, Judy, and C. Kendrick. 1981. “Social Behavior of Young Siblings in the Family Context: Differences between Same-Sex and Different-Sex Dyads.” Child Development 52: 1265–1273.
Dunn, Judy, and Shirley McGuire. 1992. “Sibling and Peer Relationships in Childhood.” Journal of Child Psychology and Psychiatry 33: 67–105.
East, Patricia L., and Karen S. Rook. 1992. “Compensatory Patterns of Support among Children’s Peer Relationships: A Test Using School Friends, Nonschool Friends, and Siblings.” Developmental Psychology 28: 163–172.
Furman, Wyndol, and Duane Buhrmester. 1985. “Children’s Perceptions of the Qualities of Sibling Relationships.” Child Development 56: 448–461.
———. 1992. “Age and Sex Differences in Perceptions of Networks of Personal Relationships.” Child Development 63: 103–115.
Gallo, Agatha M., and Kathleen A. Knafl. 1993. “The Effects of Mental Retardation, Disability, and Illness on Sibling Relationships: Research Issues and Challenges.” Pp. 215–234 in Siblings of Children with Chronic Illnesses: A Categorical and Noncategorical Look at Selected Literature. Edited by Zolinda Stoneman and Phyllis Waldman Burman. Baltimore: Paul H. Brookes Publishing.
Handel, Gerald. 1985. “Central Issues in the Construction of Sibling Relationships.” Pp. 493–523 in The Psychosocial Interior of the Family. Edited by Gerald Handel. New York: Aldine de Gruyter.
Kempton, Tracey, Lisa Armistead, Michelle Wierson, and Rex Forehand. 1991. “Presence of a Sibling as a Potential Buffer Following Parental Divorce: An Examination of Young
Adolescents.” Journal of Clinical Child Psychology 20: 434–438.
Mandleco, Barbara L., Susanne F. Olsen, Clyde C. Robinson, Elaine S. Marshall, and Mary K. McNeilly-Choque. 1998. “Social Skills and Peer Relationships of Siblings of Children with Disabilities.” Pp. 106–120 in Children’s Peer Relations. Edited by P. T. Slee and K. Rigby. New York: Routledge.
McCoy, J. Kelly, Gene H. Brody, and Zolinda Stoneman. In press. “Temperament and the Quality of Youths’ Best Friendships: Do Sibling and Parent-Child Relationships Make a Difference?”
Mizell, C. Andre, and Lala C. Steelman. 2000. “All My Children: The Consequences of Sibling Group Characteristics on the Marital Happiness of Young Mothers.” Journal of Family Issues 21: 858–887.
Pepler, Deborah J., Rona Abramovitch, and Carl Corter. 1981. “Sibling Interaction in the Home: A Longitudinal Study.” Child Development 52: 1344–1347.
Rodgers, Joseph L., H. Harrington Cleveland, Edwin van den Oord, and David C. Rowe. 2000. “Resolving the Debate over Birth Order, Family Size, and Intelligence.” American Psychologist 55: 599–612.
Rodgers, Joseph L., and David C. Rowe. 1988. “Influence of Siblings on Adolescent Sexual Behavior.” Developmental Psychology 24: 722–728.
Rowe, David C., and Bill L. Gulley. 1992. “Sibling Effects on Substance Use and Delinquency.” Criminology 30: 217–233.
Stocker, Clare, and Judy Dunn. 1991. “Sibling Relationships in Childhood: Links with Friendships and Peer Relationships.” British Journal of Developmental Psychology 8: 227–244.
Stocker, Clare, Judy Dunn, and Robert Plomin. 1989. “Sibling Relationships: Links with Child Temperament, Maternal Behavior, and Family Structure.” Child Development 60: 715–727.
Stoneman, Zolinda, Gene H. Brody, and Carol MacKinnon. 1986. “Same-sex and Cross-sex Siblings: Activity Choices, Roles, Behavior, and Gender Stereotypes.” Sex Roles 15: 495–511.
Stormshak, Elizabeth A., Christina J. Bellanti, and Karen L. Bierman. 1996. “The Quality of Sibling Relationships and the Development of Social Competence and Behavioral Control in Aggressive Children.” Developmental Psychology 32: 79–89.
Teti, Douglas M. In press. “Sibling Relationships.” In Interiors: Retrospect and Prospect in the Psychological Study of Families. Edited by J. McHale and W. Grolnick. Mahwah, NJ: Erlbaum.
Volling, B. L., and Jay Belsky. 1992. “The Contribution of Mother-Child and Father-Child Relationships to the Quality of Sibling Interaction: A Longitudinal Study.” Child Development 63: 1209–1222.
Wagner, Mazie E., Herman J. P. Schubert, and Daniel S. P. Schubert. 1979. “Sibship-Constellation Effects on Psychological Development, Creativity, and Health.” Advances in Child Development and Behavior 14: 57–148.
Skateboarding
Skateboarding is not just a sport; it is a mind-set, an attitude, an industry, and a cultural phenomenon. Many believe that the skateboard was invented by people in the surfing business approximately forty years ago. Naturally, surfers want to surf, even when weather conditions are poor. Until the 1950s, surfing was exclusively a male sport, and many boys wanted to find an alternative to surfing waves when the ocean was flat. Some give credit to Bill Richards, who opened a California surf shop in 1962. Bill and his son Mark made a deal with the Chicago Roller Skate Company to produce sets of skate wheels. These were mounted to square wooden boards, and the new craze of “sidewalk surfing” was born. When clay wheels entered the picture, this new boys’ sport really started to roll. In 1965, the first National Skateboard Championships aired on ABC’s Wide World of Sports, and skateboarding was even featured on the cover of Life magazine that same year. These days,
skateboarding is all about boys performing urban acrobatics and radical moves like “riding the rails,” “pulling ollies,” and “50-50 grinds.” Skateboarders have created their own subculture that includes a unique style of dress, a “thrasher” mind-set, and a unique vocabulary. According to the latest statistics, skateboarding is the sixth largest participatory sport in the United States, with more than 6 million “terra surfers” in the country. And true to its birthplace, almost half of them live in California. Skateboarding has a rich history filled with innovation and creativity. The first types of skateboards were actually more like scooters, which date back to the early 1900s. Some say the very first skateboard was simply a scooter with the push bar broken off. Over the next fifty years, boys changed the look of the scooter and took off the milk crate; they dismantled roller skates and nailed them onto two-by-four planks of wood. By the late 1950s, surfing was becoming increasingly popular, and boys began to think of cruising on these pieces of plywood as “sidewalk surfing.” The first commercially made skateboards hit the marketplace in 1959. The birth of the commercial skateboard industry brought new and exciting advancements in both the board and in the wheels used. Clay wheels made the ride much smoother and provided improved traction over steel wheels, so new tricks became possible. In the early 1960s companies such as Makaha and Hobie sold more than 50 million boards within a three-year period. Skateboarding had become a wildly popular sport for boys almost overnight. On October 6, 1962, the first retail skateboard shop opened its doors in North Hollywood, California. The shop was called Val Surf and was
Skateboarding marily a surf shop, but owner Bill Richards and his two sons, Mark, age fifteen, and Kurt, age eighteen, began to realize that there might be a real market for the skateboard. Val Surf joined up with surfboard maker Hobie Alter, whose name was synonymous with surf culture, and together they designed a line of skateboards. Hobie was the first to come out with a pressure-molded fiberglass skateboard. These boards became popular with boys very quickly. Then, suddenly, skateboarding entered its first slump when the fad almost completely died out in the fall of 1965. ABC’s Wide World of Sports covered the first national skateboarding championships, held at La Palma Stadium in Anaheim, California, in 1965. They were also covered by CBS and NBC. Skateboarding was then featured on the cover of Life magazine on May 14, 1965, and inside, the skateboard was described as an exhilarating and dangerous joyriding device similar to a hotrod car. The backlash from all the hypedup media coverage was immediate. The American Medical Association declared that skateboards were the newest medical menace. A group of safety experts announced that skateboarding was unsafe, urged stores not to sell skateboards, and advised parents not to buy them for their sons. Skateboarding would experience a series of peaks and valleys over the next forty years, with a major slump about every ten years. That first slump was due in part to the reckless riding of skateboarders themselves, but it was also caused by an inferior product. Skateboard manufacturers were in a hurry to get their first boards out on the market, so they had done very little research, and only minimal effort was put into designing a safe skateboard. Manufacturers simply replaced the squeaky steel roller-skate
Teenager skateboarding on a ramp, 1996 (Duomo/Corbis)
wheels with a quieter, smoother clay wheel and made a few refinements to the devices that held the wheels onto the board (called “trucks”). However, clay wheels still did not grip concrete roads very well, and boys of all ages across the nation were taking some very nasty falls. Cities started to ban skateboards in response to safety and health concerns, especially after a few boys were fatally injured. Skateboarding virtually disappeared from public view. The first generation of skateboarders had established the foundation of techniques, tricks, and style, even though these boys and young men were severely limited by poor equipment. In 1969,
Richard Stevenson of Los Angeles, California, received a patent for a skateboard with a tail. He designed this “kicktail” for technical reasons. He added an upward curve at the back of the skateboard in order to make the board more maneuverable, more like a real surfboard. This innovation earned him the title of “father of the skateboard.” Two years later a man named Frank Nasworthy reinvented the wheel—he actually designed a new polyurethane wheel for roller skates, but it worked beautifully on skateboards as well. By 1973 skateboarding was fully revived, and sidewalk surfers were riding the “second wave” on a board with a new shape and urethane wheels. The urethane wheel would completely revolutionize the sport; it provided much better traction and speed and allowed skateboard enthusiasts to develop new tricks and much more difficult maneuvers. Skateboards increased in width from 6–7 inches to more than 9 inches, which improved stability on vertical surfaces. Dozens of board manufacturers were now putting graphics on the undersides of their boards. Skateboarding stickers were also wildly popular with boys. Vertical skating in empty swimming pools would soon become a mainstay of the sport, along with acrobatics in cylindrical pipes. SkateBoarder magazine was back on the rack, and the sport was on a roll once again. In 1976, the first modern outdoor skateboard parks were built in Florida and in California, followed by hundreds of other parks all over the United States. The “tamer” horizontal and slalom styles of skateboarding were replaced with the more popular vertical, aerial, and acrobatic maneuvers. Although a number of key skateboard tricks have been developed over the
years, the “ollie” has dominated skateboarding for the past twenty years. Alan Gelfand, nicknamed “Ollie,” developed the maneuver in Florida in the late 1970s. A year after he started skateboarding at the age of twelve, he decided to experiment with lip slides. He realized he could achieve a small amount of “air” if he popped his board while doing the lip slide maneuver. The Ollie “pop” finally became the “ollie” aerial that is now famous. This move involves tapping the tail of the board down while jumping in the air and kicking the front foot forward. Proper execution results in the board jumping in the air with the skater; the board should stay directly under the skater’s feet for a clean landing. This trick allowed a completely new type of skateboarding to evolve called “street skating.” When street skating, boys perform tricks on, over, or against obstacles in and near streets. The ollie became the fundamental trick of modern-day skateboarding and allowed boys to fly over stairs, benches, rails, low walls, and other objects. Many believe that the ollie is the single greatest maneuver ever invented. Some estimate that the ollie became the foundation or “building block” for 80 percent of the street tricks and about 60 percent of the vertical tricks performed by boys. In just two years during the late 1970s, more than 40 million skateboards were purchased in the United States. More than 300 parks designed exclusively for skateboarding opened across the country, and boys also skated in empty swimming pools, on sloping concrete surfaces, or any other place they could find. However, skating’s old enemy—safety—was once again a major concern. Insurance became so expensive that many skateboard park owners were forced to shut down, and
most parks were bulldozed. By the end of 1980 skateboarding had died its second death, and many boys abandoned the sport altogether. By 1983, legal issues were being dealt with in a serious and innovative manner, and skate parks were being revived. In October 1984, the Bones Brigade video premiered. Produced by the Powell Peralta skateboard corporation, it would become one of the main factors in revitalizing skateboarding. The video featured Lance Mountain as the official host traveling to different skate locations to check out the action. Members of the Bones Brigade included such key skaters as Ray “Bones” Rodriguez, Steve Caballero, Alan “Ollie” Gelfand, and Mike McGill. These young men had a unique combination of talent, skill, style, grace, and charisma. They were featured riding on backyard ramps, in pools and skate parks, and down steep hills at exhilarating speeds. The video would go on to sell some 30,000 copies, and skateboarding was soon riding its “third wave.” Vertical (or “vert”) riding took off in 1984, and launch ramps became popular. Toward the end of the 1980s, street skating was in vogue. With a focus on technical tricks and “pulling ollies,” skateboarding took on a whole new attitude. During this era, such magazines as Thrasher and Poweredge began publication. Powell Peralta went on to produce two other videos and was a dominant influence in lifting skateboarding to even higher altitudes as a tremendously popular boys’ sport. In 1991, a worldwide recession hit, and the skateboard industry was not spared. Manufacturers experienced huge financial losses. But as before, dedicated boys and young men would help the sport to survive and would reinvent skateboarding once again.
Skateboarding is currently riding its “fourth wave.” Skateboarders wear XX-large shirts, and shorts fall well below the knee. The street sport of skateboarding has become a subculture for boys, complete with cult heroes and legends. The growth of cable and satellite television, computers, and the Internet has led to a greater worldwide awareness of skateboarding. A number of competitive skateboarding events in the 1990s provided exposure for the sport. In 1991, the National Skateboarding Association, which had been founded in 1981, held competitions in France, Germany, Spain, and the United States. In 1995 the World Cup Skateboarding organization became the leading organizer of skateboarding contests. The same year, ESPN2’s “Extreme Games” brought skateboarding into living rooms across the country, and it was selected as a featured sport in the 1996 Olympics opening ceremonies. It became quite common to see boys participating in skateboarding competitions on television, although the sport had to share the limelight with rollerblading and snowboarding events. Apparently, the popularity of skateboarding was further enhanced by the development of snowboards, which were closely associated with skateboards; many boys participated in both sports. At the end of the 1990s, the focus of many skateboarders was still street style, and the industry included numerous manufacturers, marketers, and sponsors. Professional skaters often went on to develop their own product lines and manage their own companies. It appears that skateboarding may be riding a “permanent wave” because of the enthusiasm of dedicated skaters who keep the sport alive. Because of the efforts of the International Association of Skateboard Companies, more parks are scheduled for
construction in other states in the first few years of the next millennium. It is even possible to visit one of three skateboard museums: Skatopia, located in Rutland, Ohio; the Huntington Beach International Skate and Surf Museum, located in Huntington Beach, California; and the Skatelab Museum, located in Simi Valley, California (www.skatelab.com). The Skatelab museum boasts a very large collection of skateboards, skateboard products, and memorabilia. The display also includes scooters from the 1930s, the precursor to the modern skateboard. Skateboarding is a sport worthy of serious attention, and it has brought countless hours of fun and exhilaration to the boys who skate and to spectators as well. Pure and simple, skateboarding is the positive release of explosive energy; it is pure fun, excitement, and exhilaration, sometimes scary, sometimes mellow, and sometimes gnarly. It is spontaneous art and choreographed discipline. It is freedom and self-expression. It is here to stay.

Robin D. Mittelstaedt

References and further reading
Brooke, Michael. 1999. The Concrete Wave: The History of Skateboarding. Toronto, Ont.: Warwick.
Cassorla, Albert. 1976. The Skateboarder’s Bible. Philadelphia: Running Press.
Davidson, Ben. 1976. The Skateboard Book. New York: Grosset and Dunlap.
Dixon, Pahl, and Peter Dixon. 1977. Hot Skateboarding. New York: Warner Books.
Weir, La Vada. 1977. Skateboards and Skateboarding. New York: Pocket Books.
Skiing

Immigrants from the Scandinavian countries and Finland brought skiing to the United States in the mid-nineteenth cen-
tury, but boys’ skiing should be seen in the wider context of winter recreation, including sledding and snowshoeing. Until the 1940s, boys who skied lived chiefly in New England, the upper Midwest, the Rocky Mountains, or the Sierra Nevada. From the 1850s to the early 1900s, skiing was primarily utilitarian, a way of getting to school or work when the snow was deep. A few skiing clubs in California sponsored downhill races in the 1850s, and skiers in the Midwest specialized in ski jumping. The renown of the Norwegian explorer Fridtjof Nansen, who crossed Greenland on skis in 1888 and wrote a popular book about it, is credited with inspiring ski clubs throughout northern Europe, Canada, and the United States that promoted good health, fellowship, and amateur competition. At about the same time, English sportsmen were discovering the challenge of downhill skiing in the Alps and introducing it to Americans in New England. By 1905, when the National Ski Association (NSA) was founded to regulate and promote the sport in the United States, the stage was set for the mass marketing of what had been an immigrant pastime. Skiing grew slowly but steadily in the years between 1900 and 1945. Two distinct types of skiing emerged, Nordic (cross-country) and Alpine (downhill and slalom), but in the years before mechanical lifts, all skiing involved climbing uphill. Ski clubs sprang up around the country, and many colleges followed Dartmouth College’s lead in holding winter carnivals and sponsoring ski races. When the first Olympic winter games were held in Chamonix, France, in 1924, the U.S. team was sponsored by the Minneapolis Ski Club and the NSA, and all but one member was a Scandinavian immigrant. The only competitions were
Nordic style, jumping, and cross-country, but that was about to change. British sportsman Arnold Lunn had organized the first slalom race in 1922, and in 1925 a slalom race was held as part of the Dartmouth winter carnival. It was won by Charlie Proctor, the first native-born American to become a skiing superstar. Proctor was born in 1906 and grew up in Hanover, New Hampshire, where his father taught physics at Dartmouth College. He got his first pair of skis at the age of four, and by age eleven he was winning the boys’ ski jump competitions. He also competed in cross-country races, representing the United States in both events in the 1928 Olympics at St. Moritz, Switzerland. He later developed ski resorts and ski runs from New England to Sun Valley, Idaho, established ski schools, and designed and manufactured ski equipment. Proctor was hardly alone. All across the country boys were taking up skiing, competing in races, and soon finding ways to make money as ski instructors or ski resort developers. Wayne Poulsen, who helped popularize Squaw Valley, site of the 1960 Winter Olympics, made his first pair of skis at age eleven in 1926. In 1932 he took third place in the junior division of the National Ski Jumping Championships in Tahoe City, California. Proctor and Poulsen were part of the new generation that was helping to create the basis for the post–World War II boom in recreational skiing. Their enthusiasm and the glamour associated with winning ski jumps and races were reflected in illustrated articles in magazines for boys such as The Youth’s Companion and Boys’ Companion: The Magazine for Boy Building. In January 1913, Frank Merriwell, the athletic hero of Gilbert Patten’s (Burt L. Standish’s) popular novels for boys, appeared in a ski-
A little boy skiing in Colorado, 2000 (Bob Winsett/Corbis)
ing adventure published in New Tip Top Weekly. The mechanization of skiing was also important to its increase in popularity. In the early 1930s, railroads in New England began to run special trains for skiers from Boston and other cities to Maine, Vermont, and New Hampshire ski hotels. In 1936 the first European-style luxury ski resort opened in Sun Valley, Idaho, its clientele arriving exclusively on the Union Pacific Railroad. Moreover, Sun Valley had the world’s first chairlift to take skiers to the top of a mountain. The first rope tows had been built in Switzerland, Canada, and the United States a few years earlier. By the late 1930s, the number of American
skiers was increasing rapidly, but the entry of the United States into World War II in 1941 put restrictions on recreational travel and slowed the expansion of skiing for a few years. By the 1950s, however, skiing had emerged as one of the most popular leisure activities, and families began making regular winter trips to new or expanded ski resorts such as Aspen, Colorado; Sugarloaf, Maine; and Taos, New Mexico. Since the 1950s, skiing has become a recognizable lifestyle with an emphasis on luxurious living, high-tech gear, and designer clothing, a far cry from a boy’s homemade hickory or pine boards with leather straps. Why and how skiing became the symbol and substance of late-twentieth-century opulence is a complex and often disheartening story, partially told in Annie Gilbert Coleman’s insightful essay, “The Unbearable Whiteness of Skiing.” Through their Alpine-style architecture and advertising, many American ski resorts promote a myth of European aristocratic living and racial purity. As ski areas have become increasingly dependent on technology, from mechanical lifts rising through deforested mountainsides, to machine-made snow, to mechanical snow grooming, to chemical treatments to preserve snow on trails and remove it from parking lots, some skiers have returned to simpler, more environmentally friendly styles of skiing. Others, as Coleman points out, have made efforts to introduce racial and ethnic minority children to the fun of skiing. Although few boys today experience the pleasure of setting off from their family’s farm on homemade skis, many boys still complete the cycle of boyhood from child to youth by progressing from one winter sport to another. Considering skiing in the context of other winter activities helps place it in boy culture.
Although the ski industry claims that a child who can walk can ski, most boys experience snow first on sleds, and sledding preceded skiing as a boy’s winter activity in the nineteenth century. The appeal of sledding, or “coasting” as it was once called, is the risk involved in lying prone, face forward in the style called “belly-bump” as the sled careens down a steep hill. An added thrill comes from endangering passersby. Most of the paintings of boys sledding in the 1840s and 1850s show sledders terrifying pedestrians and tumbling into snow. Emily Dickinson captured such moments perfectly:

Glass was the Street—in tinsel Peril
Tree and Traveller stood—
Filled was the Air with merry venture
Hearty with Boys the Road—

Shot the lithe Sleds like shod vibrations
Emphasized and gone
It is the Past’s supreme italic
Makes this Present mean—
(Johnson 1970, 630)
Homemade sleds were often little more than boards the width of a child with barrel-stave runners nailed to each side, yet when the first manufactured sleds appeared in the 1840s they were frequently considered inferior. One reason is the eternal struggle between boys and adults over safety. In an 1859 patent application for a sled, B. P. Crandall claimed that medical authorities condemned the practice of lying on the stomach while sledding, so he designed a sled that prevented it. However, the design was not popular. Toboggans were favored in some regions, especially after the success of toboggan rides at the Montreal Winter Carnival in 1883, but the invention of the “Flexible Flyer” sled by Samuel Allen in 1889 made the model synonymous with sledding and
became the favorite of boys throughout the country. Allen’s sled could be steered using either feet or hands, facilitating both the style approved by adults and the boys’ own choice. Moreover, the Flexible Flyer was painted with an American eagle holding arrows in both talons, continuing the martial spirit of mid-nineteenth-century boys who named their sleds “General Grant,” “Flying Cloud,” and “Young America” and painted them with pictures of Indians, horses, and flags. The 1950s saw the introduction of metal and plastic saucers, popular with some because they were difficult to steer. Many different styles of plastic sleds have been on the market since the 1970s, most of them designed for small children sledding in soft snow. For speed on packed, icy slopes, the Flexible Flyer still rules. Regional variations in over-the-snow transport may still be found. In Minnesota, boys made “bumpers” from apple barrel staves cleated together at both ends with a rope attached to the front cleat. The rider stood sidewise with a boot braced against each cleat, holding the rope in one hand and a pole for pushing, balancing, and steering in the other. In French Canada, boys made and rode pite or tapeculs, a kind of one-ski scooter. Canadian boys in general preferred snowshoeing, and boys in the United States used snowshoes more than skis until the early twentieth century. Skis were called “Norwegian snowshoes” for many years after their introduction. In his 1882 handicraft guide The American Boy’s Handy Book, Daniel Beard instructed boys on how to build a toboggan and skis, and The Young Folks’ Cyclopedia of Games and Sports (1890) devoted a separate entry to snowshoes but placed skis in a paragraph within its entry on skating. Rivalries between boys on snowshoes and boys on
skis continued well into the twentieth century. Henry Ives Baldwin, later chief forester of New Hampshire, began skiing in 1908 and used his single pole with a hook on one end to catch rides behind delivery wagons. He was especially proud of being able to defeat boys on snowshoes in cross-country races. In his boys’ adventure novel, Ski Patrol, the avalanche expert Montgomery Atwater describes a thrilling 40-mile chase and capture of a game poacher on snowshoes by a boy on skis. Sleds, skis, and snowshoes appear together in the pages of toy and sporting goods catalogs from the 1920s on. Shoppers at FAO Schwarz in 1952 had a choice of pine or maple ski sets in four lengths from 4 to 6 feet, with toe clamp or cable binding. Seven years later, parents could buy “sidewalk” skis in even shorter lengths for children as young as two. According to ski industry statistics, there was a significant increase in the number of skiers under eighteen years of age in the 1960s. Boys’ skiing had expanded beyond the northern states as winter vacations allowed families to visit ski areas regularly. As skiing developed among boys, new styles emerged. Freestyle skiing, involving aerial acrobatics off small jumps, began attracting boys in the 1950s, and by 1988 it was an event in the Winter Olympics. By that time, however, boys had been lured away from skiing by snowboarding. Invented in 1965, the snowboard attracted boys already familiar with surfboards and skateboards. The broad single plank allowed the skillful to perform astonishing maneuvers and acrobatics on ski slopes. National competitions began in 1980, and snowboarding made its Olympic debut at Nagano, Japan, in 1998. Snowboarders increasingly annoy skiers at resorts with their apparent recklessness, disregard of
fashion, and colorful slang. Snowboarding seems to have restored boy culture to the ski slopes.

Bernard Mergen

References and further reading
Allen, E. John B. 1993. From Skisport to Skiing: One Hundred Years of American Sport, 1849–1940. Amherst: University of Massachusetts Press.
Atwater, Montgomery M. 1943. Ski Patrol. New York: Random House.
Baldwin, Henry Ives. 1989. The Skiing Life. Concord, NH: Evans Printing.
Coleman, Annie Gilbert. 1996. “The Unbearable Whiteness of Skiing.” Pacific Historical Review 65 (November): 583–614.
Johnson, Thomas H., ed. 1970. The Complete Poems of Emily Dickinson. London: Faber and Faber.
Mergen, Bernard. 1997. Snow in America. Washington, DC: Smithsonian Institution Press.
Skiing Heritage: Journal of the International Skiing History Association. 1989– . Quarterly. 499 Town Hill Road, New Hartford, CT 06057.
Slave Trade

In the nineteenth-century United States, the domestic slave trade was a constant presence in nearly every enslaved boy’s life. Millions of young men found themselves torn from their families and friends against their will or saw their parents or siblings taken from them by this process. The trauma of such sales could be devastating, and many boys vividly remembered decades later the effect that it had upon their lives. The majority of enslaved boys probably never stood upon an auction block, nor were they marched overland to the deep South in a manacled slave coffle. Still, the threat of sale was pervasive, and even those who
did not become personally involved in the slave trade never knew for sure whether or not they would be sold one day and forcibly taken away from the ones they loved. Although the buying and selling of human beings had always been a part of American society, the nature of this traffic changed over time. In the colonial period, most slaves sold in British North America were individuals imported from Africa or the West Indies. Following the American Revolution, however, this changed, especially after the closing of the African trade in 1808. After that date, all Americans who wanted to purchase slaves would have to buy them from among the existing slave population within the United States. The invention of the cotton gin in the 1790s also led to an increased demand for slaves as white southerners began expanding into the old Southwest to grow that lucrative crop. The result was the emergence of an indigenous, or domestic, slave trade by the early nineteenth century that transported thousands of enslaved men and women from the upper South (which had a supposed surplus of slaves) to the lower South (where slaves were in great demand) each year. It has been estimated that between 1790 and 1860, Americans transported more than 1 million African American slaves from the upper South to the lower South; approximately two-thirds of these slaves arrived there as a result of sale. Moreover, twice as many individuals were sold locally (Deyle 1995). Therefore, by the nineteenth century, this domestic slave trade had certainly become a most common form of commerce in the Old South. Enslaved boys made up a large part of this domestic slave trade. For one thing, a large percentage of the nineteenth-cen-
Slave families lived with the threat of separation through sale. South Carolina plantation, nineteenth century. (Library of Congress)
tury American slave population were children. More than two-fifths of antebellum slaves were younger than age fifteen, and one-third were younger than age ten (Schwartz 2000). Also, even though enslaved girls were often sold at an earlier age than boys because girls tended to mature sooner, boys sold at somewhat higher prices and were frequently in greater demand because of their labor potential. It is impossible to know exactly how many enslaved boys were actually sold, but one recent study (Tadman 1989, 45) claims that between 1820 and 1860 at least 10 percent of all teenagers in the upper South became commodities in the interregional trade, and more than twice that many were sold locally. It is quite likely that similar rates
of sale, if not even higher, occurred throughout the South. Most males sold into the interregional slave trade were between the ages of fifteen and twenty-five, but boys much younger could always be found. This proved especially true during times of economic prosperity when demand for slaves was greater. In general, though, extremely young children were usually sold with their mothers. They were easier to sell that way, and most buyers did not want to raise a very young child for several years before getting a positive return on their investment. Still, it was not unusual for boys eight years old or even younger to find themselves sold into the slave trade. In part to counter complaints against such practices from northerners,
a few southern states passed laws prohibiting the sale of children under the age of ten without their mothers. Unfortunately, these laws were easily skirted and often not enforced. Moreover, they did nothing to stop the even greater number of sales to owners within the same state. Therefore, some very young enslaved boys found themselves being sold frequently and for a variety of reasons. Louis Hughes was born in 1832 and encountered his first sale at the age of six. Within the next six years he would be sold three more times to different owners within his native state of Virginia before being placed on an auction block in Richmond and sold to a planter from Mississippi at the age of twelve. Unlike many adult slaves who were sometimes sold as punishment for a supposed infraction, most enslaved boys were usually sold because their owner needed cash. Ironically, in some cases enslaved black children were sold to help finance the education of their owner’s privileged white children or help in other ways to support their owner’s lavish lifestyle. In Georgia, the enslaved boy John Brown was sold at the age of ten to help finance the construction costs of a new plantation house for his owner (Davis 1993). Another frequent reason for the sale of enslaved boys was legal action to settle the debts of their owners. One recent study in South Carolina found that roughly half of all the slave sales in that state between 1820 and 1860 were the result of some form of court action. Among the most common sales based upon legal action were those to settle estates. Nothing worried enslaved Americans more than the death of their owners, since that often meant that sales would occur to divide up and settle their estates. As the former slave Frederick Doug-
lass noted, the death of an owner was a time of “high excitement and distressing anxiety” for most American slaves. Sometimes the slaves were split up among their former owner’s descendants, but it was also common to sell the slaves individually at public auction to bring the biggest return for the estate. Therefore, at such times it was not unusual for boys as young as four or five to be sold away from their family and friends for the rest of their lives. Naturally, most enslaved boys and their families tried to resist such sales as best they could. In order to protect her children from being sold, Moses Grandy’s mother used to hide him and his brothers and sisters in the woods in North Carolina until she got a promise from her owner that he would not sell them. The former slave Henry Watson also remembered how whenever a strange white man arrived on his plantation in Virginia, he and the other slave children would run and “hide ourselves until the man had gone” (Webber 1978, 187). Parents also frequently put up a fight and protested vehemently when owners attempted to take their children away from them through sale. In fact, such scenes were so common that many owners resorted to deceptive tactics when selling children, such as carrying out the transaction when the parents were absent or claiming they were just hiring out the boy for a short time period when they were actually selling him for life. For their part, enslaved boys also often tried to resist a sale. One former Virginia slave remembered how “young’uns fout and kick lak crazy folks” when they were placed on the auction block, while others used other ploys to negotiate a sale. At the age of nine, Ambrose Headen was forced to leave his family in North Car-
olina, walk 14 miles to a slave market, and place himself upon an auction block. After three hours of physical inspection and intense bidding, he was sold to a local planter who was known for his cruelty. Headen began crying uncontrollably until the planter, at the urging of others, resold the young boy to another buyer, just as he had hoped (Schwartz 2000, 171; Davis 1993, 673). Although some individuals, such as Headen, were able to alter a sale to their liking or even negate an undesirable one by running away or causing a scene, for most, there was little that could be done to prevent this ever-present reality of life for enslaved Americans. After the humiliation of the auction block, those boys sold in the interregional trade were usually manacled to other slaves in a long coffle and forced to march overland to their destination in the deep South. For most slaves from the upper South, such a fate was often considered worse than death because of the deep South’s frontier conditions, subtropical climate, rampant disease, and brutal labor demands. And, of course, for young boys it meant being thrown into a totally alien environment and forced to survive without the comforts and resources of their family and friends. Naturally, such an event had a tremendous impact upon the boys who found themselves caught up in it, and it was something that most of them remembered for the rest of their lives. In fact, for many, it was the moment when they first came to realize that they were slaves and that they and their families could be treated differently from the white people around them. After being sold at the age of four, the Maryland slave Charles Ball had to watch as his new owner whipped his mother when she tried to plead with
the man not to take her young child away. More than fifty years later Ball admitted that “the terrors of the scene return with painful vividness upon my memory” (Ball 1969, 18). For most enslaved boys, especially those sold into the interregional trade, sale meant that they would almost certainly never see their families or friends again, and in many respects, it brought the same type of finality as death. Yet most continued to remember their families and homes, if only in their dreams. After being sold from his family in Virginia, Lewis Clarke noted how his “thoughts continually by day and my dreams by night were of mother and home.” And the former South Carolina slave Caleb Craig later in life acknowledged that he still had “visions and dreams” of his mother “in my sleep, sometime yet” (Webber 1978, 113; Jones 1990, 43). Although the majority of enslaved boys never experienced the trauma of sale personally, the slave trade still had a constant effect upon their lives. For one thing, even if they were never sold themselves, it is quite likely that most enslaved boys would have had other family members or friends sold away from them, and it is hard to overemphasize the impact that having a parent sold away could have upon a young boy’s life. In addition, most enslaved boys at one time or another witnessed the slave trade and heard stories about its operations and effects. For those living in southern cities or near county courthouses, slave auctions were common, and both black and white southerners were present at their proceedings. Other boys lived near country roads and frequently observed the many slave coffles as traders marched their victims, including many children, toward the slave markets of the deep
South. One former slave from Texas, Calvin Moye, was constantly in fear of the many slave traders who passed by his place: “Dey was lots of dem speculators coming by de road in front of de plantation, and ever’ time I see dem coming, cold chills run over me till I see dem go on by our lane” (Reinier 1996, 173). Finally, the slave quarters were always full of tales of child snatching and kidnappers enticing little children into their wagons with trinkets and food. Although such stories were often based more on suspicion than on reality, cases of slave kidnapping were common enough to give credence to these fears. These stories exacerbated the feelings of fear already present within many enslaved families and also helped to keep numerous young boys from wandering too far away from home. Therefore, although it is true that most small children were probably not sold away from their parents, fears of such an event still influenced most young boys, and the slave trade had other ways of playing a powerful role in their lives.

Steven Deyle

See also African American Boys; Slavery

References and further reading
Ball, Charles. 1969. Slavery in the United States. 1837. Reprint, New York: Negro Universities Press.
Davis, Jack E. 1993. “Changing Places: Slave Movement in the South.” The Historian 55 (Summer): 657–676.
Deyle, Steven. 1995. “The Domestic Slave Trade in America.” Ph.D. diss., Columbia University.
Johnson, Walter. 1999. Soul by Soul: Life inside the Antebellum Slave Market. Cambridge, MA: Harvard University Press.
Jones, Norrece T., Jr. 1990. Born a Child of Freedom, Yet a Slave: Mechanisms of Control and Strategies of Resistance in Antebellum South Carolina. Hanover, NH: University Press of New England.
King, Wilma. 1995. Stolen Childhood: Slave Youth in Nineteenth-Century America. Bloomington: Indiana University Press.
Reinier, Jacqueline S. 1996. From Virtue to Character: American Childhood, 1775–1850. New York: Twayne Publishers.
Schwartz, Marie Jenkins. 2000. Born in Bondage: Growing Up Enslaved in the Antebellum South. Cambridge, MA: Harvard University Press.
Tadman, Michael. 1989. Speculators and Slaves: Masters, Traders, and Slaves in the Old South. Madison: University of Wisconsin Press.
Webber, Thomas L. 1978. Deep Like the Rivers: Education in the Slave Quarter Community, 1831–1865. New York: W. W. Norton.
Slavery

Among slaves in the American colonies, boys outnumbered girls. Although only a small number of Africans were brought to the mainland colonies in the early seventeenth century, the Atlantic slave trade increased slightly in the 1680s and greatly after 1700. Planters preferred to buy young male slaves with a lifetime of labor ahead of them. Although 15 to 20 percent of imported slaves were under the age of twelve, slave cargoes considered the most desirable were two-thirds “men-boys” (ages fourteen to eighteen) and one-third “women-girls” (ages thirteen to sixteen). As a result, when the slave trade was at its height, enslaved males in some localities numbered 180 for every 100 females. After being sold to planters at the port of Charleston, South Carolina, or at riverside wharves or county courthouses in Maryland and Virginia, as many as one-fourth of the imported slaves died during the first year, mostly from respiratory diseases. Yet these unhealthy conditions improved, and the skewed sex ratio balanced out by
A slave family on Smith’s Plantation, Beaufort, South Carolina, mid-nineteenth century (Library of Congress)
the 1730s, when a native-born population came of age. By the mid-eighteenth century, the enslaved population was growing through natural increase, an unprecedented event for New World slavery. As population growth continued in the nineteenth century throughout the southern states, an increasingly large proportion of the enslaved were children. By 1860, when the U.S. census counted almost 4 million slaves, 56 percent were under the age of twenty (King 1995, xvii). On plantations in the Tidewater area of the Chesapeake Bay and the Low Coun-
try of the Carolinas, children grew up in relatively stable slave communities. In the Chesapeake area, where tobacco plantations were smaller and men were forced to venture afield to find wives, only about half of the young children lived with both parents. Those whose fathers lived nearby remained with their mothers and other relatives. On large Carolina rice plantations, which required a large labor force, children were more likely to live with both parents. Slave boys received English names from their masters but often African ones from their
parents, who recognized kinship ties by naming sons for absent fathers or their grandfathers or uncles. These ties were reinforced by shared daily life. Families and relatives lived together in “quarters” of double rows of one- or two-room log or clapboard cabins, where boys slept with siblings in lofts, on pallets, or in trundle beds. Sharing a common yard, slaves cooked and ate communally, and children were served from a single skillet or a wooden trough. Chesapeake boys were put to work between the ages of five and ten. Working with kinship groups, they began with simple tasks in which they were instructed by parents or other relatives. Boys began fieldwork by chasing crows, helping stack wheat, or picking worms off tobacco plants. As they grew older, they picked tobacco and cradled wheat. They minded cows or chickens, cared for horses, toted drinking water to the fields, used mud or sticks to mend fences, and guided oxen when fields were plowed. Carolina slaves were given a task to be performed each day, and after it was finished, their time was their own. Boys of twelve or thirteen became three-quarter task hands, which could mean helping to plant and harvest rice or hoeing twelve rows or picking 30 pounds of cotton. In both areas, some boys were designated to be house servants for wealthy white families. As young as the age of five or six, they learned to fetch wood, build fires, carry food from the kitchen, fan flies from the table, and wash the dishes. Other boys were selected to learn skills and became carpenters, brick makers or layers, leatherworkers, or blacksmiths. Some were apprenticed to a white craftsman for three or four years, but most learned from skilled slaves or their fathers. In some families such skills were
proudly passed from father to son. A slave boy with skills earned money for his master, who hired him out to other plantations or to urban artisans and collected his wages. Enslaved and white Chesapeake boys grew up playing together. They wrestled, rode sticks they pretended were horses, and got into mischief. White children played school by teaching slave children their letters, sometimes whipping them when they forgot their lessons. But whites playing with black boys could be deliberately cruel, encouraging them to dangerous physical feats, enticing dogs to bite their toes, and purposely frightening them with actions and tales. In Carolina coastal parishes, where slaves greatly outnumbered whites, anxious parents drew a firmer line between their own children and their slaves. In both areas enslaved boys played by themselves in the quarters or woods, digging for worms, fishing, or hunting for opossum. In the yard they played ring games of their own devising, based on songs they improvised much as their parents and kinfolk did. Interaction with white culture could create psychological conflicts for enslaved boys. When six-year-old Frederick Douglass left his grandmother’s cabin in 1824 and found himself on Colonel Edward Lloyd’s plantation on Maryland’s Eastern Shore, he entered a self-sufficient community embracing thirteen farms and more than 500 slaves. Yet he gravitated toward the Great House, occupied by Colonel Lloyd and his family. Fred was hungry much of the time and associated the Great House with an abundance of food. He learned to finagle bread and butter by singing beneath the window of the daughter of his master (the manager of Lloyd’s farms), who may have been his father. He also sat in on lessons with
Lloyd’s son Daniel, five years his senior, whose Massachusetts tutor struggled to cure the white boy of speaking like a slave. Forming words along with Daniel, Fred learned the power of literacy and cultured English speech. Sent to Baltimore at age eight, he bribed white boys with bread to teach him his letters. By age eleven, he was able to match letters on boards in the shipyard with those in a Webster’s speller he carried around in his pocket, and, in stolen moments, he taught himself to write. Frederick Douglass would flee slavery and become a distinguished speaker and writer for the abolitionist cause. Yet even while he emulated the genteel values of the Great House, he also regarded it with mixed anxiety and deep resentment (Reinier 1996). Although slave owners of the eighteenth century considered small children a nuisance and of little market value, by the early nineteenth century, southeastern planters began to recognize the value of their surplus slaves. As seaboard lands declined in productivity, planters or their sons migrated west, transporting a labor force that often consisted of boys as young as ten to fourteen years of age. When Congress ended the Atlantic slave trade in 1808, the need of the expanding cotton frontier for labor accelerated the domestic trade. Southeastern planters, who recognized the commodity value of enslaved children, sold their surplus slaves to professional traders. These sales of adolescent and even much younger boys caused painful separations from their families. As early as 1789, four-year-old Charles Ball of Maryland was sold away from his mother and siblings. Sixty years later he vividly remembered how his mother walked beside the horse of his master, beseeching him not to sep-
arate her family. In the 1840s eight-year-old Amos Abner Cotton of Virginia was sold to a trader when his master died and the estate was divided. His father, who was a craftsman and had made money by hiring himself out, attempted to buy back his son. But the trader refused, admonishing the anguished father that he “must think himself white.” Separated from his family and sold to Kentucky, Amos was put to work cradling and binding wheat (Reinier 1996, 159–160). Such separations brought disorder and despair to the new labor forces assembled in the Old Southwest. One-third of slaves brought to some Louisiana cotton parishes were “solitaires,” mostly boys separated from family members or kin. Two-thirds of solitaires engaged in heavy labor on a sugar plantation were male. On these new sugar or cotton plantations, the rapidly assembled labor force was young; almost half the individuals could be under the age of seventeen. Families and communities similar to those in the southeastern states did not take shape until the 1840s and 1850s, when kinship groups worked the fields together. Boys as young as age five learned to work with the hoe and went to the fields with their parents during the cotton harvest. If they failed to keep up, they could feel the whip administered by overseer or driver. These boys also were profoundly aware of punishments of other slaves they heard about or witnessed. William Colbert recalled that when his older brother received a whipping for visiting a girl on another plantation, he sat on his parents’ steps and cried. When William Moore saw his mother whipped, he ran around in circles and threw a rock at his master. Yet, throughout the South, enslaved boys themselves were more likely to be punished by a cruel or exasperated planter’s wife, who shook them,
pummeled their bare backs, or whipped them with a wooden paddle or a leather strap (Reinier 1996, 175). After the Civil War, freed parents searched for the children from whom they had been separated. Many individuals walked long distances, dictated newspaper advertisements, or sought help from the only federally funded agency in the South, the Freedmen’s Bureau. These families located some but not all of their scattered offspring. Orphaned boys stayed with other kinship groups until they could provide for themselves. Yet dangers remained for the newly freed boys as former slave owners sought to bind them in various forms of apprenticeship. Even after their emancipation, these formerly enslaved boys continued to be valued highly for their labor.

Jacqueline S. Reinier

See also African American Boys; Civil War; Douglass, Frederick; Plantations; Slave Trade; Washington, Booker T., and W. E. B. Du Bois

References and further reading
Federal Writers Project, Interviews with Former Slaves. 1930s. Chapel Hill: Southern Historical Collection, University of North Carolina.
Joyner, Charles. 1984. Down by the Riverside: A South Carolina Slave Community. Urbana: University of Illinois Press.
King, Wilma. 1995. Stolen Childhood: Slave Youth in Nineteenth-Century America. Bloomington: Indiana University Press.
Kulikoff, Allan. 1986. Tobacco and Slaves: The Development of Southern Cultures in the Chesapeake, 1680–1800. Chapel Hill: University of North Carolina Press.
Malone, Ann Patton. 1992. Sweet Chariot: Slave Family and Household Structure in Nineteenth-Century Louisiana. Chapel Hill: University of North Carolina Press.
McFeely, William S. 1991. Frederick Douglass. New York: Simon and Schuster.
Morgan, Phillip D. 1998. Slave Counterpoint: Black Culture in the Eighteenth-Century Chesapeake and Low Country. Chapel Hill: University of North Carolina Press.
Reinier, Jacqueline S. 1996. From Virtue to Character: American Childhood, 1775–1850. New York: Twayne Publishers.
Smoking and Drinking

Tobacco and alcohol are considered “gateway” drugs whose use by adolescent boys and children increases the likelihood of experimentation with illegal drugs. Tobacco use and alcohol consumption usually coincide with a number of other “problem behaviors” that account for more than 75 percent of deaths among adolescent males (CDC 1999). For these reasons, health professionals and educators are increasingly concerned about substance use among children and adolescents. Tobacco, which contains the drug nicotine, is the chief ingredient in cigarettes, cigars, pipe tobacco, chewing tobacco (chew), and snuff. Nicotine is a toxic, addictive stimulant that produces a sense of alertness and well-being in smokers. It increases adrenaline production, quickens the heart rate, and depresses the appetite. Tars found in tobacco are carcinogenic, contributing to cancer of the lips, mouth, throat, lungs, and other organs. Heart disease among smokers results from other chemicals (including carbon monoxide) in tobacco. In addition, the Environmental Protection Agency (EPA) estimates that 3,000 lung cancer deaths per year result solely from secondhand smoke. Secondhand smoke has also been linked to cardiovascular disease, asthma,
and pneumonia. Tobacco use is the chief preventable cause of death in the United States. Although the word cigarette is a feminine diminutive of cigar, cigarette smoking in America has historically been a male activity associated with poverty and low social class. By the late 1800s, immigrants from countries where smoking was common among the lower classes contributed to the increased sale of cigarettes in the United States. As early as 1879, the New York Times was warning about health concerns in stories with headlines like “Cigarettes Killed Him” and “Cigarette Fiend Dies.” The public concurred: cigarettes were referred to as coffin nails, dope sticks, and the devil’s toothpicks. Early promotions helped men and boys associate cigarettes with sex. Advertisements quoted doctors who talked of the “secret sexual practices” of smokers. Cigarette packs often came with picture cards of “actresses and beauties” bundled inside. Prior to World War I, however, women who smoked were thought to be chorus girls, actresses, prostitutes, or girls of poor reputation. During the war, millions of American men smoked cigarettes provided for them by the government and civic groups. Obviously, not everyone agreed that cigarettes were health risks. Although popular evangelist the Reverend William Sunday told men that “there is nothing manly about smoking cigarettes. For God’s sake, if you must smoke, get a pipe,” smoking continued to be a male privilege after the war (Koven 1996, 5). Women began smoking in great numbers in the 1920s, specifically because tobacco was one of the markers that differentiated between the roles and conduct of the sexes. In a now-infamous public relations
campaign called the “torches of freedom,” debutantes paraded in the streets of New York City carrying cigarettes to symbolize female empowerment. Similar patterns emerged at the end of the twentieth century among youth, with boys first adopting the habit to show their masculinity, only to be overshadowed by girls smoking to show their equality to boys. Virginia Slims ad campaigns capture the sentiment well (“You’ve come a long way, baby”). One of the fastest-growing tobacco markets today is women in college. According to the Centers for Disease Control and Prevention (CDC), nearly one-quarter of all male deaths can be attributed to two causes: cardiovascular disease and cancer. The CDC offers four alarming statistics from a 1999 nationwide survey of high school students: 76.1 percent of high school students eat less than the recommended daily servings of fruits and vegetables, 70.9 percent do not attend physical education classes daily, 34.8 percent currently smoke cigarettes, and 16 percent are overweight or at risk for obesity. All of these behaviors have been found to contribute to the risk of dying from cardiovascular disease and cancer. Clearly, habits initiated in the adolescent years contribute to adult mortality. Unfortunately, convincing a teenage boy to consider long-term health effects is a hard sell. Cigarette smoking seems to be as typical among school-age boys as textbooks and recess. In the eleven states with the highest percentage of youthful smokers (Alabama, Arkansas, Kentucky, Louisiana, New Mexico, North Dakota, Ohio, South Carolina, South Dakota, Tennessee, and West Virginia), more than 75 percent of boys have tried smoking cigarettes. Only one state (Utah) falls under 50 percent.
Not surprisingly, Utah is one of the few states to consistently request identification of young people seeking to purchase cigarettes in stores and gas stations (CDC 1999). Although ethnicity does not seem to affect experimentation with cigarettes (more than 80 percent of students have tried smoking by the end of high school), there are differences in habitual smoking and frequency of smoking. Euro-American boys are the most likely to smoke daily (29.3 percent), and they smoke more frequently (ten or more cigarettes a day). Euro-American boys are followed closely by Latin American boys, 21.1 percent of whom smoke daily. African American boys are the least likely to have a daily smoking habit (14.6 percent) and the least likely group to smoke ten or more cigarettes per day. The CDC does not report smoking data for other ethnic or racial groups. All these figures rise when smokeless tobacco (chew), pipe tobacco, and cigars are added. The increase in cigar smoking in adult males is paralleled in the adolescent population. Among high school–age Euro-American boys, 28.3 percent smoke cigars, and 18.8 percent use smokeless tobacco. When combined with cigarette use, approximately 50 percent of Euro-American boys use some form of tobacco. The figure drops to 37.8 percent for Latin American boys and 28.6 percent for African American boys (CDC 1999). Experimentation with cigarettes, habitual smoking, and frequency of smoking all increase with age. By the twelfth grade, more than 80 percent of boys have tried smoking, more than 30 percent smoke on a daily basis, and more than 10 percent smoke more than ten cigarettes per day. Although over 60 percent have tried smoking by the time they enter high
school, less than 20 percent had daily habits, and less than 5 percent were heavy smokers (CDC 1999). Although smoking cigarettes has actually decreased slightly among high school students since 1990, recent surveys indicate that the average age boys begin experimenting with smoking is twelve, with some starting much earlier (Chapin 2000). As for alcohol, even Puritan forefathers drank. “The good creature of God,” or whiskey to the Puritans, was used in the colonies as a universal medication. Alcohol was prescribed for colds, fevers, frosted toes, and snakebites. Parents gave it to children to bolster their good health and to cure their illnesses. People drank more alcohol in colonial days than they do today: three to seven times more per person per year. In the mid-1800s, temperance advocates who were concerned about the health and moral risks of alcohol use began a campaign to ban it. In 1919 they succeeded when the Eighteenth Amendment, which banned the production, sale, and transportation of alcohol in the United States, was added to the Constitution. Once alcohol was illegal, it became part of the male domain. The association of alcohol with “undesirables,” including poor Americans, African Americans, and immigrants, made drinking and drunkenness an ideal way for men to rebel against “the system.” In addition to its association with rebellion, alcohol consumption has historically been (and continues to be) a form of male solidarity. Men’s and boys’ drinking in groups creates a sense of identity and group membership. It has been a social occasion, a purpose for socializing, and a rite of passage all in one. Alcohol is a drug that acts as a sedative and an anesthetic, depressing mental and motor functions. The active chemical ingredient
in beer, wine, and other alcoholic beverages is ethyl alcohol, a potentially addictive depressant of the central nervous system and the most widely used recreational drug in the world. Vital functions such as pulse rate, respiration, and blood pressure are all affected by alcohol use. Permanent damage to the brain, heart, liver, stomach, and reproductive organs (including male impotence) may result from alcohol abuse. In contrast to these long-term effects, auto accidents are a frequent short-term risk of alcohol abuse and one of the main causes of death in men under the age of twenty-five. Alcohol abuse is also a major contributor to violence, sexual abuse, criminal behavior, and family dysfunction. In the 1970s and 1980s, the formation of groups like Mothers Against Drunk Driving (MADD) and Students Against Drunk Driving (SADD) demonstrated increased public recognition of the dangers of alcohol. In 1984, President Ronald Reagan signed into law a uniform minimum drinking age of twenty-one. Stiffer penalties for drinking and driving have also become law. Despite increased public awareness of the issue, alcohol use has increased since 1990 and is more prevalent among boys than girls. Like tobacco use, alcohol use is reported as early as elementary school years, with the average boy first experimenting with alcohol at the age of twelve (CDC 1999). According to the CDC, more than 80 percent of Euro-American and Latin American boys currently in high school have tried alcohol. Consistent with the patterns of tobacco use, African American boys (73.8 percent) are less likely than their white and Latino peers to have tried alcohol, but the overall percentage is high. Latin American boys (56.3 percent) are more likely than Euro-American
can boys (54.9 percent) and African American boys (39.1 percent) to drink on a regular basis; however, Euro-Americans (39.1 percent) are more likely than Latin Americans (37.5 percent) and African Americans (17.4 percent) to binge drink (CDC 1999). Binge drinking is usually defined as drinking to the point of intoxication, frequently equated with five alcoholic beverages in one sitting for men (and three for women). Experimentation with alcohol, regular drinking, and binge drinking all increase with age. By the twelfth grade, nearly 90 percent of boys have tried alcohol, more than 60 percent drink on a monthly basis, and nearly 50 percent binge drink. Although more than 70 percent have tried alcohol by the time they enter high school, fewer than 25 percent are binge drinkers at that age (CDC 1999). The health consequences of tobacco and alcohol use and abuse and their contribution to adolescent mortality are just part of the current problem. Alcohol and tobacco are commonly referred to as "gateway drugs" because their use frequently predicts experimentation with illegal drugs and later substance abuse. In a recent survey of middle school students, 30.9 percent of boys smoked, 54.5 percent drank, and 9.1 percent had already experimented with illegal drugs. All of the drug users reported also smoking and drinking, and none of the students who had never smoked cigarettes or consumed alcohol had experimented with illegal drugs (Chapin 2000). In addition to serving as gateways to more serious substance abuse, alcohol and tobacco use coincides with other risky behaviors among young boys. Researchers in the 1970s first identified the trend, calling it "problem behaviors." Boys who smoke and drink also tend to take a variety of other risks like driving too fast,
doing stunts, engaging in early sexual activity as well as risky sexual activity, carrying weapons, and participating in delinquent behaviors. A review of decades of research suggests the following:
1. Early initiation into smoking and drinking predicts heavier involvement in them later and more negative consequences.
2. Doing poorly in school or expecting to do poorly predicts substance abuse.
3. Impulsive behavior, truancy, and antisocial behavior are related to all other problem behaviors.
4. Low resistance to peer influences and having friends who engage in substance abuse are common to young boys who smoke and drink.
5. Substance abuse behaviors are associated with insufficient bonding to parents, inadequate supervision and communication from parents, and parents who are either too authoritarian or too permissive.
6. Living in a poverty area or a high-density urban area is linked with smoking and drinking as well as other problem behaviors.
7. Rare church attendance is associated with most problem behaviors. (Hamburg 1992, 196)
Smoking and drinking behaviors often begin in the adolescent years for a number of reasons. The first is the nature of adolescence itself. Some of the most complex transitions of life occur during adolescence, as the body changes from a child's to an adult's, relationships with others take on new meanings and levels of intricacy, and individuals become increasingly independent. It is a time of expanding horizons and self-discovery. The
effects can be observed in children, but the processes begin to show serious consequences in adolescence that will affect individuals for the remainder of their adult lives. A review of a decade of articles published in Adolescence revealed that the three most prevalent research issues (representing nearly half of all articles published) are problem behaviors (including smoking and drinking), sexuality, and values. One of the major conclusions of these studies was that adolescent boys were ill-equipped to face the increasing number of opportunities for substance use and abuse because they lacked sufficient decision-making skills and sources of information. In general, boys' attitudes and experiences seem to be changing faster than their knowledge and coping skills. A second explanation for why boys smoke and drink is a developmental one. All developing boys are saddled with "life tasks," which must be accomplished within developmental spans. Normative developmental stresses and responses to them play a major role in adolescence. According to John Hill (1983), there are five life tasks that take on special importance during adolescence: identity, or discovering and understanding the self as an individual; intimacy, or forming close relationships with others; autonomy, or establishing a healthy sense of independence; sexuality, or coming to terms with puberty and expressing sexual feelings; and achievement, or becoming a successful and competent member of society. Although these tasks are not unique to adolescents, the massive biological, psychological, and social changes occurring in adolescence cause these tasks to take on special significance.
Because a major task of adolescence is autonomy and parental controls tend to fall away rapidly during the period, it is not surprising that teens search out other sources of information. Peers and the mass media provide attractive alternative sources of information. Despite the amount of funding expended on antismoking and antidrinking campaigns targeted at adolescent boys, health campaigns often fail because their designers do not understand that adolescent risk behaviors are functional, purposive, instrumental, and goal-directed and that the goals involved are often central in normal adolescent development (establishing autonomy, gaining peer acceptance, making the transition to adulthood, etc.). The mass media offer an array of attractive celebrity role models for adolescents, and emulation of these role models is common. The tobacco and alcohol industries are quick to take advantage of this phenomenon. Consider the following example: shortly after Brad Pitt was named the "sexiest man alive" in People magazine, his character in the upcoming film Seven became a smoker as part of a million-dollar deal with the production company. Paid product placement in films is a common technique for advertisers to connect with target audiences. The number of films targeted to adolescent boys makes movies ideal marketing tools. Images in the mass media may suggest to boys that cigarettes and alcohol are "cool" and "sexy," but the day-to-day influences of peers also greatly contribute to experimentation and habit formation. Smoking and drinking both play a role in identity formation among peer groups. Peers who smoke and drink create perceived norms that encourage similar behaviors. A third explanation of boys' use of tobacco and alcohol is rooted in biology. Pubertal development has been associated
with changes in family interactions, parental feelings, peer relationships, patterns of intimacy, heterosexual interests and behavior, and educational achievements. Psychological adjustment to such sweeping changes affects how boys view themselves as men and as individuals, how and when they begin dating, how well they adjust to school, and how they perceive the family environment. Discussion of boys' experiences of spermarche (first ejaculation) is generally a family taboo, sending early adolescents searching for information elsewhere. Boys who develop adult male physiques early receive instant admiration from peers; boys who develop later are subject to teasing (often starting in the locker room), which can lead to a sense of shame. The timing of biological maturation affects cognitive scope, self-perceptions, perceptions of the social environment, and personal values. These four variables are hypothesized to predict adolescent risk-taking behavior. Initial results from ongoing research suggest that compared with less mature boys, physiologically mature boys (defined by puberty) perceived less risk associated with drinking and driving, engaging in sex, smoking cigarettes, drinking beer and wine or hard liquor, and using drugs. Testosterone levels in boys are directly related to thrill seeking or "sensation seeking." Boys with higher than average testosterone levels become sensation seekers; they exhibit a greater number of reckless or problem adolescent behaviors, such as unsafe sex, alcohol consumption, cigarette smoking, and drug use. In order to suppress such risky behaviors, high-sensation-seeking males require viable alternatives like amusement parks, skydiving, and extreme sports. A fourth way of explaining why boys use tobacco and alcohol is cognitive.
Boys believe they are less likely than peers to be in alcohol-related accidents, become addicted to alcohol, or develop lung cancer later in life. This "optimistic bias" has been confirmed in hundreds of published studies. Adolescents in general believe they are invulnerable to harm. Although cardiovascular disease and cancer figure prominently in deaths among adults, young boys fail to perceive the link between their substance use habits and negative health outcomes. Because people act on their perceptions (rather than reality), understanding how boys think about smoking and drinking is a vital first step in efforts to reduce or extinguish the behaviors. The fifth and last explanation of boys' smoking and drinking habits is a social one. At no time is the tension between family members greater than when children first enter adolescence. This is when the focus of young people's lives shifts out of the family. Despite this, adolescents spend roughly 40 percent of their time at home with family members. Three of the major tasks of adolescence, identity clarification (Who am I?), sexuality (Am I attractive to girls? How do I approach girls? Is it wrong to masturbate? Could I be gay?), and separation (How can I be a man if my parents are still telling me what to do? Do my parents still love me?), are highly influenced by the ways families construe and respond to them. Boys' experimentation with their appearance and behaviors can be disturbing to parents. However, family interactions emphasizing warmth, acceptance, and understanding tend to support higher levels of ego development and identity clarification in adolescents. The environment provided by the family for these adolescent developmental tasks is vital. For instance, substance
abuse often begins in adolescence; it is tied to a normal process of experimenting with new behaviors, becoming assertive, developing relationships outside the family, and leaving home. Substance abuse is often contingent on the quality of parent-adolescent relationships. As the influence of families diminishes, boys turn to peers and the media. Not only a boy's family environment but also his relationship with his peers can influence whether or not he smokes and drinks. Forming and maintaining peer relationships are central to adolescent development. Adolescents who are accepted by peers have been shown to be high in self-esteem, social skills, and academic success. Those lacking in such relationships show poor psychological, social, and academic adjustment. Perceived peer norms are often the best predictor of adolescent substance abuse: if a young boy believes his friends think he should drink, he will likely drink. Considering the multiple contributors to adolescent substance use, no single solution is sufficient. For instance, developmental and biological explanations suggest that substance use meets bona fide needs of adolescent boys, and thus the best solutions are those that offer healthy alternatives to meet developmental needs and sensation seeking. Cognitive explanations, on the other hand, suggest educational programs that emphasize personal risks associated with such behaviors. Programs emphasizing the long-term effects of substance use have been largely unsuccessful, but those emphasizing the immediate physical and social effects have been more successful. The most promising results have occurred in prevention programs. Once youth begin using tobacco and alcohol, helping them quit becomes increasingly difficult. The
earlier boys begin using alcohol and tobacco, the less likely they are ever to quit the habits. Given the increasingly early ages of experimentation with substance use, efforts by parents, schools, and health professionals must begin in preschool and elementary school.
John Chapin
See also Disease and Death; Illegal Substances
References and further reading
Babbit, Nicki. 2000. Adolescent Drug and Alcohol Abuse: How to Spot It, Stop It, and Get Help for Your Family. Sebastopol, CA: O'Reilly.
Behr, Edward. 1996. Prohibition. New York: Arcade.
Burnham, John. 1993. Bad Habits: Drinking, Smoking, Taking Drugs, Gambling, Sexual Misbehavior, and Swearing in American History. New York: New York University Press.
CDC (Centers for Disease Control and Prevention). 1999. "Division of Adolescent and School Health's Information Service Report." Silver Spring, MD: Government Printing Office.
Chapin, John. 2000. "Third-Person Perception and Optimistic Bias among Urban-Minority 'At-Risk' Youth." Communication Research 27, no. 1: 51–81.
Feldman, Shirley, and Glen Elliot, eds. 1990. At the Threshold: The Developing Adolescent. Cambridge, MA: Harvard University Press.
Gall, Timothy, and Daniel Lucas, eds. 1996. Statistics on Alcohol, Drug and Tobacco Use. Detroit: Thompson.
Hamburg, David. 1992. Today's Children: Creating a Future for a Generation in Crisis. New York: Times Books, Random House.
Hill, John. 1983. "Early Adolescence: A Research Agenda." Journal of Early Adolescence 3: 1–21.
Jessor, Richard, and Shirley Jessor. 1977. Problem Behavior and Psychosocial Development: A Longitudinal Study of Youth. New York: Cambridge University Press.
Klier, Barbara, Jacquelyn Quiram, and Mark Siegel, eds. 1999. Alcohol and Tobacco: America's Drugs of Choice. Wylie, TX: Information Plus.
Koven, Edward. 1996. Smoking: The Story behind the Haze. Commack, NY: Nova Science.
Lock, Stephen, and Lois Reynolds, eds. 1998. Ashes to Ashes: The History of Smoking and Health. Atlanta, GA: Rodopi.
Mooney, Cynthia, ed. 1999. Drugs, Alcohol and Tobacco: Macmillan Health Encyclopedia. New York: Macmillan Library Reference.
Pacula, Rosalie. 1998. Adolescent Alcohol and Marijuana Consumption: Is There Really a Gateway Effect? Cambridge, MA: National Bureau of Economic Research.
Segerstrom, Suzanne, William McCarthy, and Nicholas Caskey. 1993. "Optimistic Bias among Cigarette Smokers." Journal of Applied Social Psychology 23: 1606–1618.
Siegel, Mark, Alison Landes, and Nancy Jacobs. 1995. Illegal Drugs and Alcohol: America's Anguish. Wylie, TX: Information Plus.
Stefanko, Michael. 1984. "Trends in Adolescent Research: A Review of Articles Published in Adolescence." Adolescence 19, no. 73: 1–13.
Strasburger, Victor, and Don Greydanus, eds. 1990. Adolescent Medicine: The At-Risk Adolescent. Philadelphia: Hanley and Belfus.
Tate, Cassandra. 1999. Cigarette Wars. New York: Oxford University Press.
Torr, James, ed. 2000. Alcoholism. San Diego, CA: Greenhaven Press.
Weinstein, Neil. 1987. Taking Care: Understanding and Encouraging Self-Protective Behavior. New York: Cambridge University Press.
Winters, Paul, ed. 1997. Teen Addiction. San Diego, CA: Greenhaven Press.
Sports, Colonial Era to 1920
During the colonial period, just as Native American boys prepared themselves to be physically tough as adults by practicing archery, playing lacrosse, and running foot races, Anglo boys engaged in sports
Skating outdoors. Lithograph, ca. 1885. (Library of Congress)
with utilitarian value, such as swimming, boating, hunting, fishing, and horseback riding. Their leisure lives were restricted by religious views and Sabbatarian issues about use of time for prayer: both Puritans in New England and Quakers in Pennsylvania frowned upon idleness, frivolity, and gambling. Catholic settlers in Maryland and elsewhere held a more relaxed view of acceptable activities based on their established European recreational patterns, which included the sports, games, and dancing that accompanied communal festivals and the celebration of saints’ days. Although banned by the Puritan authorities in the Massachusetts colony in the early 1600s, the maypole dance on May Day attracted youth. Males throughout
the colonies participated in games of skill and chance and at times joined in physical contests, particularly on established muster days when local militias gathered for drill. As Europeans moved westward, taverns served as sites for food, drink, lodging, and entertainment in these frontier areas. There men and boys engaged in contests of physical skills and strength to measure their prowess. They bowled, wrestled, shot at targets, ran footraces, and played billiards, often for stakes or prizes. The prize for gander pulling was the goose itself, which hung suspended from a tree as riders attempted to pull its greased head and neck from its body. Gander pulling proved especially popular in the Dutch colony of New York, as did
a game called "kolven," which resembled golf or hockey. Throughout the northern states, boys enjoyed winter activities like ice skating and sleighing whenever the weather permitted. Young boys played with toys and marbles and enjoyed chase games such as tag. As they grew older, bowling games like skittles, quoits, or ninepins became popular, as did the various ball and bat games known as "rounders," "town ball," and "stool ball" that preceded organized baseball clubs. By the 1760s the colonists played cricket and racquet sports, and by 1802 a specialized site for such practice, the Allen Street Courts, had been established in New York. The Racquet Club of New York included minors among its membership in 1845. In the southern colonies, white boys enjoyed similar forms of leisure. Slaves, however, were expected to participate in boxing matches for the entertainment of their masters, and black jockeys raced horses for the gentry who bet on the horse races. Southern black boxers Tom Molyneux and Bill Richmond proved so skillful that they traveled to England to compete in the boxing championship, winning their freedom in the process. Along the southern frontier, the largely Scotch-Irish males practiced a particularly brutal form of "rough and tumble" wrestling that eschewed rules. Eye gouging, biting, scratching, hair pulling, and even emasculation determined one's hardiness, resolve, and social status among peers, who cheered their favorite fighter to punish his opponent. In the early 1800s, boys in other rural areas competed in contests at agricultural fairs by demonstrating their physical skills in farming. The first agricultural fair was held in 1810 in Pittsfield, Massachusetts, and other agricultural fairs soon followed.
Thanks to the antebellum urban health reform movement, sporting practices became more acceptable forms of leisure. Known as “muscular Christianity,” the health reform campaign for males gained ascendance at midcentury. This program, spearheaded by white, middle-class reformers, stressed physical vigor, moral courage, and religion and provided advice to male youth on the healthfulness of sports as an antidote to urban temptations and sedentary ways. Reformers like Thomas Higginson believed that sports for boys promoted physical strength and training for their roles as men. More organized forms of sport began to appear in urban areas as men formed athletic clubs during the antebellum period, and younger boys joined as junior members or formed their own contingents in imitation. One such club, the Knickerbockers of New York City, codified the rules of baseball in 1845, and its version, which differed from those of clubs in Massachusetts and elsewhere, eventually won widespread acceptance. The Potomac Boat Club, founded in 1859 along the Potomac River in the District of Columbia, helped to spread the enthusiasm for competitive rowing in the mid- and late nineteenth century. At the time, rowing enjoyed widespread popularity and drew huge crowds of spectators. Professional rowers stimulated young males and amateurs to form rowing clubs in locales with good waterways. The Potomac Boat Club competed against other rowing clubs and hosted athletic and social events for the middle- and upper-rank men and women coming to the club; the clubhouse hosted other sports, too, such as swimming and canoe races. Like baseball, cricket enjoyed a great measure of popularity. It is estimated that
10,000 men and boys in 500 urban cricket clubs played the game throughout the United States by 1860. Baseball games between rival clubs often drew crowds of spectators and became commercialized events held in special enclosed sites. Professionalization soon followed, as nineteen-year-old James Creighton became the first acknowledged baseball player to accept payment for his athletic abilities. A fully professional baseball team, the Cincinnati Red Stockings, appeared in 1869. A year later, Chicago, a midwestern commercial rival, fielded its professional baseball team, the White Stockings (now known as the Chicago Cubs). In 1852 boys participated in the first intercollegiate sport. That year a railroad company offered a vacation and prizes to the crews at Yale and Harvard Universities for a competition on Lake Winnipesaukee, the location of the railroad's New Hampshire resort. Both the regatta and its accompanying festivities drew public attention and paying customers to the sponsor. Ethnic athletic clubs also organized leisure activities to promote their own language, values, and cultural traditions among youth. Some of these became commercialized spectacles that also drew the attention of American-born athletes and spectators. By the 1850s the Scots' Caledonian Club garnered 20,000 spectators at its Highland Games, which featured running, jumping, and weight-throwing contests that were eventually incorporated as modern track and field events. Other ethnic clubs, such as the German Turners, Polish Falcons, and Czech Sokols, offered a broad range of activities to members, including study of ethnic literature, dance, gymnastics, soccer, bowling, riflery, and singing, all to encourage the retention of European practices. Jewish settlement houses and Young Men's
Hebrew Associations, more intent on assimilation into the American mainstream culture, sponsored baseball, football, basketball, track, wrestling, and boxing teams for adolescents, teenagers, and young men within Jewish communities. Boxing and wrestling in particular helped Jews dispel the stereotypical notion that they were cerebral, physically weak, and feeble. An 1866 book entitled Athletic Sports for Boys listed fishing, boxing, sailing, rowing, fencing, gymnastics, horsemanship, skating, swimming, and the driving of horse-drawn coaches as popular pastimes. As boys became teenagers and began courting girls, other social sports gained prominence. In the generation that followed the Civil War, croquet, roller skating, archery, tennis, and bicycling allowed young couples to engage in public leisure practices that enabled them to eschew the previously required chaperone. During the late nineteenth century, baseball superseded cricket in popularity, and boys formed sandlot teams to challenge their urban rivals. Teams divided along neighborhood, ethnic, and racial lines, but integrated contests also occurred in northern cities. At the high school level, students organized into competitive leagues in imitation of the professional associations. Country clubs offered more elite sports such as golf and tennis for white youth during the 1880s–1890s. Prominent social and business leaders in cities like Boston, Philadelphia, and the suburbs of New York founded country clubs as places for their families and elite friends to engage in athletic and social activities. These country clubs provided an outdoor, rural setting for sports separate from the urban lower class and immigrants flooding into many cities in the United States in the 1880s. In the Ger-
mantown Cricket Club in Philadelphia was founded in 1890–1891, and the club played a large role in the success of cricket. In addition, the Germantown Cricket Club promoted tennis for its members by building tennis courts for their use. By the 1920s, tennis surpassed cricket as the major sport for all members, including boys. In fact, the famous American tennis champion, Big Bill Tilden, was a member of the Germantown Cricket Club and learned to play tennis on the courts there. The Seabright Lawn Tennis and Cricket Club in Rumson, New Jersey, featured a cricket crease and three tennis courts when it was built in 1886. Seabright's members included competitive tennis champions. These country clubs provided the athletic facilities and training for upper-rank boys to participate in tennis. Soccer, which was played in Boston high schools during the 1860s, evolved into American football on college campuses by the 1880s. Boys initiated their own games, organized leagues, and established rules. The game experienced explosive growth in the schools until the turn of the twentieth century, but as in baseball, that rapid growth was accompanied by lack of regulation, abuses, and public concern over the use of ineligible players and the wholesome uses of boys' leisure time. At the collegiate level, gambling and the emphasis on winning fostered professionalism, cheating, and the use of nonstudents in the quest for victories and bragging rights. Throughout the Progressive era (1890–1920), civic reformers and faculty members in the schools moved to exert greater adult control over boys' sports. The largest interscholastic leagues, such as the Public Schools Athletic League in New York City and the Cook County Athletic League in Chicago, came
Boys playing baseball, ca. 1900 (Library of Congress)
under adult supervision because of widespread allegations of cheating by boys. The Young Men’s Christian Association (YMCA), founded in the United Kingdom in 1844 and brought to the United States in 1851, offered sports and games to attract adherents to its cause, and civic groups established sandlots and playgrounds as safe havens for younger children, who had previously played in the streets. Urban settlement houses sponsored boys’ athletic teams as well as a host of activities designed to assimilate the children of southern and eastern European immigrants who arrived in droves after 1880. In the schools, faculty and administrators assumed control of athletic teams, and trained physical educators
served as coaches to provide guidance and instruction. In public parks, supervisors and playground attendants promoted Americanization through competition, teamwork, self-sacrifice, discipline, and sportsmanship. The Playground Association of America, founded by such urban reformers as Luther Gulick, Jane Addams, and Joseph Lee, sought to provide healthful boys' sports under the direction of adults in cities like Chicago, Boston, and New York. By the turn of the century, the playground movement assumed national proportions. With the invention of basketball and volleyball by YMCA physical educators in the 1890s and the construction of field houses in urban parks after the turn of the century, adults maintained a year-round vigilance over boys' activities. The new indoor games allowed for winter pastimes conducted in publicly supervised facilities. Still, some youth escaped such watchful diligence. Middle-class and wealthier youth took up tennis and golf in country clubs outside city boundaries. Some working-class youths found employment as pin spotters in the urban bowling alleys often attached to saloons, and others formed town or neighborhood teams independent of the schools and parks. Some of the latter evolved into semipro and even professional units that engendered community pride and created social mobility for some young men who played baseball and football. Baseball and boxing proved particularly attractive to working-class youth as a means to ready advancement both in their neighborhoods and in the American economy. Urban life and ethnic rivalries necessitated the toughness required to fight, and both sports promoted the aggressiveness of the American capitalist system and the physical prowess esteemed by
both the working class and the larger culture. Both sports produced a succession of ethnic heroes, such as John L. Sullivan and Mike "King" Kelly, Irish Americans, who symbolized a measure of acceptance and assimilation into American society for the Irish. By 1920 competitive sports and games exercised a powerful influence over American boys of diverse racial, ethnic, and religious backgrounds.
Gerald R. Gems
Linda J. Borish
See also Baseball; Boxing; Muscular Christianity; Tennis
References and further reading
Adelman, Melvin L. 1986. A Sporting Time: New York City and the Rise of Modern Athletics, 1820–1870. Urbana: University of Illinois Press.
Athletic Sports for Boys: A Repository of Graceful Recreations for Youth. 1866. New York: Dick and Fitzgerald.
Borish, Linda J. 1987. "The Robust Woman and the Muscular Christian: Catharine Beecher, Thomas Higginson and Their Vision of American Society, Health, and Physical Activities." International Journal of the History of Sport: 139–154.
———. Forthcoming. Landmarks of American Sports. American Landmarks Series. Edited by James O. Horton. New York: Oxford University Press.
Gems, Gerald. 1997. Windy City Wars: Labor, Leisure, and Sport in the Making of Chicago. Lanham, MD: Scarecrow Press.
———, ed. 1995. Sports in North America: A Documentary History. Vol. 5, Sports Organized, 1880–1900. Gulf Breeze, FL: Academic International Press.
Green, Harvey. 1988. Fit for America: Health, Fitness, Sport, and American Society. Baltimore: Johns Hopkins University Press.
Hardy, Stephen. 1982. How Boston Played: Sport, Recreation, and Community, 1865–1915. Boston: Northeastern University Press.
Kirsch, George B., ed. 1992. Sports in North America: A Documentary History. Vol. 3, The Rise of Modern
Sports, 1840–1860. Gulf Breeze, FL: Academic International Press.
Rader, Benjamin G. 1983. American Sports: From the Age of Folk Games to the Age of Spectators. Englewood Cliffs, NJ: Prentice-Hall.
Riess, Steven A. 1989. City Games: The Evolution of American Urban Society and the Rise of Sports. Urbana: University of Illinois Press.
Rotundo, Anthony. 1994. American Manhood: Transformations in Masculinity from the Revolution to the Modern Era. New York: Basic Books.
Smith, Ronald A. 1990. Sports and Freedom: The Rise of Big-Time College Athletics. New York: Oxford University Press.
Sports, 1921 to the Present
By the end of the nineteenth century organized sports for boys, such as baseball, football, and basketball leagues sponsored by ethnic clubs, schools, settlement houses, and religious and civic agencies, existed throughout the United States. Such groups expanded their efforts and influences throughout the twentieth century as adults increasingly managed and controlled boys' play in an effort to curb perceived abuses and prevent juvenile delinquency, especially of lower-class urban male youth. By 1920 many large cities had joined the Playground Association of America in providing play spaces for urban children. Small children enjoyed neighborhood sandlots, and older ones frequented playgrounds, which were often located adjacent to the public schools. In such spaces male and female supervisors sought to train children to play in a particular manner; they wanted to teach American values of discipline, teamwork, and loyalty through supervised sporting activities. Although playgrounds had slides, swings, and gymnastic apparatus for free play, playground supervisors emphasized the
American team sports of baseball, football, and basketball. Athletic supervisors expected boys to become better citizens by learning teamwork, self-sacrifice, a strong work ethic, and deference to authority. Track and field contests and fitness tests also instilled competitiveness, the basis for the capitalist economic system. Children received prizes and badges for goals reached as they vied for neighborhood and city championships. Park districts and playgrounds often served as battlegrounds when competing ethnic or racial groups resided in neighborhoods close to the park and play areas. Progressive reformers encouraged boys of different races and ethnic groups to play sports together, and boys played relatively harmoniously until African Americans moved to northern cities in large numbers during World War I. Subsequently, officials segregated play areas, and violations of segregated spaces could lead to race riots. In 1919 such a confrontation occurred in Chicago when a black youth wandered across the line marking a racially divided beach. White youths stoned and killed him, an incident that resulted in a citywide race riot that cost more than thirty lives. Similar though less violent confrontations took place between members of other ethnic groups, particularly traditional European rivals such as Poles and Jews. In the larger park districts, teams, often composed of rival ethnic factions, vied for supremacy and bragging rights. By the 1920s boys had formed social-athletic clubs and basement clubs, where they gathered to play cards or other games, entertain friends, and sometimes engage in illicit activities. Dances and other fundraising activities provided these clubs with operating expenses or capital to gamble on athletic contests. Larger groups
High school football game, Georgia, 1941 (Library of Congress)
often enjoyed the financial support of local politicians, who provided clubhouse space, team uniforms, and sports equipment in return for political support and muscle during elections. Business owners, too, often supplied similar amenities for advertising and community pride. Not only politicians and businesspeople but also newspapers used the public sporting spaces to promote their own entrepreneurial ventures. During the winter, newspapers promoted ice-skating races, known as the “Silver Skates,” that produced local and urban champions. In the park field houses and in urban gyms, boxers trained for the Golden Gloves fights, sponsored by the New York Daily News and the Chicago Tribune. The pugilistic competition pitted champions from the urban rivals against one another as early as 1923. By the following decade, the newspapers organized in a national ef-
fort to promote boxing and their own circulation figures. By 1938 the Golden Gloves tournament drew 23,000 entries from twenty-six states, with winners forming an American team for competition with international foes. Newspapers also sponsored swimming races and bowling tournaments in various cities. The latter provided working-class boys with employment as pin spotters, and the sport enjoyed great popularity throughout the 1930s and 1940s. Bowling alleys became a traditional leisure setting for teenagers thereafter for both sport and socializing. Many of the boxers in the Golden Gloves competitions represented the Catholic Youth Organization (CYO), founded in Chicago in 1930. The CYO offered a comprehensive sports program aimed at combating juvenile delinquency and urban gangs and promoting racial harmony. It featured its own interna-
Sports, 1921 to the Present tional boxing team and the world’s largest basketball league, with 130 teams in its Chicago-area parishes. Those two sports in particular offered greater opportunities for interracial competition. CYO boxing tournaments featured a wide variety of racial and ethnic competitors by the late 1930s. Interscholastic competition in football, basketball, and baseball drew great media and popular attention, particularly for intercity matches, which had been popular since the turn of the century. Urban rivalries were played out among high school baseball teams, with city champions meeting in major league stadiums to determine preeminence. Even mediocre football teams traveled across the country to claim regional and national honors when their local seasons concluded. The Chicago Prep Bowl, an annual city championship played between the Catholic and public school leagues, drew 120,000 fans to Soldier Field in 1937, the largest crowd to ever witness a football game. Starting in 1928, the University of Chicago sponsored a national high school basketball championship tournament. Five years later, Catholics initiated their own national basketball championship at Loyola University in Chicago to accommodate parochial schools. The B’nai B’rith organized similar sporting opportunities for Jewish youth. Ethnic athletic clubs, which had tried to retain European languages and values, emphasized gymnastics and soccer. By the 1920s, however, they began sponsoring baseball, football, and basketball teams in order to retain the support of their American-born offspring. For example, Young Men’s Hebrew Associations (YMHAs) participated in interleague competitions with other ethnic groups, and they also played other non-Jewish youth teams.
Other organizations fostered the growth of American team sports as well. The American Legion, a military veterans' group, sponsored baseball teams for fifteen- to eighteen-year-old boys starting in 1925. The first national championship followed a year later and gained television coverage in 1988. Nearly 95,000 players on 5,000 teams participated in the program in 1999. Little League Baseball offered competition for younger boys starting in 1939. By the end of the century, it had expanded to an international operation in over ninety countries with an annual televised championship game. The Pony League, founded in 1950, provided baseball for the remaining segment of the eight to eighteen age group. The Pop Warner football program fulfills a similar function. Started in 1929, the association organizes local leagues, consisting of flag football for boys ages five to eight and tackle football for boys ages seven to sixteen. By 1999 more than 300,000 participants vied for the national championship. In addition to the Pop Warner program, numerous smaller organizations and public park districts offer similar activities in football, basketball, and soccer. The American Youth Soccer Organization originated in California in 1964 and has since grown into a national operation with more than 600,000 players. A multitude of local, unaffiliated soccer programs swell that number considerably throughout the United States. Two Young Men's Christian Association (YMCA) instructors, James Naismith and William G. Morgan, invented the games of basketball and volleyball, respectively, in the 1890s. The physical educators of the YMCA have been instrumental in the international growth of both sports since the early twentieth century. The United States Volleyball Association was established in 1928. In addition to widespread
interscholastic play of both sports at the high school level, boys’ basketball has enjoyed phenomenal growth in local parks, churches, and community centers as well as neighborhood playgrounds and rural sites. Numerous organizations have fostered a wide variety of individual sports. As early as 1880, cyclists organized as the League of American Wheelmen. Like other sports associations of the time, the League of American Wheelmen mandated segregation by prohibiting young black athletes from joining. Though most boys engage in cycling in an unorganized manner, recently motocross racing and mountain biking events have become more organized. Competitive national and international cycling championships attract many boys in the late twentieth century. The United States Figure Skating Association, founded in 1921, provided national competition in the sport, and local skating clubs provided training facilities and coaching for boy skaters. Speed skating remains centered in particular northern climates such as Wisconsin and Minnesota. Roller skating, which was popular in the late nineteenth century, enjoyed a resurgence in the twentieth but gave way to other boys’ sports as technology improved. Skateboarding began in California as an adaptation of surfing in the 1950s and spread across the United States, symbolic of a rebellious youth culture with its own particular dress, language, and physical style. The National Skateboarding Association formed in 1981, and the sport gradually moved toward mainstream incorporation thanks to organized, televised, national competitions thereafter. Roller blading also assumed national attention as part of the physical fitness movement in the last two decades of the twentieth century. By
1999 inline skating claimed 27 million practitioners, with another 3 million boys, most from thirteen to twenty years of age, who played street or roller hockey. Boys took up winter sports in increasing numbers as well. Ice hockey, previously relegated to the tier of northern states, gained numerous adherents nationwide as indoor rinks sprouted around the country and boys joined peewee hockey leagues. Ice hockey teams and clubs for boys of various ages continue to be popular with male youth in many parts of the country. Ice hockey has grown in popularity among boys since the 1960s, with outstanding hockey players participating in intercollegiate competition when they get older. Other boys engaged in skiing, snowmobiling, or snowboarding, the youth culture winter counterpart to skateboarding that emerged in the 1960s. Now snowboarding competitions enable boys to display their physical skills and athleticism in organized events. Even before skateboarding and snowboarding, surfing served as the sport of youthful dissent. Anglo-Hawaiian boys learned to ride the waves as native Hawaiians did in the early 1900s, and the activity spread to California beaches by midcentury. By the 1960s surfing had spawned a lifestyle that included particular dress, hairstyles, music, and consumerism and was often featured in movies. The American Surfing Association, established in 1976, fostered organized competitive events. Technology brought new sports to the beach in the form of parasailing and windsurfing in the latter half of the century. At public and private water facilities, both swimming and waterskiing maintained a longer history of competition and recreation than did surfing. Motorboats ap-
peared by the 1920s, and they were soon towing skiers. The American Water Ski Association originated in 1939 and continued to gain increasing stature after World War II. Swimming, a timeless recreational and utilitarian activity, had a centuries-long competitive history but gained even greater participation with its inclusion in high school, park district, YMCA, and YMHA programs. Age-group teams and competitions abound throughout the United States today for various swimming and diving categories. Boys have played golf since at least the 1890s, but the American Junior Golf Association was not established until 1977. The association showcases the talents of both boys and girls ages thirteen to eighteen in tournaments designed to attract college scholarships. Millions more play the game for purely recreational purposes. Country club links and public courses regularly draw boys to test their golf skills. Like golf, racket sports experienced substantial growth during the twentieth century. Wealthier middle-class and upper-class boys began playing tennis in the 1880s. The game drew greater numbers of participants as it moved from country clubs to schools, inner-city parks, and independent programs. The National Junior Tennis League organized in 1968, and tennis academies tutor elite players. Tennis tournaments for boys of diverse ages take place at local, state, and national levels. The United States Tennis Association sponsors the national sixteen- and eighteen-year-old championships for outstanding junior tennis players. Most youth who play the game continue to do so on public and neighborhood courts. Although racquetball and badminton are less popular than tennis, these games' players pursue their pastimes in similar fashion, and the bad-
minton players enjoy the status of interscholastic competition in some state high schools. Both gymnastics and track and field, which are among the oldest sports, have experienced declining participation figures as fewer high schools field teams and programs and the public parks have shifted their focus to other team sports. Road races, however, continue to enjoy great popularity as part of the fitness movement, and running clubs offer youth exposure to the sport and training for races of varying distances. Private gymnastic clubs offer training for boys, mostly in urban areas. Boys have taken to newer outdoor adventure sports in increasing numbers in recent decades. Rock climbing no longer requires an excursion to remote places. Climbing walls are more readily accessible in health clubs, school gymnasiums, and commercial facilities. New technology has made kayaking more affordable, and its practice requires only some nearby waterway. Even mountain biking, which previously required proximity to particular geographic areas, has been adapted in heartland communities with the construction of simulated courses, often built by boys themselves. In the late twentieth century, the commercial production of equipment for boys' sports promoted their participation in sports, whether in the water, in the gymnasium, or on the playing field.
Gerald R. Gems
Linda J. Borish
See also Basketball; Boxing; Football; Ice Hockey; Skateboarding; Skiing; Tennis
References and further reading
Berryman, Jack W. 1975. "From the Cradle to the Playing Field: America's Emphasis on Highly Organized
Competitive Sports for Preadolescent Boys." Journal of Sport History (Fall): 112–131.
Borish, Linda J. Forthcoming. Landmarks of American Sports. American Landmarks Series. Edited by James O. Horton. New York: Oxford University Press.
Erickson, Judith B. 1983. Directory of American Youth Organizations. Omaha, NE: Boys Town.
Gems, Gerald R. 1996. "The Prep Bowl: Sport, Religion and Americanization in Chicago." Journal of Sport History 23, no. 3: 284–302.
Goodman, Cary. 1979. Choosing Sides: Playground and Street Life on the Lower East Side. New York: Schocken Books.
Kirsch, George, Othello Harris, and Claire E. Nolte, eds. 2000. Encyclopedia of Ethnic Sports in the United States. Westport, CT: Greenwood Press.
Mormino, Gary Ross. 1982. "The Playing Fields of St. Louis: Italian Immigrants and Sport, 1925–1941." Journal of Sport History 9 (Summer): 5–16.
Rader, Benjamin G. 1983. American Sports: From the Age of Folk Games to the Age of Spectators. Englewood Cliffs, NJ: Prentice-Hall.
Riess, Steven. 1989. City Games: The Evolution of American Urban Society and the Rise of Sports. Urbana: University of Illinois Press.
Substance Abuse
See Illegal Substances
Suicide
The term suicidal refers to behavior intended to bring about one's death or the behavior of one who engages in life-threatening action and is indifferent to surviving it. The outcome of a suicidal action may be fatal (i.e., suicide) or nonfatal. Suicidal ideation refers to thinking or talking about killing oneself. Historically, in Western thought, suicide has been considered a male behavior. In the nineteenth century, numerous European
and U.S. commentators argued that killing oneself required a degree of energy, courage, and intelligence that was incompatible with the nature of women. At the time, it was also assumed that suicide was a “disease” of “civilized people” and urban communities. “Simple” and “primitive” people, which at the time meant women, the less educated, and all non–western European people, were thought to be immune to suicide (Canetto 1997b; Kushner 1993). Over the years it has become clear that suicide is not limited to males or to industrialized societies. Recent epidemiological studies show that in some countries (e.g., China), suicide is more common among females, particularly in certain age groups. In the United States, however, suicide is primarily a male death. In fact, it is a growing cause of death among young males. Suicide is the third leading cause of death for males ages fifteen to nineteen in the United States. Most of these suicides are by firearms, more specifically handguns (Canetto 1997b; Canetto and Lester 1995; Johnson, Krug, and Potter 2000). Suicide rates are highest among Native American males, although there is significant variability across tribes. Historically, Euro-American youths have had much higher rates of suicide than African American youths. In recent decades, however, rates for African American male adolescents have increased more rapidly than rates for Euro-American male adolescents, such that the gap between the rates for these two groups is narrower. The increase in African American male youth suicide has been particularly substantial in the South (CDC 1998). Suicide rates in boys exceed those in girls by a ratio of 5:1 (Canetto and Lester 1995). The gender difference in mortality
holds across ethnic groups, although suicide rates vary greatly from group to group. Native American boys have higher rates of suicide than Native American girls, although the latter have higher rates of suicide than Euro-American boys. The paradox is that boys are two to three times less likely than girls to report suicidal ideation or to engage in nonfatal acts of suicidal behavior (Canetto 1997a; Lewinsohn, Rohde, and Seeley 1996). Gender differences in nonfatal suicidal behavior are not found in all ethnic groups in the United States. For example, among native Hawaiians and some Native Americans (i.e., among Pueblo Indians but not Zuni Indians), adolescent males report rates of nonfatal suicidal behavior similar to those of adolescent females (Canetto 1997a; Howard-Pitney et al. 1992). Gay males have unusually high rates of nonfatal suicidal behavior relative to heterosexual males. No definitive information is available on rates of death by suicide among gay males (Remafedi 1999). In the United States, suicidal behavior of all kinds is uncommon in boys and girls before puberty. The incidence of suicide increases more rapidly among males than among females during adolescence and young adulthood. In recent decades, the gender gap in suicide mortality has been widening, especially among some U.S. ethnic minority groups. Rates of suicide mortality for U.S. ethnic minority boys have increased markedly, leading to a growing gender gap. Suicide rates among girls of all U.S. ethnic groups have remained stable (Canetto 1997a; CDC 1998). Male rates of suicide mortality exceed those of females in most countries where suicidality is recorded. The male predominance in suicide mortality among ado-
lescents, however, is not universal. For example, in several South American, Caribbean, and Asian countries, including Brazil, Cuba, the Dominican Republic, Ecuador, Paraguay, the Philippines, Singapore, and Thailand, young females’ suicide mortality exceeds that of young males. In Mauritius, young males and females have the same rates of suicide mortality (Canetto and Lester 1995; Johnson, Krug, and Potter 2000). Why are boys less likely than girls to report depression, suicidal thoughts, and nonfatal suicidal behavior and, at the same time, more likely to die of suicide? Why do the gender differences in nonfatal suicidal behavior occur in some U.S. ethnic groups and not in others? Why are gay males at higher risk of nonfatal suicidal behavior than heterosexual males? The contrasting trends in nonfatal and fatal suicidal behavior in males and females have been called the “gender paradox” of suicidal behavior (Canetto and Sakinofsky 1998). Many theories have been proposed to explain this paradox. However, most fail to account for the variations in the gender paradox by ethnicity and sexual orientation. In other words, since the gender paradox of suicidal behavior is not universal, it needs to be understood in light of cultural factors. A theory of gender and suicidal behavior that addresses cultural variability is the theory of cultural scripts (Canetto 1997b). This theory is grounded in the observation of a correspondence between social norms and actual behavior in different cultures. Persons tend to engage in the behaviors (including suicidal behaviors) that are meaningful and permissible in their community. Different communities have unique scripts of suicidal behavior, that is, specific conditions under which suicidal behavior is expected and
by whom. The scenario of the suicidal act (including the actor, method, precipitants, and themes) and the consequences of the suicidal behavior are part of this script. Even though each suicide is in some way unique, it also shares some characteristics with other suicides from the same community. In other words, there are common and scripted elements in suicidal acts within cultures. One would expect cultural scripts of gender and suicidal behavior to be particularly influential among adolescents, who are in the process of defining their identity and may take messages about gender-appropriate behavior more seriously than adults. In the United States, boys' low rates of suicidal ideation and behavior and their high rates of suicide mortality are consistent with dominant beliefs about masculinity and suicidal behavior. Studies indicate that it is considered unmasculine to talk about suicidal behavior and to survive a suicidal act. Adolescent males have a greater fear of disapproval over having suicidal thoughts than adolescent females. Nonfatal suicidal behavior in males receives more criticism than the same behavior in females. Males are particularly critical of other males who survive a suicidal act. Nonfatal suicidal behavior appears to be associated with some identification with or adoption of behaviors considered feminine across sexual orientations. In one study, boys who acted in ways perceived as feminine were more likely to engage in nonfatal suicidal behavior during adulthood than conventionally masculine boys or masculine girls. In another study, gay boys with a history of suicidal behavior were more likely to describe themselves as feminine than gay boys without a history of suicidal behavior. No studies so far have explored gen-
der meanings of nonfatal suicidal behavior among ethnic minority adolescents. One could expect less gender-specific beliefs about nonfatal suicidal behavior in ethnic groups with similar gender rates of nonfatal suicidal behavior. In the United States, killing oneself is viewed as masculine behavior. Male suicide is judged as less wrong and less foolish than female suicide, and males who kill themselves are considered more rational than females who kill themselves. In terms of precipitants, impersonal failures are viewed as more masculine than personal difficulties. For example, males who kill themselves following a relationship problem are considered more maladjusted than males who commit suicide after an achievement problem. Adolescent males say that they are less fearful of death and injury than same-age females. No studies so far have explored gender meanings of suicide among ethnic minority adolescents in the United States. The fact that male rates are higher than female rates within each ethnic group suggests that killing oneself is associated with masculinity across ethnic groups. Moreover, the growing disparity between male and female rates among African American adolescents may be an indication of the increasing influence of dominant beliefs about gender and suicide. Gender differences in method probably account in part for males' higher rates of mortality from suicidal behavior. In the United States, boys are more likely than girls to use firearms and hanging as methods of suicide. The percentage of youth suicide in which a firearm is involved is highest in the United States as compared with countries of similar economic background. Firearms carry a high risk for immediate lethality. They do not allow much room for survival or rescue,
even when the suicidal behavior occurs in public places (Canetto and Sakinofsky 1998). In the United States, firearms are assumed to be a masculine, "hard" suicide method, but poisoning is considered a feminine, "soft" method. At the same time, it is important to remember that gender meanings and the lethality of different methods are, to some degree, culture-specific. For example, in the United States, poisoning is a method that is more common in females than in males and is assumed to be feminine. It is also a method that in the United States has low lethality because it typically involves overdosing on medications, for which intervention is generally available and effective. In developing countries such as Sri Lanka, however, poisoning is not a method that is primarily used by females or a method that is considered feminine. Poisoning in developing countries typically involves household or farming poisons. Effective medical care for poisoning is not routinely accessible; hence poisoning is highly lethal. In Sri Lanka, males, like females, use poisoning as a primary method of suicide (Canetto 1997b). It is unclear whether U.S. boys' suicide method preferences reflect their high intent to die because adolescents' knowledge of method lethality is often inaccurate. It is likely, however, that more male suicidal acts are intended as fatal. Most suicidal acts involve some planning. There is also a correlation between intent and lethality. At the same time, it is important to acknowledge that, in the United States, boys' choice of suicide method takes place against a cultural script that treats surviving a suicidal act as unmasculine. Consider the dominant language of suicidal behavior in the United States. The term suicide attempt, which is used to refer to
nonfatal suicidal behavior, suggests that a person tried and failed at suicide. The terms completed suicide and successful suicide convey that dying of suicide represents a form of success. Given prevailing ideologies of masculinity, males may feel they cannot allow themselves to fail at suicide, even though success in this case means death. The search for causes of adolescent suicidal behavior, fatal and nonfatal, spans many fields of research. A sampling of proposed risk factors includes genetic markers, national unemployment rates, parental education and income, childhood adversities, mental disorders, physical and sexual abuse, parental divorce, a family history of depression, suicidal behavior, and substance abuse, cognitive style, coping skills, and exposure to suicide models. Information on risk factors and dynamics unique to suicidal males, however, is limited because most researchers do not focus on questions of gender and do not separately examine male data. Also, information on risk factors has not advanced enough to help predict individual suicidal behavior. Many young persons experience some or many of the risk factors and yet never become suicidal. In the United States, death by suicide is associated with a set of antecedents similar to those of nonfatal suicidal behavior. Among mental disorders, alcohol and substance abuse, conduct disorders, and depression have been most commonly associated with suicidal ideation and behavior, both fatal and nonfatal, among adolescents. Alcohol and substance abuse and conduct disorders appear to be stronger correlates of suicidal behavior for boys than for girls. Psychiatric co-morbidity increases the risk for all suicidal behaviors. Although psychopathology is
considered by some the single most influential predictor of suicidal behavior, only a small proportion of adolescents with mental disorders engage in suicidal behavior; most do not. Physical illness and functional impairment may also play a role in nonfatal suicidal behavior. Low self-esteem, low academic achievement, peer difficulties, and social isolation have been identified as additional factors in nonfatal suicidal behavior. In addition, exposure to suicidal behavior (including a family history of suicidal behavior, recent suicidal behavior by a friend, and a person’s own past suicidal episodes) has been found to increase the risk of all suicidal behaviors. Stressful life events, including turmoil and instability in key relationships and failures at school or work, can be precipitants of suicidal behavior in adolescents. Sexual abuse has emerged as a factor in gay male nonfatal suicidal behavior. In general, the more difficulties individuals have or have had, the more likely they are to engage in suicidal behavior. Although adversities increase the risk for suicidality, they are particularly potent in adolescents with dysfunctional cognitive and coping styles. Finally, the availability of firearms is more of a risk factor for suicide in boys than in girls. Boys are more likely to use firearms in suicide, and in the United States guns may be perceived as a more masculine method. A key issue in adolescent suicidal behavior is that risk factors affect boys and girls differently. Similar risk factors are associated with fatal suicidal behavior in boys and nonfatal suicidal behavior in girls. Relative to girls, boys seem protected from suicidal ideation and nonfatal suicidal behavior. At the same time, they have much higher rates of suicide mortality. One possibility is that boys have
rates of suicidal ideation similar to those of girls but are less likely to admit it because of the stigma, for males, of talking about feeling suicidal. For the same reason, males may hide their nonfatal suicidal acts more often than females. It is also possible that professional caregivers are less willing to recognize and label as suicidal a variety of life-threatening behaviors in males. Many questions about boys and suicidal behavior remain unanswered. For example, it would be helpful to learn more directly what prevents boys from allowing themselves to survive a suicidal act and how beliefs about gender-appropriate suicidal behavior may influence boys’ choice of suicide method and outcome. The gender differences in suicidal behaviors and the variations in boys’ suicidal behavior depending on ethnicity, nationality, and sexual orientation point to new directions in primary prevention. In the United States, suicide prevention programs should consistently address questions of gender, sexual orientation, and ethnicity. One possibility would be to include a didactic component on the epidemiology of gender and suicidal behavior across ethnicities and sexual orientations. Suicide prevention educational programs should also assess adolescents’ beliefs about gender and suicidal behavior. If such an assessment revealed an endorsement of the notion that killing oneself is powerful and masculine, the program leaders should be prepared to engage adolescents in examining and challenging this dysfunctional belief. Finally, suicide prevention programs should also consider the implications of boys’ critical attitudes toward persons who reveal suicidal thoughts or acts. Although these attitudes probably protect boys from nonfatal suicidal behavior, they also have negative consequences. In nonsuicidal boys, they
are likely to signal an unwillingness to reach out to suicidal peers. In boys who are suicidal, negative attitudes toward nonfatal suicidal behavior may increase shame and interfere with help-seeking behavior. Ultimately, these attitudes may make it difficult for boys to allow themselves to survive a suicidal act.
Silvia Sara Canetto

References and further reading
Canetto, Silvia Sara. 1997a. “Meanings of Gender and Suicidal Behavior among Adolescents.” Suicide and Life-Threatening Behavior 27: 339–351.
———. 1997b. “Gender and Suicidal Behavior: Theories and Evidence.” Pp. 138–167 in Review of Suicidology. Edited by R. W. Maris, M. M. Silverman, and S. S. Canetto. New York: Guilford.
Canetto, Silvia Sara, and David Lester. 1995. “Gender and the Primary Prevention of Suicide Mortality.” Suicide and Life-Threatening Behavior 25: 58–69.
Canetto, Silvia Sara, and Isaac Sakinofsky. 1998. “The Gender Paradox in Suicide.” Suicide and Life-Threatening Behavior 28: 1–23.
CDC (Centers for Disease Control and Prevention). 1998. “Suicide among Black Youths—United States, 1980–1995.” Journal of the American Medical Association 279, no. 18: 1431.
Howard-Pitney, Beth, Teresa D. LaFromboise, Mike Basil, Benedette September, and Mike Johnson. 1992. “Psychological and Social Indicators of Suicide Ideation and Suicide Attempts in Zuni Adolescents.” Journal of Consulting and Clinical Psychology 60: 473–476.
Johnson, Gregory R., Etienne G. Krug, and Lloyd B. Potter. 2000. “Suicide among Adolescents and Young Adults: A Cross-National Comparison of 34 Countries.” Suicide and Life-Threatening Behavior 30: 74–82.
Kushner, Howard I. 1993. “Suicide, Gender, and the Fear of Modernity in Nineteenth-Century Medical and Social Thought.” Journal of Social History 26, no. 3: 461–490.
Lewinsohn, Peter M., Paul Rohde, and John R. Seeley. 1996. “Adolescent Suicidal Ideation and Attempts: Prevalence, Risk Factors, and Clinical Implications.” Clinical Psychology: Science and Practice 3, no. 1: 25–46.
Remafedi, Gary. 1999. “Suicide and Sexual Orientation.” Archives of General Psychiatry 56: 885–886.
Sunday Schools

The first American Sunday schools were enlightened and philanthropic efforts founded in the 1790s to instill virtue in children, particularly unruly urban boys, in order to prepare them for citizenship. By the 1810s, however, evangelical educational methods worked out in England were introduced in the United States, with the purpose of eliciting religious conversion experiences in children. Taught by lay teachers representing various Protestant denominations, these Sunday schools were so popular in the early nineteenth century that a national association, the American Sunday School Union, was organized in 1824 to spread schools and their materials across the nation. Before public schools became widespread in the 1840s, many boys and girls learned disciplined behavior and how to read in Sunday schools. By midcentury, however, the ecumenical ideal was eclipsed by decentralized denominational schools designed to supplement public school education. At the same time, the emphasis on religious conversion was replaced by notions of Christian nurture, advocating the gradual religious development of children. Teaching methods were borrowed from public schools, and materials were designed to develop manly piety in boys. Yet, as Americans continued to focus their fears of social chaos on the behavior of male youth, the Sunday school remained an important
institution for teaching not only religion but also self-discipline and correct social relations to boys.

Sunday school in an agricultural workers’ camp, Bridgeton, New Jersey, 1942 (Library of Congress)

During the seventeenth and early eighteenth centuries, Anglican, Congregational, and Lutheran ministers expected conversion or entry to communion to occur in the late teenage years or early twenties and did not attempt to elicit conversion experiences in children. In the late eighteenth century, the first American Sunday schools were based on those founded in England to ameliorate living conditions among the urban poor. Motivated by Pennsylvania’s ratification of the federal Constitution in 1787 and passage of the First Amendment separating church and state in 1791, Benjamin Rush and other Philadelphia gentlemen turned
to the voluntary association, founding the First Day Society to conduct similar schools for children of their city. “Who can witness,” Rush wrote, “the practices of swimming, sliding, and skating which prevail so universally on Sundays in most cities of the United States, and not wish for similar institutions to rescue our poor children from destruction?” (quoted in Reinier 1996, 79). Members of the First Day Society hoped to discipline children of the city’s turbulent lower class, yet they also sought to provide elementary education for boys who would become apprentices and eventually urban citizens. And they hoped that their schools would become a model for tax-supported education throughout the state. The schools they supported from 1791 until 1820 taught orderly
habits, literacy, and skills yet were grounded in Christianity. Boys and girls were expected to attend public worship at their respective churches, and the Bible was used as a textbook. Annual public examinations rewarded scholarly progress and good behavior with the society’s approval and a “premium,” usually a little book, whereas misbehavior was punished by shame and, if repeated, exclusion from school. By 1810 members of the society considered their experiment a success; their annual report concluded: “A recurrence to the early minutes has given in Evidence that some of the Lads then steadily attended the School and received Premiums for good Behavior and Improvement in their studies . . . [and] have since become opulent and respectable Members of the community” (quoted in Reinier 1996, 80). Shortly after, however, evangelical educational methods imported from England were introduced in the United States when the Reverend Robert May conducted an interdenominational Sunday school also directed at Philadelphia working-class children. May deliberately attempted to evoke an emotional response by preaching on themes of death and judgment. He clearly sought to elicit a conversion experience in the children, his volunteer assistant teachers offered religious instruction rather than secular elementary education, and his system of rewards taught children the values and procedures of a cash economy. A child who recited correctly a lesson memorized from Scripture or catechism earned a black ticket. A number of black tickets bought a red one; and a number of red tickets purchased a premium, usually a hymnbook or catechism. Riding the crest of religious enthusiasm generated by revivals of the Second Great Awakening, these evangelical
methods were immensely popular. As Sunday schools proliferated in the city, in 1817 young men and their merchant supporters formed the Philadelphia Sunday and Adult School Union. Within a year they were conducting 43 interdenominational Sunday schools that instructed 5,658 children and 312 adults. The schools initially had been intended for children from poor and working-class families, but by 1820 they were also attracting children of the affluent. That same year, members of the First Day Society realized that their more secular effort had been superseded, and, ceasing to operate their schools, they voted to donate their remaining funds to the Philadelphia Sunday and Adult School Union (Reinier 1996, 89). In 1818 managers of the interdenominational union drew up “Internal Regulations for the Sunday Schools.” Their first goal was the spread of literacy. The union printed alphabet cards and a spelling book in order that children could be instructed in reading until they mastered the difficult Bible. Equally important was formation of orderly habits; regulations stressed industry, punctuality, and cleanliness. “Order is delightful,” the managers wrote, “and although it imposes restraint upon the scholars it will be found to be pleasing to them in practice” (quoted in Reinier 1996, 89). But the primary purpose of the Sunday school was to save souls. As adults began to focus on what they called “early piety,” they sought to elicit conversion experiences in boys and girls as young as seven or eight years of age. According to the internal regulations, Sunday school teaching required three methods. The “expounding” method was lecture, and the “catechetical” method was question and answer. Neither could be effective, however, without “exhortation,” through which the
emotions of the child were “awakened” to be receptive to religious instruction. By 1820 the Philadelphia Sunday and Adult School Union was reproducing religious tracts in order to elicit the desired response. Many of these tracts were originally from England, and the managers were never fully satisfied with them, objecting that they reflected British rather than American “civil government, manners, and customs.” For example, they rejected a British theme of contented poverty as unsuitable for aspiring Americans. But they did retain descriptions of the deaths of pious children as an appropriate method of awakening. In 1824 the Philadelphia Sunday and Adult School Union conducted more than 700 Sunday schools in 17 states, in which more than 7,000 teachers instructed 48,000 students. In that year alone the union republished 133,000 imported tracts, which it distributed not only in the mid-Atlantic states but north to New England, south to the Carolinas and Mississippi, and west throughout Kentucky, Tennessee, Ohio, Indiana, and Missouri (American Sunday School Union 1825, 32–33). Seeking to continue the spread of its influence, it reorganized as a national association, the American Sunday School Union, and began to produce materials for a national market. Its new lay Committee of Publications rejected British tracts that focused on child death. Instead, the committee designed a new evangelical children’s literature that wove its religious message into such topics as natural science. Viewing boys as untamed animals, the new Sunday school books taught internalized restraint, reminding children that God watched them all the time. Potential citizens were instructed in self-discipline by books such as Election Day, which featured
examples of worthy and unworthy voters (American Sunday School Union 1827). And a male role model was provided in Life of George Washington, in which the exemplary character of the first president had been formed and his animal nature tamed by the influence of his pious mother (Reed 1829). Some nineteenth-century boys did turn to evangelical religion when faced with their own illness or the death of a sibling or parent. Twelve-year-old James Riker, Jr., of New York City, born in 1822, was embarrassed when his father’s clerk, a Methodist who shared his bedroom, encouraged him to kneel in prayer, and was ashamed to tell his family about the incident. When stricken with typhus fever the following summer, however, he was soothed when his sister sang hymns she had learned at a Methodist meeting. Shortly after, his mother died suddenly from cholera. A few months later, James experienced religious conversion at a revival the family attended at the New York Seventh Presbyterian Church (Reinier 1996, 92). Other boys, such as the Methodist John H. Vincent of Illinois, born in 1830, found efforts to elicit conversion in children “gloomy” and “morbid.” As an adult, he felt that the religion of his childhood encouraged excessive emotionalism and that it had deprived him of a spontaneous “boy life.” He preferred an approach more like the new literature of the American Sunday School Union and “the stability of the Sunday school to the upheavals of a revival” (quoted in Boylan 1988, 91). Although by 1830 the American Sunday School Union was a genuine nationalizing force, its evangelical message was rejected by freethinkers, Roman Catholics, Quakers, Unitarians, and Jews. Denominations such as the Methodists
were skeptical of its ecumenical appeal and began to organize their own Sunday schools. Middle-class reformers began to advocate tax-supported and state-supervised systems of education, with full-time schools taught by professional teachers. When managers of the American Sunday School Union urged adoption of the new evangelical children’s literature in the Massachusetts public schools in 1838, Horace Mann, a Unitarian and secretary of the state board of education, strenuously objected. The new common schools would offer a Protestant republican curriculum but avoid sectarian instruction. By the 1840s, as public schools became widespread, Sunday schools were increasingly decentralized and supported by religious denominations to instruct children in particular doctrines. A new generation rejected the earlier emphasis on a cataclysmic conversion experience and urged concepts of Christian nurture, or the gradual development of religious sensibility in children. Teachers adopted new educational methods such as those of Johann Heinrich Pestalozzi, and the Sunday school, which no longer taught reading, became a supplement to the public school. Yet especially in times of social chaos, Sunday schools continued to teach self-discipline to boys. During and after Reconstruction in the South, freed children and adults learned to read in Sunday schools taught by white or black teachers. Mission schools in northern cities and on the western frontier brought in the unchurched and taught values of public order and decorum, separating the rough from the respectable. Many urban missions became neighborhood institutions that offered social services and founded industrial schools. As the nineteenth century progressed, ministers and
lay teachers feared that boys were being socialized largely by mothers and female teachers and advocated a more muscular Christianity. They urged that Sunday school literature reflect the behavior of real boys and that the curriculum include opportunity for spontaneity and robust play. In the early twentieth century, leaders in a variety of fields attacked the faith in moral absolutes that previously had permeated American society. Pragmatic educators such as John Dewey advocated an open world of change rather than fixity, and the use of science to achieve human goals. Public schools became increasingly secularized, a trend reinforced by Supreme Court interpretation of the First Amendment in decisions that barred religious instruction in public schools (McCollum v. Board of Education, 1948) and outlawed school prayer (Engel v. Vitale, 1962) and compulsory Bible reading in the classroom (Abington School District v. Schempp, 1963). In this context, religious instruction was confined to the denominational Sunday school, supplemented by a variety of parachurch organizations. Yet the evangelical thrust of the earlier American Sunday School Union and later versions of Protestant fundamentalism was never fully extinguished. The revival of evangelical Protestantism in the 1980s made religious education in public schools once more a political issue. Fears of social and cultural chaos still focused on the behavior of male youth. In the diverse, pluralistic American society of the twenty-first century, the use of religious instruction and faith-based organizations to instill self-discipline in boys is still a matter of public concern and often of contentious debate.

Jacqueline S. Reinier
See also Books and Reading, 1800s; Muscular Christianity; Parachurch Ministry; Preachers in the Early Republic

References and further reading
American Sunday School Union. 1825–1830. Annual Reports. Philadelphia: American Sunday School Union.
———. Committee of Publications. 1827. Election Day. Philadelphia: American Sunday School Union.
Boylan, Anne M. 1988. Sunday School: The Foundation of an American Institution, 1790–1880. New Haven: Yale University Press.
Marsden, George M. 1990. Religion and American Culture. New York: Harcourt Brace Jovanovich.
Reed, Anna. 1829. Life of George Washington. Philadelphia: American Sunday School Union.
Reinier, Jacqueline S. 1996. From Virtue to Character: American Childhood, 1775–1850. New York: Twayne Publishers.
Superheroes

In 1938, Detective Comics (later DC Comics) introduced Superman, the first and best-known superhero, described in the first issue as “Champion of the oppressed, the physical marvel who has sworn to devote his existence to those in need!” Thus was born the prototype for subsequent superheroes who would appeal so powerfully to young boys—flashy costumes, secret identities, technical gadgetry, and tests of superhuman strength in which moral virtue prevails. Superman was followed the next year by another DC creation, Batman, a more humanized superhero. As the United States was about to enter World War II, Marvel Comics developed the Human Torch (1939) and its first huge star, Captain America (1941), who regularly used his powers against the Nazis. After the comic book slump of the 1950s, Marvel went on to create in rapid
succession a new generation of superheroes: the Fantastic Four (1961), the Incredible Hulk (1962), Thor (1962), Spiderman (1962), the X-men (1963), and Conan the Barbarian (1970). In fact, these well-known characters represent only a small fraction of the superheroes Marvel created, many of whom lasted only a few issues. The oddest was the 1941 creation the Whizzer, who gained superstrength by injecting himself with mongoose blood, a stunt unconvincing even by comic book standards. Although the birth of comic book superheroes can be traced to the late 1930s, this genre combined the visual techniques of earlier comic strips with the storytelling that existed in dime novels and pulp fiction, which began to appear in the 1840s. Comic books are part of a long tradition of popular fiction that emphasized fast-moving (and often quickly written) stories of adventure, horror, violence, and the supernatural. The Adventures of Tom Sawyer opens with Tom immersed in the imagined world of pirates that he derived from Ned Buntline’s dime novel The Black Avenger of the Spanish Main: Or, The Fiend of Blood. The world of pulp fiction was often violent, with many of the western adventures featuring killing, scalping, and drinking blood. The adventures of Davy Crockett, who became a mythic figure in dime novels after his death at the Alamo, contain descriptions that even by today’s standards seem graphic. In one brawl Crockett describes the damage he did to his opponent: “His eye stood out about half an inch, and I felt the bottom of the socket with the end of my thumb” (quoted in Schechter 1996, 32). And unlike the heavily moralized stories such as those that would appear in the McGuffey readers, the dime novels existed for pure pleasure and escape.
Batman and Robin (Adam West and Burt Ward) from the television series (Kobal Collection)
Not surprisingly, boys often had to escape the surveillance of parents to read dime novels. Novelist Booth Tarkington would sneak away to the stables, or he would hide his books behind acceptable ones like Pilgrim’s Progress by John Bunyan (1678). In the late nineteenth century, antivice groups led by Anthony Comstock began to target publishers of dime novels, whom they saw as endangering children too innocent and vulnerable to protect themselves. Comstock compared these novels to “literary poison, cast into the foundations of our social life . . . infecting the pure life and heart of our youth” (quoted in Beisel
1997, 70). This anxiety about popular culture and boyhood would resurface in the 1950s, when comic books—including the superheroes—would come under attack. It would again become a major political issue at the turn of the millennium as parents expressed concern about violence in video games, movies, and the hugely popular extension of superheroes—professional wrestling. Beginning in the late 1940s, psychologist Frederic Wertham waged a one-man crusade against comic books. His claims that they contributed to juvenile delinquency led to congressional hearings and the “voluntary” decision to create a
Comics Code Authority to censor comic books. Wertham noted that by the time he was eighteen, the typical boy comic book reader would have absorbed a minimum of 18,000 pictorial beatings, shootings, stranglings, blood puddles, and torturings to death. Although Wertham’s primary target was crime comics, he found Superman objectionable because he operated outside the law and because he belonged to a “superrace” that looked down on its inferiors; readers who identified with superheroes were presumably seduced into a form of racism. Wertham was especially concerned about Batman and his homoerotic attraction to Robin. Wonder Woman was, in Wertham’s view, a man-hating lesbian. As odd as Wertham’s views seem today, his fight against comic books relies on three assumptions that would persist in subsequent challenges to films, television, and video games. First, childhood is a time of innocence, a term that connotes both ethical purity and defenselessness. The child is not equipped to mediate in any way the messages of popular culture, and boys in particular are seen as predisposed to imitative action. Second, researchers examining the effects of violence in the media effectively bypassed children, rarely interviewing them or exploring their perceptions—preferring to view them as helpless victims involved in a stimulus-response situation (Tobin 2000). And third, violence in the media is a single construct that includes both the Kennedy assassination and Superman hitting Lex Luthor. In other words, “violence” is conflated into a single definable stimulus. This reduction is justified by the often repeated but dubious claim that children cannot distinguish between fantasy and reality—a failure that is the byproduct of their innocence.
From the standpoint of the censors, institutions like the home and school must be bastions that protect children from the poisonous influences of popular culture. Protecting childhood innocence requires vigilance on the part of parents and other adult authorities. As Comstock urged, “Let fathers, mothers, and teachers watch closely over the pockets, desks, and rooms of their children. Be sure that the seeds of moral death are not in your homes and schools” (quoted in Beisel 1997, 71). Yet advocates of dime novels and comic books turn this argument on its head. The surveillance that characterizes home and school creates the need for a subversive, aggressive literature that allows escape from (and mockery of) legitimate authority. Reading becomes a form of “underlife,” a term used by Erving Goffman (1961) to explain a healthy form of resistance to being defined by institutional roles. So according to critics of popular culture, the home and school are protective bastions; according to supporters, that very protectiveness pushes young boys to seek out subversive literature. Some of the appeals of superhero comics are obvious—and hardly pathological. Unlike the elite literature of the schools, the plots move quickly to epic confrontation. In fact, one of the consistent complaints boys have about the literature they read in schools is that it moves too slowly (Millard 1997). The superhero landscape is cleared of legitimate authority: the Gotham City police are always so inadequate to the task that Batman’s heroic action is necessary. The characters are dramatically illustrated; the idealized bodies of the male heroes are skillfully rendered, and those of female heroes like Wonder Woman have unmistakable sex appeal. The plotlines touch on readers’ interest in transforming technological change, for example,
Superheroes ample, the concern about nuclear radiation (the Incredible Hulk) and biological mutation (the Fantastic Four). Although comic books are often dismissed as lacking three-dimensional characters, the writers at Marvel saw characterization as central to their work. Stan Lee, creator of Spiderman and other enduring superheroes at Marvel, conceived of the character as realistically human even while establishing Spiderman’s ability to walk on buildings like an insect and shoot webs that allowed him to swing freely: “we do our best to treat ol’ Spidey as if he could be your next door neighbor. Despite his superpowers, he still had money problems, dandruff, domestic problems, allergy attacks, self-doubts and unexpected defeats. In other words we try to think of him as real and to depict him accordingly” (Daniel 1993, 9). For all the fantasy that surrounds superheroes, their stories touch on real anxieties of young adolescent boys who feel themselves to be awkward misfits like Peter Parker, the imaginative, alienated, “real-life” character who transforms into Spiderman. The superhero stories suggest self-transformative possibilities—even the timid and hesitant are capable of virtuous heroism. In the words of Marvel historian Les Daniel, Spiderman “remains Everyman, the super hero who could be you” (1993, 96). Even the Charles Atlas bodybuilding advertisements on the backs of many issues in the 1950s and 1960s reinforced this promise of physical and social self-transformation for boys. The superheroes also became the subjects of satire and parody, particularly in the pages of Mad Magazine. Superman became Superduperman; Flash Gordon became Flesh Garden. In the 1960s the TV version of Batman was played as self-parody, particularly in the fight scenes when
the action was punctuated by exclamations (“POW,” “BOP,” “CRUNCH”) written comic book–style within the action. Dav Pilkey provided young readers with such epic titles as Captain Underpants and the Perilous Plot of Professor Poopypants and Captain Underpants and the Invasion of the Incredibly Naughty Cafeteria Ladies from Outer Space. But the most popular incarnation of self-parodying superheroes was undoubtedly professional wrestling as it began to aim for an audience of adolescent and preadolescent boys in the 1980s. Professional wrestling has been televised since the late 1950s, and many of the basic techniques were developed early on: the colorful names (Gorgeous George, Bobo Brazil, Argentina Rocca, Haystacks Calhoun) and special finishing tactics, often particular to a wrestler, like the Coco Bop, the Stomach Claw, the Piledriver, and the Sleeper Hold. Unlike superhero stories in which virtue prevails, victory was regularly achieved by deceit, often the use of an illegal substance hidden in the wrestlers’ trunks and employed when the feckless referee was invariably knocked senseless during critical moments in the bout. In the 1980s, bodybuilding techniques helped wrestlers sculpt their muscles to resemble the dimensions of comic book heroes. Their outfits were colorful spandex, and their props became more elaborate: Jake the Snake would enter the ring with a python around his neck; Ravishing Rick Rude would spray his opponents with an atomizer containing perfume; and the Undertaker would carefully place his beaten opponent in a coffin at the end of each match. In the 1990s the World Wrestling Federation, presided over by Vincent McMahon, became so popular that Monday Night
Raw, the most popular show on cable television, began to threaten a staple of male sports viewing, Monday Night Football. At one point in early 2000, two of the top three nonfiction best-sellers were “written” by professional wrestlers—Mankind (Mick Foley) and the Rock (Dwayne Johnson). Wrestlers also gained political prominence with Jesse “The Body” Ventura winning the governorship of Minnesota and the Rock speaking in prime time to the Republican National Convention. To the standard features of professional wrestling, McMahon added soap opera, a constantly shifting story of sexual and professional intrigue in which he and his family played a part. Much of this story is played out in elaborate monologues between matches in which wrestlers would air their grievances, issue challenges, and express their feelings in a parody of New Age manhood. The announcers, the straight men in the drama, take this intrigue absolutely seriously, commenting on a spat between Stephanie McMahon and Triple H with the same seriousness (“Look at her eyes. Is she mad or what?”) as they do the matches themselves. For boys who are expected to conform to the rules at home, in school, and when they play sports, these wrestling productions are particularly attractive because they self-consciously challenge codes of appropriate behavior and speech and flout traditional athletic fairness. Over time, various versions of superhero genres have consistently created anxiety about the socialization of boys. Plato banished the poets from his imagined republic because their stories of the Trojan War would create unhealthy and uncritical emulation. Stendhal’s classic novel, The Red and the Black (1830), is a cautionary tale about the idolization of
heroes in the Napoleonic wars. Psychologists in the United States have consistently warned that extensive exposure to violent materials can increase aggressiveness, prompting periodic campaigns to restrict access for children. But the enduring popularity of superhero stories is evidence of the developmental function of the genre. They provide a way for boys to escape, even to mock and subvert adult authority that seems so confining. Mark Twain caught this appeal when he described the conclusion of Tom Sawyer’s pirate adventures: “The boys dressed themselves, hid their accouterments, and went off grieving that there were no outlaws anymore and wondering what modern civilization could claim to have done to compensate for their loss. They said they would rather be outlaws a year in Sherwood Forest than President of the United States forever” (1946, 88).

Thomas Newkirk

See also Books and Reading, 1900–1960; Comic Books

References and further reading
Beisel, Nicola. 1997. Imperiled Innocents: Anthony Comstock and Family Reproduction in Victorian America. Princeton: Princeton University Press.
Daniels, Les. 1993. Marvel: Five Fabulous Decades of the World’s Greatest Comics. Introduction by Stan Lee. New York: Abrams.
Goffman, Erving. 1961. Asylums: Essays on the Social Situation of Mental Patients and Other Inmates. Garden City, NY: Anchor/Doubleday.
Leland, John. 2000. “Why America Is Hooked on Professional Wrestling.” Newsweek 135, no. 6 (February 7): 46.
Millard, Elaine. 1997. Differently Literate: Boys, Girls and the Schooling of Literacy. London: Falmer Press.
Sabin, Roger. 1996. Comics, Comix and Graphic Novels: A History of Comic Art. London: Phaidon.
Schechter, Harold. 1996. “A Short Corrective History of Violence in Popular Culture.” New York Times Magazine (July 7): 32–33.
Tobin, Joseph. 2000. “Good Guys Don’t Wear Hats”: Children Talk about the Media. New York: Teachers College Press.
Twain, Mark (Samuel Clemens). 1946. The Adventures of Tom Sawyer. New York: Grosset and Dunlap.
Wertham, Frederic. 1953. Seduction of the Innocent. New York: Rinehart.
———. 1996. “The Psychopathology of Comic Books.” American Journal of Psychotherapy 50, no. 4 (Fall): 472–490.
T

Teams
Athletic teams have played a central role in the lives of American boys. Playing on teams has been one of their favorite pastimes. Teams have provided one of the most important sites for organizing social structures and cultural patterns among boys. Adults have endorsed team sports as critical educational and political tools, and sports have also been battlegrounds between children and adults over control of the leisure time of American boys. Teams of North American boys played sports before the United States came into existence. Native American boys engaged in a variety of traditional games. Boys from North America’s eastern woodland tribes staged youthful contests called the “little war,” a highly competitive stick-and-ball game with important political, military, and religious meanings. When Europeans arrived in North America, they labeled the little war “lacrosse.” During the conquest and colonization of North America, European settlers brought a variety of traditional team sports to their new homes, including premodern forms of football and English bat-and-ball games. Colonial boys often organized games of “cat,” “nines,” “stool ball,” and football or played in village games with adult men. After the English colonies in North America won their political independence from Great Britain in the American Revolution, boys in the new United States continued to play traditional European team sports, especially bat-and-ball games. These bat-and-ball games provided the foundation for the creation of baseball—one of the most popular games for American boys in the nineteenth and twentieth centuries. The creation of baseball in 1845 in New York City by Alexander Cartwright and the Knickerbocker Base Ball Club marked the beginning of a major shift in the importance of team sports in the lives of American boys. Baseball began as a city game, played first by young middle-class and working-class men, but was quickly adopted by boys in urban areas. From cities in the industrializing northeastern United States, the new game with its “official” written and standardized rules began to spread throughout the United States and supplant the traditional boys’ bat-and-ball games in towns, villages, and rural hamlets. During the 1840s and 1850s, in the same era in which baseball was created, the United States began to be transformed from a traditional, rural-agricultural nation into a modern, urban-industrial nation. These changes had an enormous impact on the lives of American boys and on their sports. The Industrial Revolution would increase the amount of leisure time available to upper-class and middle-class
boys and, after many decades, working-class boys. At the same time, changes in work, home, and family life created by the new industrial economy also limited the amount of adult control over the boys’ expanded leisure time. Many adults grew increasingly concerned about these changes and sought mechanisms to adjust the lives of American boys to the new social order. Team sports became a very popular tool in the efforts to adapt boys’ lives to modern social conditions. In earlier American society, team sports had been one of a multitude of recreational activities for boys, no more important than many of the other forms of physical activity that shaped their lives. In the new, rapidly modernizing American society, team sports would take on novel and important roles. They would replace the physical challenges of farm labor, hunting, and logging that had faced earlier generations of American boys. While boys were learning the skills necessary to succeed in the corporate bureaucracies of the industrial system by participating in team sports, they were also avoiding dangerous vices such as crime, gangs, violence, alcohol, and drugs. Team sports would be endorsed as the key to preparing American boys for the duties of citizenship in a democratic republic and to teaching them honor, courage, integrity, cooperation, and perseverance. Team sports would be advertised as one of the best ways to build a strong and vital American nation. American promoters of team sports for boys as a nation-building device were greatly influenced by British ideas linking athletics for youth to the construction of a strong, modern nation. The English writer Thomas Hughes’s immensely popular 1857 novel, Tom Brown’s School Days, captured the imaginations of generations
of American schoolboys and adult athletic boosters. The book declared that team sports (especially cricket and the original form of rugby football) were the most important character-building activity for boys. Hughes advocated “muscular Christianity” for young boys, arguing that sound bodies trained through team sports produced not only sound minds but sound spirits. Many Americans wholeheartedly accepted Hughes’s assertions. Tom Brown’s School Days inspired American authors. The most popular imitations of Hughes’s sporting formula began to appear by the 1890s, led by the 208 volumes in Gilbert Patten’s (writing as Burt L. Standish) Merriwell series and the numerous books in Edward Stratemeyer’s Rover Boy series. Books about sports teams proved enormously popular among young American readers. The American fascination with sports has produced a vast body of didactic literature, and sports books for boys continue to comprise an important genre in American literature. Tom Brown’s School Days also sparked a team sports boom in American schools. In the period from the 1850s through the 1890s, team sporting clubs based on British models sprouted at elite schools and colleges in the United States. Originally, these teams were organized and supervised by the boys themselves, as were the sports of the fictional Tom Brown and his chums. Sports teams in American high schools in the 1870s and 1880s were generally run without adult oversight and sometimes in spite of adult resistance. Interscholastic leagues organized by boys began to appear in American cities. Boston’s Interscholastic Football Association, formed in 1888, was a student-run organization that created a flourishing league in that city.
Teams By the 1890s adults had decided that team sports were far too important a tool for social organization to be left in the hands of boys. A new group of professional “boy workers,” as they were called at that time, took control of youth athletics. They used another British institution that flowered in the United States, the Young Men’s Christian Association (YMCA), to spread the doctrine of “muscular Christianity” through team sports among American boys. The YMCA had been founded in 1844 in England by evangelical Protestants, including Tom Brown’s creator, Thomas Hughes, and George Williams. The YMCA quickly spread to the United States in the 1850s. After the American Civil War (1861–1865), as the United States rapidly modernized and urbanized, the YMCA began to institute adultdirected sports programs to attract American boys to wholesome recreations. By the 1890s the YMCA had more than 250,000 members and had built more than 300 gymnasiums throughout the United States. Under the leadership of athletic evangelist Luther Halsey Gulick (1865–1918), the YMCA promoted competitive team sports as mechanisms for turning American boys into vital citizens of the republic. The YMCA sponsored teams in baseball and football, two popular sports among boys that had been Americanized from earlier British forms. The YMCA also organized team sports that were more common throughout the Anglo-American world: track and field, lacrosse, ice hockey, soccer football, and rugby football. Americans so loved these competitive team games that in the 1890s Gulick challenged the faculty at the YMCA training college in Springfield, Massachusetts, to invent a team game that could be played indoors during cold North Ameri-
winters when weather conditions made it impossible to play baseball, football, or other sports in large parts of the United States. A Canadian instructor at the YMCA school, James Naismith, responded in 1891 by inventing basketball. In 1895 a graduate of the Springfield training academy, William G. Morgan, created volleyball for the YMCA’s arsenal of team sports. Both of these new team sports, especially basketball, became enormously popular games for American boys. The YMCA spread the new sports first around the United States and then around the world. Basketball, which required only modest amounts of equipment and space, and was characterized by relatively simple rules, adapted well to urban environments. In rapidly expanding U.S. cities, basketball quickly became a playground staple. Other institutions to ensure adult supervision of boys’ leisure time also sprang up in American cities. By the mid-1890s, adults had taken control of sports teams at American elementary and secondary schools. YMCA leader Gulick was the key figure in the creation of the first interscholastic sports organization in the United States. Gulick and his colleagues in New York City founded the Public Schools Athletic League (PSAL) in 1903. The PSAL sponsored competitions in baseball, basketball, crew, cross-country, lacrosse, soccer, and track and field and became the model for the creation of interscholastic team sports leagues in Baltimore, Chicago, and many other American cities. “Boy workers” also used settlement houses (reform organizations in turn-of-the-century American inner cities designed to Americanize immigrants and improve the lot of the urban poor by
teaching them middle-class values) to take control of team sports. Settlement houses, such as Jane Addams’s nationally renowned Hull House in Chicago, created team sports competitions. In 1902, New York City organized the Inter-Settlement Athletic Association to promote team sports among immigrant boys. In 1906, settlement house leaders and other progressive reformers founded the Playground Association of America. Luther Gulick served as the first president of this new organization, and Jane Addams was the first vice president. By 1917 the playground and parks movement had managed to create recreational sports programs in 504 American cities. A park and playground building frenzy had altered the nation’s urban landscape and created a host of new sites for teams of American boys to play their games. By the 1920s team sports were firmly embedded in the lives of American boys. They had been planted in schools, in municipal recreation departments, and in the new parks and playgrounds. In addition, the ideas of “muscular Christianity” that had originally animated the American quest to inspire boys to play team sports had been supplanted by new scientific theories that argued that team sports were crucial instruments in the development of boys from childhood through adolescence into adulthood. The new scientific theories, added to the older religious endorsement of competitive sports, made athletic teams a permanent feature of American society. The new scientific ideas also confirmed the common belief that team sports for American boys needed adult supervision. Throughout the twentieth century, self-organized “sandlot” baseball and “pickup” basketball games declined, while adult-controlled games grew. By the last three decades of
the twentieth century, self-organized team games had become a rarity. Adult-managed sport had come to dominate the recreational lives of American boys. Beginning in the 1920s, a decade labeled by contemporaries as “the golden age of sport,” comprehensive programs of interscholastic sports spread throughout the American school system. Elementary and secondary schools developed strong athletic programs for boys centered on team sports—especially baseball, basketball, football, and track and field. Sociologists Robert and Helen Lynd, in their landmark 1929 study of an “average” American town, were amazed by the importance of high school basketball in American society. The Lynds’ “Middletown” was actually Muncie, Indiana, and they reported that basketball had become the dominant pastime in middle America. Other regions of the United States also developed intense passions for high school sports. High school football preoccupied many communities, especially in Texas and Pennsylvania. High school baseball became very popular in the growing sunbelt regions of the American South and Southwest. High school sports teams became central to community life and shaped the identities of villages, towns, and urban neighborhoods. Younger boys were trained for future roles on high school teams by elementary school athletic programs and by a growing number of extrascholastic sports leagues. The Pop Warner football league for young boys, named after the famous early-twentieth-century football coach Glenn “Pop” Warner, began in 1929. In 1939 Little League Baseball was created by lumber company employee Carl Stotz in Williamsport, Pennsylvania. Both leagues modeled themselves after professional sporting practices, employing
Teams ploying uniforms, marked playing fields, player drafts, and other features derived from the pros. Dominated by adult coaches and officials, Little League and Pop Warner football imitated professional sports by focusing on winning and elite competition rather than recreation and sport for all. Both leagues struggled until after World War II, when they took off rapidly. By 1970 more than a million players annually played Pop Warner football. Pop Warner “bowl games” appeared in several American cities, including a “Junior Orange Bowl” in Miami. By the late 1950s nearly a million American boys in 47 states and 22 foreign nations were playing on 19,500 Little League teams. In 1963 the Little League World Series appeared for the first time on television. It soon became a summer staple. By 1990 Little League baseball had 2.5 million players in 16,000 different leagues that spread across all 50 U.S. states and 40 foreign countries. The success of Little League baseball and Pop Warner football inspired the creation of many other team sports leagues in a wide variety of sports. Basketball leagues drew huge numbers of players. Beginning in the 1970s, youth soccer took off, eventually drawing more participants than the baseball, football, or basketball programs. The National Alliance for Youth Sports, founded in 1981, sought to regulate youth sports for the more than 20 million participants that it estimated participated annually in out-ofschool sports programs. Millions of more boys participated in school-based sports. The rise of elite teams of young athletes raised concerns in American society. Since the late 1800s, critics of competitive interscholastic and extrascholastic sports have raised questions about the
benefits of these programs for American boys. They worried that these programs neglected the vast majority of boys and nurtured only the best athletes. The focus on elite competition left too many boys off teams. Many professional physical educators promoted sports programs focused on recreation and enjoyment that catered to all boys to replace the professionalized models that were limited to only the very best players. Critics also noted that the win-at-any-cost mentality spawned by elite teams undercut the values that participation in team sports was supposed to teach American boys. In their estimation, fair play, honor, honesty, and other virtues were trampled by blind devotion to victory. From the late 1800s to the present, many observers have also condemned the behavior of adults in youth sports programs. They point to the unsettling and recurrent history of uncivil and even violent outbursts by coaches and parents during games. Assaults and even murders of adult officials, coaches, and fans by other adults have plagued youth sports. Incidents of physical, emotional, and sexual abuse of children by adult coaches have tarnished leagues. Critics note that school sports have become so important that teachers are sometimes pressured to give academic favors to athletes. They observe that youth sports have become so significant in the lives of many American communities that boys who do not “make the team” feel excluded from society, whereas boys who do “make the team” face enormous pressures to perform at extraordinary levels. Exposés of abuses in youth sports have appeared regularly in the media since the 1920s. A variety of reform groups and local, state, and national governments have sought to curb abuses. Some reformers have even
suggested returning control of team sports from adults back to children. As the twenty-first century begins, the chorus of criticism continues. The debate over the structure and function of modern team sports for boys has not yet retarded the growth of youth leagues and school sports in the United States. As they have for more than a century, most American adults continue to believe that team sports are one of the most important parts of boys’ lives. Although they feel uneasy with some of the recurring abuses appearing in sports programs, they are still committed to the idea that team sports make boys fit for modern society. Boys continue to flock to team games for their own reasons. Team sports remain a central feature of American boyhood.

Mark Dyreson

See also Baseball; Basketball; Football; Games; Muscular Christianity; Native American Boys; Young Men’s Christian Association

References and further reading
Bissinger, H. G. 1990. Friday Night Lights: A Town, a Team, a Dream. Reading, MA: Addison-Wesley.
Boyer, Paul. 1978. Urban Masses and Moral Order in America, 1820–1920. Cambridge: Harvard University Press.
Cavallo, Dominick. 1981. Muscles and Morals: Organized Playgrounds and Urban Reform, 1880–1920. Philadelphia: University of Pennsylvania Press.
Dyreson, Mark. 1998. Making the American Team: Sport, Culture, and the Olympic Experience. Urbana: University of Illinois Press.
Fine, Gary Alan. 1987. With the Boys: Little League Baseball and Preadolescent Culture. Chicago: University of Chicago Press.
Goodman, Cary. 1979. Choosing Sides: Playground and Street Life on the Lower East Side. New York: Schocken Books.
Hardy, Stephen. 1983. How Boston Played: Sport, Recreation and Community, 1865–1915. Boston: Northeastern University Press.
Kett, Joseph F. 1977. Rites of Passage: Adolescence in America: 1790 to the Present. New York: Basic Books.
Lynd, Robert S., and Helen Merrell Lynd. 1929. Middletown: A Study in Contemporary American Culture. New York: Harcourt Brace.
Macleod, David I. 1983. Building Character in the American Boy: The Boy Scouts, YMCA, and Their Forerunners, 1870–1920. Madison: University of Wisconsin Press.
———. 1998. The Age of the Child: Children in America, 1890–1920. New York: Twayne.
Nasaw, David. 1985. Children of the City: At Work and at Play. New York: Doubleday.
Oriard, Michael. 1982. Dreaming of Heroes: American Sports Fiction, 1860–1980. Chicago: Nelson-Hall.
Rader, Benjamin. 1999. American Sports: From the Age of Folk Games to the Age of Televised Sports. 4th ed. Upper Saddle River, NJ: Prentice-Hall.
Riess, Steven A. 1989. City Games: The Evolution of American Urban Society and the Rise of Sports. Urbana: University of Illinois Press.
———. 1995. Sport in Industrial America, 1850–1920. Wheeling, IL: Harlan Davidson.
Yablonsky, Lewis, and Jonathan Brower. 1979. The Little League Game: How Kids, Coaches, and Parents Really Play It. New York: New York Times Books.
Teasing
See Bullying
Television: Cartoons

Animation, which is traditionally associated with child audiences, has played an important role in communicating implicit gender messages to American boys and girls in the twentieth century—particularly when broadcast via “the electronic babysitter,” television. Although
animated films were assumed to hold appeal for both adults and children in the first half of the century, by the end of World War II, younger audiences were conceived as the main consumers of cartoons.

Rocky, Bullwinkle, and friends, 1960 (Photofest)

Coinciding with the advent of daily broadcast programming across the United States, animation on television quickly became targeted exclusively at children. As the “science” of target market research developed throughout the latter half of the century, specific cartoons increasingly narrowed their sought-for audience. Consequently, television cartoons have often divided child viewers according to gender, helping to exemplify and regulate social concepts of
masculinity and femininity to boys and girls. Over the years, animation (and “children’s programming” in general) has had to deal with changing social ideas of what was appropriate for children to watch. Often, though not always, these changes included gender concepts that would affect how animation spoke to and about boys. Quite early in television’s history, animation became a programming staple. Stations quickly began broadcasting shorts originally produced for the theatrical screen by major studios like Warner Brothers or Disney. Yet original animation (made directly for television) began as early as 1948 with the premiere of
Crusader Rabbit (1948–1951). Animation has always attracted viewers from all age groups, but producers, network executives, and (most important) advertisers consistently conceptualized animation as appealing primarily to children. Consequently, much of television animation has been aimed at younger audiences in choices of characters, plotlines, types of humor, and visual design. The development of a “limited animation” style for made-for-TV cartoons, necessitated by the quick turnaround times for these programs as well as the low returns expected from them, was considered allowable because television executives thought that children would not mind the lack of production detail common to theatrical animation. Most early original TV animation aimed at children ostensibly tried to attract both boys and girls indiscriminately. Cartoons produced by Jay Ward (Crusader Rabbit; Rocky and Bullwinkle, 1959–1961) and the production house of Hanna-Barbera (Huckleberry Hound, 1958–1962; Yogi Bear, 1961–1963) did not overtly play up gender roles. Yet all the heads of the animation studios and most of the animators themselves were male, and whether by design or not, practically all of the major cartoon protagonists were male. This predominance subtly establishes the “natural” dominance of men in American society by representing “maleness” as a sort of default category. Although being male is considered ordinary and unexceptional in these cartoons, the few female characters appearing in individual episodes conspicuously display difference and exception from the norm. This is similar to the creation of a “woman’s film” genre in Hollywood, which ghettoizes one circle of films and silently assumes that all other “normal” films are “men’s films.” This concept
continually reappears in animated programs to the present day. The Smurfs (1981–1990), for example, differentiates its large cast of characters by giving them easy-to-recognize personality traits: Brainy, Grouchy, and so on. Smurfette’s recognizable personality trait is that she’s the only female, thus marking being female as different, but the male gender of the other Smurfs goes unnoticed. In 1961, Newton Minow, the newly appointed chairman of the Federal Communications Commission (FCC), famously announced to the annual meeting of the National Association of Broadcasters (NAB) that he considered commercial broadcast television to be a “vast wasteland.” Minow included cartoons in his list of what he considered mind-numbing and unseemly programming, particularly in their relation to “screaming, cajoling and offending” commercials (Barnouw 1990, 300). By the 1960s, all three networks had begun programming Saturday mornings with animation and other shows aimed at children. This change was precipitated by advertisers (including toy manufacturers such as Mattel) using new methods in target research to isolate specific consumer groups, even though children had been acknowledged as a separate consumer group as early as the mid-1950s. Animated programs were consequently conceived as a vehicle to deliver young consumers to those advertisers hawking products aimed at children (particularly toys and breakfast cereals). In ensuing years, target marketing would break down populations into increasingly smaller chunks. Children themselves would be segmented into three different age groups—as well as by gender. As a result, animated programs began to reach out to more specific
Television: Cartoons get groups, the better for sponsors to advertise their products. In the late 1960s, the first sustained wave of cartoons with female protagonists appeared (Penelope Pitstop, 1969–1971; Josie and the Pussycats, 1970–1974), and other cartoons increasingly seemed aimed solely at male children. Dramatic cartoons like Johnny Quest (1964–1965) simultaneously assumed that boys would be fascinated with violent action, adventure, and technological or mechanical gadgetry and taught them that such interests would mark them as successfully masculine. At the same time, various media watchdogs raised concerns that television was imparting messages about sex and violence to young minds. Social scientists began experiments to test how much children’s behavior was influenced by watching violent or sexually explicit programs. Even though the 1970 Presidential Commission’s Report on Obscenity and Pornography announced no definite proof that media’s images of sex and violence affected children’s behavior, discussion about television’s detrimental effect on children continued apace. The rise of race activism and women’s liberation fueled further critiques. As a consequence, a number of animated series attempted to become more socially responsible, often by including a moral lesson at the end of each episode about the value of hard work, fair play, and getting along with others. Animated series during this time like Fat Albert and the Cosby Kids (1972–1984), Sesame Street (1969–present), Super Friends (1973–1985), and Schoolhouse Rock (1973–1979) saw an increase in representation of racial and ethnic minorities and some discussion of environmental concerns. Yet discussions of how gender roles were presented basically went unaddressed. Representations
of race and ethnicity in these cartoons predominantly meant representations of males, silently assuming that racial uplift meant asserting male power in these communities. Similarly, although many of these new cartoons included both male and female characters, the male characters were often associated with mechanics, sports, and outdoor activity (i.e., traditional masculine activities), and the female characters were usually associated with nurturing and trying to look pretty (i.e., traditional feminine activities). When the Reagan administration relaxed federal regulations over the television industry in the 1980s, advertisers took advantage of the situation to flood syndicated children’s programming with series that were themselves half-hour advertisements for various toy lines. More than ever before, these shows drew clear boundaries between “boys’ play” and “girls’ play.” My Little Pony (1986–1990) and Strawberry Shortcake (1982–1984) plainly aimed at indoctrinating girls into conventional femininity, and He-Man (1983–1985) and GI Joe (1985–1987) just as plainly aimed at indoctrinating boys into consuming images of hypermasculinity. The outcry by critics eventually saw the waning of this style of programming. Although these critics often pointed out the blatantly gendered messages in these cartoons, their outcry focused more specifically on the overt attempts to commercialize children’s television. Simply because certain cartoons were envisioned to appeal to a certain gender does not mean that actual boys and girls watched or reacted in the ways that animators, advertisers, or network programmers had predicted. Although most boys would have eschewed “girly” cartoons (or felt peer pressure to avoid watching them), claiming that all boys reacted this way is
too monolithic. Similarly, the appeal of such programs as GI Joe and He-Man seems to have differed from individual to individual. Predominantly, boys seemed to enjoy these cartoons as enactments of masculine power and aggression and desired to emulate these animated heroes. Yet, many homosexual men recall enjoying these shows as boys from a vastly different perspective, allowing them a secret space to begin desiring (rather than identifying with) these male figures. Television animation since the 1980s has seen a huge shift in industry structure. The rise of cable stations such as Nickelodeon and the Cartoon Network has increased the number of outlets available for viewing cartoons on television (and simultaneously eroded the perceived need for the commercial networks to keep Saturday mornings as exclusively for children). Also, partly in response to the criticisms of 1980s television animation, certain programs such as Where on Earth Is Carmen Sandiego? (1994–present) attempt to appeal to both boys and girls. Further, adult-oriented animation in prime-time programming (The Simpsons, 1989–present; South Park, 1997–present) often satirizes the clichéd gender messages presented in older TV animation. Intriguingly, the debut of such series helped spur calls for a ratings system for American television and the development of “V-chip” technology to block children’s access to “unsafe” programs. The need of advertisers to sell “girls’ toys” and “boys’ toys” still facilitates divisions in animation programming. The increasing popularity of importing Japanese animated series to American television often maintains this gender separation. Anime such as Sailor Moon (first
aired in Japan, 1992–1997) and Robotech (adapted in 1985 from Japanese series first aired in 1982–1984) often adopted conventions from Japanese comic books (manga) that are divided into shonen (male) and shojo (female) genres. From the TV series adaptation of Disney’s The Little Mermaid (1992–1994) to the New Adventures of Batman and Superman (1997– ), much of American broadcast animation continues to instill in male and female children notions of what role models they are supposed to value and how to successfully enter into adult gendered behavior. Sean Griffin See also Video Games References and further reading Barnouw, Erik. 1990. Tube of Plenty: The Evolution of American Television. 2d rev. ed. Oxford: Oxford University Press. Englehardt, Tom. 1987. “Children’s Television: The Strawberry Shortcake Strategy.” In Watching Television: A Pantheon Guide to Popular Culture. Edited by Todd Gitlin. New York: Pantheon. Kinder, Marsha. 1991. Playing with Power in Movies, Television, and Video Games. Berkeley: University of California Press. ———, ed. 1999. Kids’ Media Culture. Durham, NC: Duke University Press. Kline, Stephen. 1993. Out of the Garden: Toys and Children’s Culture in the Age of TV Marketing. London: Verso. Levi, Antonia. 1996. Samurai from Outer Space: Understanding Japanese Animation. Chicago: Carus Publishing. Palmer, Patricia. 1986. The Lively Audience: A Study of Children around the TV Set. Sydney: Allen and Unwin. Schramm, Wilbur, Jack Lyle, and Edwin Parker. 1961. Television in the Lives of Our Children. Stanford: Stanford University Press. Seiter, Ellen. 1995. Sold Separately: Parents and Children in Consumer Culture. New Brunswick, NJ: Rutgers University Press.
Woolery, George. 1983. Children’s Television: The First Twenty-Five Years. Metuchen, NJ: Scarecrow. Young, Brian M. 1990. Television Advertising and Children. Oxford: Oxford University Press.
Television: Domestic Comedy and Family Drama The family has long been the focus of contradictory impulses in American social life. On the one hand, it is imagined as a haven from the daily tensions of social, political, and economic life. On the other, it is considered a primary location for correcting current social ills through the proper upbringing of children—the adults of the future. From their beginnings, television networks have struggled with this contradiction, attempting to appear modern and socially relevant without alienating the family audiences so important to their advertisers, and social critics have alternately praised the medium as a means for bringing families together or blamed it for the decline of the institution (cf. Murray 1990). Boys have loomed large in this delicate negotiation, representing the fragile innocence of childhood, the main force of modernization in the home, and the nation’s social future. Prime-time television’s representations of boys in family life, then, have not so much reflected the reality of any given moment as they have the hopes, fears, and ideals associated with boyhood and the family. Because the early producers of live television dramas often tackled difficult issues of the day, the 1950s are often called the golden age of television. In live dramas such as Marty (1953) and The Days of Wine and Roses (1958) and in long-running series such as Mama (1949–1957),
families struggled to find their place in the rapidly changing postwar American landscape. These dramas offered narratives of upward mobility and assimilation into a greatly expanded middle class and depicted families as struggling to negotiate intergenerational conflicts between the traditional extended family and the new suburban nuclear family. Grown sons faced the difficult task of leaving behind strong ethnic and neighborhood affiliations, and aging parents struggled to adjust to a modern world in which they represented a past being left behind (Lipsitz 1990). In these dramas of assimilation, the transition from immigrant roots to mainstream American culture was often refigured as an Oedipal struggle in which a grown son had to overcome his desire to care for his mother or had to demonstrate the superiority of his new life to his hidebound father. Even though chronologically an adult, the son did not become a man until he had severed his connections to his parents and the old world. Because their embrace of controversy made them unpopular with networks eager to attract advertisers, by the end of the 1950s these live dramas were gradually overshadowed by situation comedies, which also dealt with issues of upward mobility and modernization but through laughter rather than pathos (Boddy 1990). In sitcoms such as The Life of Riley (1949–1950, 1953–1958) and The Honeymooners (1955–1956), buffoonish, working-class husbands desperately tried to maintain their failing authority over wives and children (Spigel 1992).
A scene from a 1953 episode of The Life of Riley (Photofest)
On The Life of Riley, father Chester was head of the family in name only, with son Junior often correcting him on matters of domestic technology and current language and mores. In the more upscale world of The Adventures of Ozzie and Harriet (1952–1966), sons Ricky and David also regularly corrected father Ozzie’s misperception of himself as “with it.” In the extremely popular show I Love Lucy (1951–1957), Lucy Ricardo constantly schemed to break out of her role as housewife and into a career in entertainment, whereas her bandleader husband Ricky fought a losing battle to maintain a patriarchal authority based in his Cuban roots, and the program provided a fascinating intersection of issues of ethnicity and gender (Desjardins 1999). Following Lucille Ball’s on-air pregnancy and the birth of Desi Arnaz, Jr., “Little Ricky” grew up on television as a miniature version of his father, often joining Ricky on the bandstand from which Lucy was repeatedly barred and presenting the image
of an unbroken line of patriarchal succession from father to son. As the 1950s progressed, the networks played on the popularity of these television families by offering progressively blander, more suburban versions of boyhood and family life such as Leave It to Beaver (1957–1963) and The Dick Van Dyke Show (1961–1966). In these situation comedies, fathers, though quirky, were more competent than their predecessors, wives and children were docile and pleasant, and their homes were models of suburban decorum. Fathers and sons shared an uneasy intimacy in which Dad attempted to pass on wisdom from his youth, while his son patiently sorted out the archaic advice from the useful. When fathers attempted to apply outmoded standards of discipline, TV sons turned to their mothers to bring Dad up-to-date. Throughout the decade, African Americans were relegated to roles as servants or appeared as entertainers in evening variety programming. With the exception of Desi Arnaz, Latinos were largely absent from prime time, as were Asian Americans. In an era in which men of color were still often referred to as “boys,” for children of color the father-son relationship was not a representational reality, and television offered a social landscape from which they and their families were largely absent. Generally speaking, the television family grew whiter and more affluent as the 1950s progressed, moving from environments marked by class and ethnicity to more homogeneous suburban surroundings. By the middle of the 1960s, the bland family of late 1950s prime time was infiltrated by unruly members whose differences had to be contained or disguised. On Bewitched (1964–1972), advertising executive husband Darrin made futile
demands on his wife, Samantha, not to practice witchcraft. When both daughter Tabitha and son Adam developed supernatural powers, Darrin became even more disempowered. On The Munsters (1964–1966), a family of ghouls tried unsuccessfully to adjust to their suburban surroundings, and Frankenstein-father Herman was often at a loss as to how to counsel his werewolf son, Eddie, on how to fit in. The characters on The Addams Family (1964–1966), however, lived to confound their straight-laced neighbors, and parents Morticia and Gomez reveled in their overweight and sadistic son Pugsley’s habit of tormenting his peers. Although the fathers in these programs continued to be bumbling and ineffectual, situation comedies such as these simultaneously depicted the family as a primary location for society’s repression of a boy’s individuality and as the place where boys might effectively mount an assault on those constraints (Spigel 1991). Other 1960s popular prime-time depictions of family life tended to mix genres, as in Bonanza (1959–1973), a western centered on the family of widower Ben Cartwright, and My Three Sons (1960–1972), which blended comedy and drama in a family also lacking a mother figure. Family Affair (1966–1971), which featured a bachelor engineer raising his orphaned nieces and nephew, and the short-lived Courtship of Eddie’s Father (1969–1972) also blended comedy and drama around men as single parents. These programs offered a corrective to popular 1950s fears of the father emasculated by his wife’s power as manager of the domestic economy and primary parental authority in the life of their children. As such, they played upon calls by childrearing professionals for fathers to
take a more active role in their sons’ lives and upon popular concerns that a father’s increased domestic presence might feminize him and, by extension, his son. The plots of these programs often revolved around efforts by the father to occupy the unfamiliar role of primary caregiver and to understand how he could negotiate that traditionally feminine role from a masculine perspective. The grown or growing sons of these single fathers, in turn, performed a more sensitive version of masculinity, offering the emotional support that their absent mothers would have provided. These relationships depicted both fathers and sons as empowered by their efforts to understand and support each other rather than estranged by that familiarity as they often had been in the 1950s. By the end of the 1960s and the rise of the women’s movement, single mothers became more visible on prime-time television. In 1968, Diahann Carroll starred in the first dramatic lead for a black woman, as Julia (1968–1971), a single mother and nurse raising a son. Two years later, Shirley Jones played a stage mother and performer in the persistently popular The Partridge Family (1970–1974), a situation comedy about a traveling family of pop singers in which the responsibility of raising three sons and two daughters was shared by the mother and the band’s bachelor manager. On Alice (1976–1985), a single mother replaced her dreams of becoming a country singing star with a job as a truck-stop waitress in order to raise her son. Each of these programs reversed the problem of single-father comedies and dramas, placing the focus on the needs of the female lead to maintain a strong relationship with her child(ren) while providing for them. The sons of these single mothers often attempted to
provide the emotional support of an absent husband or lover, and these plot points were usually resolved by the mother acknowledging her son’s effort while restoring his right to be a child. Both types of single-parent programs, however, raised concerns about a boy’s ability to develop a stable gender identity when faced with the absence of one parent and addressed them through a mix of comedy and drama. Prime-time representations of the family in the 1960s questioned straight-laced late-1950s domestic configurations in order to tap into tensions created by the increasing economic and social power of women and children. Television in the 1970s, however, seemed to reprise the early 1950s model in which the family became the site of intergenerational struggles over ethnicity, race, class, and gender. Particularly in the situation comedies of Norman Lear, taped before live audiences, the American family appeared as a battleground over identity. On The Jeffersons (1975–1985), a bigoted black man, George Jefferson, constantly bridled at his son’s marriage to the daughter of a mixed-race couple and at his wife’s refusal to join him in a blanket condemnation of all white people. Sanford and Son (1972–1977) offered intergenerational conflict in the form of an elderly black junk dealer whose stereotypical ghetto masculinity was undermined by his educated adult son’s demands that he conform to more acceptable middle-class norms of behavior. Good Times (1974–1979) portrayed a working-class, two-income black family’s struggles to stay together and get ahead. On Good Times, the teenage son, J.J., was an aspiring painter who celebrated ghetto smarts and street language, often to the frustration of his hardworking parents. Esther Rolle and John Amos,
who portrayed the parents, objected to the show’s increasing emphasis on J.J., arguing that his character encouraged negative stereotyping of young black men; both quit the show at different points over this concern (MacDonald 1983). Although these programs were of extremely high quality and critically acclaimed, their white writers, directors, and producers often condescended to their subjects, suggesting that working-class and poor whites and blacks offered object lessons in outdated behavior and thinking or examples of noble humanity to an implicitly white and middle-class audience (cf. Barker 2000). In particular, they offered few examples of stable and positive family life in which sons gained from meaningful and mutually respectful relationships with their parents. At the same time as some prime-time families struggled to resolve larger social tensions around integration, identity, and the reconfiguration of the nuclear family, other programs sought to remedy this dissolution via nostalgic representations of childhood innocence and stable and separate gender identities. The Brady Bunch (1969–1974) presented a blended family in which three boys and three girls faced the problems of suburban adolescence with minimal guidance from their loving parents. The program’s recurring narrative theme revolved around petty hostilities between the boys and the girls, keeping the two genders in clearly defined camps. Happy Days (1974–1984) returned to a mythic 1950s in which the weightier issues of the 1960s and 1970s had not yet happened. The Great Depression formed the backdrop for The Waltons (1972–1981), a tale of a poor rural family whose intimacy and love allowed them to face hardship as a unit.
The dysfunctional Bundy family, from Married . . . with Children (Photofest)
The ongoing story in both Happy Days and The Waltons focused on the eldest son, replacing models of Oedipal conflict with warm, sentimental cooperation and depicting the male passage into adulthood as a gradual attainment of increased rights and responsibility and the willing support of parental authority. Reflecting the demographic power of late baby-boom adolescents, each of these series emphasized sibling relationships and represented parents as competent and available but not central to family life. In the 1980s, savvy audiences inured to these two versions of the family found ironic pleasure in the absurdity of prime-time family melodramas, beginning with the openly satirical Soap (1977–1981) and continuing with Dallas (1978–1991), Dynasty (1981–1989), and Falcon Crest (1981–1990). Like daytime soaps, these programs inverted the idea of the family as a location for social stability and safety, treating it instead as a convenient
vehicle for petty vengeance and self-aggrandizement, the vulnerable point in an individual’s social and personal armor. The sons of these imaginary American aristocrats connived with equal enthusiasm against parents and siblings as the programs converted hostility against the upper class and unease with the family as an economic unit into a cynical inversion of the ideal American family. By the end of the decade this representational trend spawned situation comedies—such as Married . . . with Children (1987–1997), The Simpsons (1989–present), and Roseanne (1988–1997)—that treated the family as a parodic nightmare growing out of the American dream (cf. Kerwin 1994). Each of these programs featured playful gender antagonism as a central narrative feature, with sons and fathers mounting a mutual and usually feeble defense against the aggressive and often more sensible feminism of female family members.
These shows featured working-class families indifferent or hostile to the middle-class values inherent in that dream, living in a world far more ideologically and socially complex than the depictions of stable suburban family life that continued in programs such as Family Ties (1982–1989), Growing Pains (1985–1992), and The Wonder Years (1988–1993). Drawing on popular discourses of the dysfunctional family—in which ideals of proper family life were mobilized to repress the needs of individual family members—these programs reveled in their impropriety and in the human frailty of their characters. In many ways, these domestic parodies represented a rebellion against the enormous social and ideological burden that television families had been made to carry—particularly in their tacit acceptance of sons and fathers as more significantly connected to the larger world of public social life. By the early 1990s, leaders of the Christian right declared these extremely popular dysfunctional prime-time families a threat to “family values,” with the resulting brouhaha culminating in Vice President Dan Quayle’s famous 1992 denunciation of fictional character Murphy Brown’s choice to have a child out of wedlock. During the 1980s and 1990s, The Cosby Show (1984–1992) provided a counterexample to the stereotype of black family life in general and father-son relationships in particular as fragmented and economically and socially impoverished. The Huxtable family featured two professional, middle-class parents, respectful children, and an emotionally rich home life. Taking his cue from his parents, the Huxtables’ teenage son, Theo, treated his sisters with warmth and respect (within the bounds of adolescence) and received the same in kind. Simultaneously praised for offering positive role
models and criticized as unrealistic, the program bore the weight of the lack of widespread representations of black family life on television. (Asian American and Latino families continue to be conspicuously absent in prime time.) By the mid-1990s and the birth of alternative networks such as United Paramount Network (UPN) and Warner Brothers (WB), programs such as Sister, Sister (1994–present) and Moesha (1996–present) also offered models of stable black family life. Unlike The Cosby Show, however, these programs targeted the profitable teen market, with plots centered on female adolescent characters, and parents appeared on the sidelines to offer moral guidance and provide narrative frames for their children’s experience. Through the 1990s and into the twenty-first century, prime-time representations of boyhood and family life have continued to emphasize relationships between adolescents and children while marginalizing parents. On popular programs such as Beverly Hills 90210 (1990–2000), Dawson’s Creek (1998–present), and Buffy the Vampire Slayer (1997–present), parents, although loving and supportive, either rarely appear or do not fully understand what their children are doing or feeling. Although the gender relations in these programs are relatively traditional, in each the boys and young men must attempt to deal with the girls and young women in their lives as equals. In the logic of consumer demographics, however, these programs are aimed primarily at teenage girls, with plots that balance action with emotional relationships and personal development. Teenage boys are considered to prefer the violence and sexism of action programming. Yet these programs enjoy significant demographic crossover in terms of
both age and gender, indicating, perhaps, that although the nuclear family of prime-time television, either as an ideal or as a problem, no longer resonates with boys and young men, nontraditional “families” of friends do. Generally, television has always struggled to negotiate tensions between the ideal and the real, and the television family has been a prime location for those negotiations. From television’s earliest days, a widespread popular belief that the medium affects the behaviors and life choices of children, particularly of boys, has endured. (Consider the outcry after the 1999–2000 spate of schoolyard assaults, most of which were committed by teenage boys.) There is no solid evidence to support this belief, but its persistence points to the incredible expectations we hold for the medium and its representations. Although we can always hope that television will play a role in helping boys (and girls) become the adults we would wish them to be, it is more reasonable to expect that television can show us how we have imagined boyhood and family life at any given historical moment, offering us a richer understanding of the place of boys in our social life. Nicholas Sammond References and further reading Ang, Ien, and Joke Hermes. 1991. “Gender and/in Media Consumption.” In Mass Media and Society. Edited by James Curran and Michael Gurevitch. New York: Routledge. Barker, David. 2000. “Television Production Techniques as Communication.” In Television: The Critical View. 6th ed. Edited by Horace Newcomb. New York: Oxford University Press. Barnouw, Erik. 1978. The Sponsor: Notes on a Modern Potentate. New York: Oxford University Press.
Boddy, William. 1990. Fifties Television: The Industry and Its Critics. Urbana: University of Illinois Press. Desjardins, Mary. 1999. “Lucy and Desi: Sexuality, Ethnicity, and TV’s First Family.” In Television, History, and American Culture: Feminist Critical Essays. Edited by Mary Beth Haralovich and Lauren Rabinovitz. Durham, NC: Duke University Press. Haralovich, Mary Beth, and Lauren Rabinovitz, eds. 1999. Television, History, and American Culture: Feminist Critical Essays. Durham, NC: Duke University Press. Kerwin, Denise. 1994. “Ambivalent Pleasure from Married . . . with Children.” In Television: The Critical View. 5th ed. Edited by Horace Newcomb. New York: Oxford University Press. Lipsitz, George. 1990. Time Passages: Collective Memory and American Popular Culture. Minneapolis: University of Minnesota Press. MacDonald, J. Fred. 1983. Blacks and White TV: African Americans in Television since 1948. Chicago: Nelson Hall. Murray, Michael D. 1990. “A Real Life Family in Prime Time.” In Television and the American Family. Edited by Jennings Bryant. Hillsdale, NJ: Lawrence Erlbaum Associates. Newcomb, Horace, ed. 2000. Television: The Critical View. 6th ed. New York: Oxford University Press. Spigel, Lynn. 1991. “From Domestic Space to Outer Space: The 1960s Fantastic Family Sitcom.” In Close Encounters: Film, Feminism, and Science Fiction. Edited by Constance Penley, Elisabeth Lyon, Lynn Spigel, and Janet Bergstrom. Minneapolis: University of Minnesota Press. ———. 1992. Make Room for TV: Television and the Family Ideal in Postwar America. Chicago: University of Chicago Press.
Television: Race and Ethnicity To discuss race and ethnicity in television in terms of boyhood requires dealing with two central issues, representation and identification.
A scene from Good Times, 1976: J.J. (Jimmie Walker, left) talks to his younger brother Michael (Ralph Carter, right) (Photofest)
The question of racial and ethnic representation on television is of ongoing concern in American society. As recently as 1999, complaints were filed against the industry, and boycotts were organized by the National Association for the Advancement of Colored People (NAACP) in an attempt to address the lack of, or the quality of, representations of racial and ethnic groups on television as a whole and on network television in particular. Since the advent and widespread popularity of network television in the late 1940s and 1950s, the impact of the medium on the American family has been explored by sociologists, psychologists, media scholars, and politicians. It is indisputable that television is one of the most influential and widespread media formats. Acknowledging
television’s ability to participate in the shaping of identity as well as its role in providing information on the world around us, the medium’s ability to address the needs and concerns of racial and ethnic groups has been called into question. Throughout the history of television, white American boys had televisual representations with which they could identify, from Timmy in Lassie (CBS, 1954–1971) to Beaver Cleaver of Leave It to Beaver (CBS/ABC, 1957–1963) to even Bart of The Simpsons (Fox, 1989– ). In contrast, the possibilities for racial representation of, or racial identification for, boys of other groups were limited by the medium’s choice of representation and America’s troubled racial past and contentious present. At its inception, television was perceived by many in the black community as a possible impartial space for African American representation in the media. African American magazines such as Ebony reported such beliefs in their pages. As early as 1951, network television shows also espoused a message of tolerance. The Texaco Star Theater presented a musical revue called “The United Nations of Show Business,” hosted by Danny Thomas and Milton Berle. The show suggested that prejudice could not exist on television and that there was room for anyone regardless of race or ethnicity. Although network television executives openly promised that television productions would not be biased, their rhetoric contrasted with the reality, in which minority participation was minimal at best and was limited to very specified roles, such as that of musical entertainer. The major network series that included black casts, Beulah (1950–1953) and Amos ’n’ Andy (1951–1953), created controversy
Television: Race and Ethnicity troversy because they were based on stereotypes of African Americans. Although oral histories reveal that the shows were enjoyed by segments of the black community, there was also significant criticism of them. This type of atmosphere within the industry determined the viewing position of young boys of minority backgrounds. Ensconced in a televisual white world, minority boys would have watched and enjoyed mainstream entertainment programming. However, many oral histories have reported the excitement that existed within black households when an African American appeared on television in a guest appearance or actually had a show created for him or her, such as the short-lived Nat King Cole Show (1956– 1957). In terms of television, as in many other aspects of life, members of many minority communities within the United States identified first with issues of race and ethnicity before turning to those of age. A brief look at some aspects of American television in terms of racial programming will clarify this assertion. African Americans exploded onto the television screen in the late 1950s and 1960s in the realm of news, news specials, and documentaries. The United States was involved in more than a decade of social unrest, the culmination of generations of racial oppression. The harsh reactions to the civil rights movement’s tactics of nonviolent protest were covered by the media. Urban uprisings were at a high point in the late 1960s and early 1970s. In the 1960s, all three major networks addressed the country’s racial upheaval. For example, in 1968, American Broadcasting Companies (ABC) television produced Time for Americans, National Broadcasting Company (NBC) produced What’s Happening to America,
and Columbia Broadcasting System (CBS) brought out Of Black America, all in-depth series on racial issues. However, television still remained more accepting of moderate leaders such as Martin Luther King, Jr., and Ralph Abernathy while producing such documentaries as Mike Wallace’s five-part 1959 series The Hate That Hate Produced, which discussed the so-called Negro racism of the Nation of Islam, or the “black Muslims,” whose concepts of separatism, revolution, black control, and self-defense had begun to resonate with large segments of the African American population. Television shows such as Black Journal (1968–1976) produced news by and for the black community and often provided a viewpoint opposite to those of mainstream news networks. Here, African American boys could truly find images of themselves and news topics of interest to them. Black Journal gave voice to the young black male leaders disparaged by mainstream media, such as Huey P. Newton and Bobby Seale, and featured figures and organizations that appealed to black youth. The program also often interviewed youth working within the community and overtly encouraged youth participation in political struggles. Outside news programming, race would not come to the forefront of television until the 1970s, and the show that catapulted black youth culture onto U.S. television screens was Soul Train (1970– ). The music, performances, and dancers of Soul Train have been a part of African American culture and the American television landscape since 1970. Don Cornelius invested his own money to create a pilot for Soul Train, which he then took to the merchandising manager for the five Sears Roebuck stores located in Chicago’s inner-city community. With
Sears as a sponsor and an agreement from WCIU-TV, Soul Train premiered in Chicago on August 17, 1970. In October 1971, Soul Train made its debut in eight new urban markets and was very successful. The show caught on quickly and became the benchmark for style and hipness in 1970s American society. As the show’s tag line suggested, it was “the hippest trip in America.” The young dancers in the Soul Train line were the stars of the show. All high school and college students, they sported naturals and other contemporary hairstyles, and were dressed in the hip accouterments of the day: bell bottoms, broad-collared shirts, and wide belts. They set the clothing and dance styles for the American public. The Soul Train dancers primarily performed to the music of contemporary black artists. The list of guest appearances on Soul Train reads like a Who’s Who of black artists of the 1970s, many of whom have cultural currency today: James Brown, Curtis Mayfield, B. B. King, the Temptations, Stevie Wonder, Aretha Franklin, the Jackson Five, and Marvin Gaye. Soul Train remains the longest-running syndicated show on television. In the 1970s in general, images of African Americans and members of some other racial and ethnic groups increased in entertainment network television, particularly in situation comedies, as shows such as Sanford and Son (NBC, 1972–1977), The Jeffersons (CBS, 1975–1985), and Chico and the Man (NBC, 1974–1978) premiered on network television. However, it was Good Times (CBS, 1974–1979), Welcome Back, Kotter (ABC, 1975–1979), and What’s Happening!! (ABC, 1976–1979) that specifically targeted the young racial and ethnic male. Good Times, for instance, was the first
sitcom to have a black nuclear family. The story focused on the father, James (John Amos), who worked long hours for little pay or was out of work and looking for a job; Florida (Esther Rolle), the mother who was struggling to make ends meet and keep her family together; and Thelma (BernNadette Stanis), the intelligent daughter who was determined to succeed. The young male characters in the show, J.J., or James Junior (Jimmie Walker), the eldest son, and Michael (Ralph Carter), the youngest son, represented different sides of urban black life. J.J. was the artist, a self-proclaimed ladies’ man, and the stereotypical comic relief for the show. Michael was the voice of militancy who spoke the rhetoric of the revolution and was often called the “militant midget” by other characters. Most of the story lines surrounding Michael’s character dealt with situations that many urban and minority youth faced. For example, in “IQ Test” (1974), Michael deliberately fails the intelligence quotient (IQ) test because he believes that such tests are biased. “Michael the Warlord” (1976) addressed the youngest son’s involvement with a street gang. ABC created the popular multiethnic youth sitcom Welcome Back, Kotter, with such characters as John Travolta’s Vinnie Barbarino, Lawrence Hilton-Jacobs’s Freddie Boom-Boom Washington, Robert Hegyes’s Juan Epstein, and Ron Palillo’s Arnold Horshack as the Sweathogs, a classroom of students at James Buchanan High School in Brooklyn. ABC concurrently developed What’s Happening!!, which was set on the West Coast and followed a group of Los Angeles high school students. The show was based on the 1975 film Cooley High and featured actors Ernest Thomas as Roger Thomas, and
Television: Race and Ethnicity Haywood Nelson, Jr., and Fred Berry as his best friends, Dwayne and Rerun. Black life also broke into Saturday morning animation in the 1970s with such shows as The Jackson Five (ABC, 1971–1973), based on the famous Motown all-boy singing group of the same name; I Am the Greatest (NBC, 1977–1978), based on the adventures of Muhammad Ali; and the show that perhaps had the largest impact, Fat Albert and the Cosby Kids (CBS, 1972–1984). Fat Albert was hosted by comedian Bill Cosby and featured the characters Fat Albert, Mush Mouth, Weird Harold, and Donald. Based on Cosby’s boyhood friends, each episode usually incorporated a moral lesson. The 1970s proved to be a high point for ethnic and racial television programming, particularly as it addressed boyhood. In the 1980s a backlash began that led to an almost complete disappearance of racial and ethnic characters from network television. Young minority male characters, in particular, were seemingly removed from their typical environments and, when featured on television, were placed within the care of the all-white world. Examples of this include Diff’rent Strokes (NBC/ABC, 1978–1985) and Webster (ABC, 1983–1987). Each of these shows featured young black boys who were adopted into all-white middle- and upper-class families after the death of their parents. One of the only shows with a black cast to gain and sustain popularity during the 1980s was another production of Bill Cosby, The Cosby Show (NBC, 1984–1992). Again using his own life as a basis for his comedy, Cosby took Americans through the growth and development of his televisual family, which included his son Theo, portrayed by Malcolm Jamal-Warner, who aged from
preadolescence to young adulthood during the run of the program. Theo would deal with life issues ranging from growing up in a family of girl siblings to coping with dyslexia, attending college, living away from home, and choosing a job. Because of the longevity of the show, it was one of the most complete examples of the development of the young black male in the realm of television. With the advent of cable and growth in the number of available channels, the opportunities for niche marketing increased, and new cable channels such as Black Entertainment Television (BET) were the result. Stations such as Nickelodeon and the Disney Channel, which targeted youth audiences, also created shows that were inclusive of the ethnic and racial population increasingly visible in American society. However, in the late 1980s and 1990s, the televisual forum that produced the greatest impact on youth in general, with a particular impression on young boys of racial and ethnic backgrounds, had a musical format. Music Television (MTV) created a space that specifically addressed youth culture and the role of both boys and girls as producers and consumers of contemporary rhythm and blues, pop, rap, and hip-hop. MTV did not specifically carry rap music until the premiere of YO! MTV Raps in 1988, when it realized that it could no longer ignore this widespread art form. Since then, because of the popularity of rap music and hip-hop as a whole, the music videos of so-called minority artists, who are in the majority on MTV, have flooded the station in regular rotation and in the new programs that incorporate their music, such as The Lyricist’s Lounge and Total Request Live (TRL).
The family of The Cosby Show (Photofest)
The impact of hip-hop as visualized through MTV can be seen not only in young boys of diverse races but also in American culture as a whole, through the music as well as clothing, language, and style. Christine Acham See also African American Boys; Asian American Boys References and further reading Gray, Herman. 1995. Watching Race: Television and the Struggle for Blackness. Minneapolis: University of Minnesota Press. Jhally, Sut, and Justin Lewis. 1992. Enlightened Racism: The Cosby Show, Audiences and the Myth of the American Dream. Boulder, CO: Westview Press. MacDonald, J. Fred. 1992. Blacks and White TV: Afro Americans in Television since 1948. Chicago: Nelson-Hall Publishers. McNeil, Alex. 1996. Total Television. New York: Penguin.
Television: Westerns For many years, Westerns were one of the most popular genres of television shows, appealing to both children and adults and frequently sparking merchandising crazes and inspiring childhood play. The Western owes much of its popularity to hordes of male youth idolizing cowboy and outlaw heroes and fantasizing about a life of adventure and freedom on the frontier. Although Westerns were not produced only for a young male audience, writers and producers quickly recognized the importance of these fans and often took them into consideration. Consequently, the Western genre can be read as a prime example of how mass media taught American boys concepts of national identity and pride while merging those concepts with specific models of masculinity.
Girls may also have been attracted to the Western genre (the early 1950s TV series Annie Oakley serves as an example of the genre attempting to speak to young female fans), but the prevalence of male protagonists whose main emotional attachment was either to their comic sidekick or their horse heavily gendered the genre as male. During the period of their highest popularity, the late 1950s to the early 1960s, a consistent majority of the ten top-rated (most watched) TV shows in the United States were Westerns. The popularity of the form declined in the 1960s, and by most accounts the format was dead by the 1980s. Hollywood film Westerns had been a staple of American film production since the early 1900s, when they evolved out of other Western media entertainment such as the dime novel and the Wild West show. The Western usually was set during the period of the Indian wars on the Great Plains (roughly 1850–1900) and told a story about a “good bad man”—a cowboy hero who was often a loner and a rebel but who came to the aid of settlers when needed. The “forces of nature” that the white settlers often battled against frequently included Native Americans, or as they were known within the genre, “Redskins” or “Injuns.” Many critics of the Western understand the genre as a system of mediated texts that in some way attempts to “justify” the genocide of Native American people by depicting them as a cultureless hostile menace in need of eradication so that “civilization” (i.e., white Christian patriarchy) could prevail. This concept of Manifest Destiny (the alleged God-given right of white people to colonize the globe) was a dominant belief in the nineteenth century, and it lingered well into the twentieth century in some areas of the United States, especially
within the thematic meaning of the Western film genre. The filmic Western went through many periods of change and evolution before being critiqued and parodied in the 1960s and 1970s. It is no longer a very popular Hollywood genre (perhaps because of its inherent racism), but according to some sources, one-fifth of all Hollywood films produced before 1960 could be considered Westerns. The TV Western drew its main inspiration from Hollywood B-Westerns (cheaply produced and quickly made films that did not feature major stars or high budgets) as well as radio Westerns. Radio Westerns such as The Lone Ranger, which had been on the air since 1933, could be easily adapted to television with the simple addition of visual images to preexisting scripts and situations. The Lone Ranger ran on the American Broadcasting Companies (ABC) network from 1949 to 1957 and made its titular star (Clayton Moore) and sidekick buddy Tonto (Jay Silverheels) into American icons; their catch phrases “Kemo Sabe” and “Hi-Yo, Silver, away!” are still recognized in the twenty-first century. A more usual method of producing TV Westerns was the adaptation of B-Western filmmaking units into television producing units. For example, B-Western film stars Gene Autry and Roy Rogers moved from the movies to TV with ease. The Roy Rogers Show aired on NBC from 1951 to 1957, and The Gene Autry Show ran on CBS from 1950 to 1956. Autry’s Flying A Productions also produced many other Westerns during this period, including The Range Rider (in syndication, 1951–1953), Death Valley Days (in syndication, 1952–1970), Annie Oakley (in syndication, 1952–1956), Buffalo Bill Jr. (in syndication, 1955), and The Adventures of Champion (CBS, 1955–1956). Roy Rogers and Gene Autry were
actually singing cowboy stars who had become popular in the 1930s and 1940s via radio and the movies. Their western heroes were gentlemen cowboys, and they wooed audiences with song as well as “clean living” heroics. The vast majority of these Western TV shows were shot on film (unlike much of early live television) and could thus exploit outdoor settings and action-filled narratives. As the B-Westerns they evolved from were assumed to attract mainly “kiddie matinee” audiences, their transfer to television carried with it an emphasis on younger viewers during the first years of commercial television in the United States (roughly the late 1940s and early 1950s). The television resurgence of the career of William Boyd as Hopalong Cassidy (NBC, 1949–1951; in syndication, 1952–1954), who specifically appealed to his young fans in ads and public appearances, stands as an overt example. Many parents of the era considered TV Western heroes to be good role models for young boys. Many, like Gene Autry, publicized their own “Ten Cowboy Commandments” of proper behavior. The earliest TV cowboys were also merchandising phenomena. When Walt Disney aired a three-part telling of the legend of Davy Crockett on ABC in 1954–1955, the resultant avalanche of toy merchandising (coonskin caps, rifles, moccasins, tents, etc.) further indicated the genre’s appeal to young (predominantly male) children. Yet underneath this simple schoolyard Western playacting lay more disturbing ideological meanings: the common sentence, “The only good Injun is a dead Injun,” usually part of the game of cowboys and Indians, is as well known as “Hi-Yo, Silver, away!”
The Lone Ranger and Tonto hunt down the bad guys, 1955. (Kobal Collection)
By the mid-1950s, the kiddie matinee cowboy shows were being transformed into the so-called adult Western, with shows such as Gunsmoke (CBS, 1955–1975), Cheyenne (ABC, 1955–1963), and Bonanza (NBC, 1959–1973) going on the air for the first time. The adult Western on TV coincided with the arrival of adult Westerns on movie screens. In place of simplified cowboy action heroes, now arose more psychologically complex (and in some cases outright neurotic) Western characters. The genre’s racism was challenged in movies such as Broken Arrow (1950), which attempted to depict Native American cultures in more accurate ways; Broken Arrow also became an ABC TV show from 1956 to 1958. Generally, the TV Westerns differed from their filmic counterparts in several ways. TV Westerns tended to be less violent and not as potentially controversial as were filmic Westerns. This characteristic had as much to do with the demands of series narrative as it did with television censorship codes: the need for recurring characters made it very difficult to kill them off as easily as in the movies. Many TV Westerns eventually became more like domestic or community melodramas than filmic Westerns. Gunsmoke, for example, reconstitutes a family unit with a mother-figure (Miss Kitty), a father-figure (Marshal Matt Dillon), a brother-figure (Doc), and various son-figures. Bonanza was directly structured around an all-male family and sought to be everything to all audiences: a situation comedy, an action adventure series, and a family melodrama set in the West. Allegedly, the show’s creator, David Dortort, had designed the show to combat “Momism,” the 1950s idea that women were having an effeminizing (and therefore negative) effect on American masculinity. As such, the show became an effective voice of patriarchal
moralizing and national identity building during the Cold War era. By the late 1950s, there were so many Western series on the air that the public became overly familiar with their styles and narrative patterns. (In 1959 alone there were forty-eight Western series on the air.) TV satirist Ernie Kovacs could include Western parodies in his famous comedy specials, and entire Western TV shows that parodied the genre began to appear. Perhaps the most famous of these was Maverick (ABC, 1957–1962), which starred James Garner as a card shark and comic grafter. Although that characterization alone was enough to critique the idea of the heroic westerner, the show would also frequently spoof other popular TV shows, such as Gunsmoke and Bonanza, and even cop and crime shows, such as The Untouchables and Dragnet. By the mid-1960s, ABC was airing F Troop (1965–1967), a situation comedy that presented its cavalrymen as con men, idiots, and buffoons. Perhaps most interesting were generic hybrids such as The Wild Wild West (CBS, 1965–1969). As the 1960s progressed and James Bond fever swept the globe, TV shows about spies, such as The Man from U.N.C.L.E. (NBC, 1964–1968), became very popular. The Wild Wild West might be considered a combination of TV Western and TV superspy spoof, focusing on the adventures of Jim West (Robert Conrad) and Artemus Gordon (Ross Martin), two secret service agents in the nineteenth-century Wild West. They routinely rode horses, engaged in fisticuffs and gunplay, but also battled mad geniuses with high-tech spy gadgetry. Although most young boys probably responded to The Wild Wild West as an example of a rousing male-dominated
Television: Westerns buddy action show, others may have responded more to the show’s latent homoeroticism, especially surrounding its star Robert Conrad’s tight pants and proclivity for bare-chested heroics. Still, even as the TV Western was mutating into parodies, hybrids, and potential deconstructions, another family Western, The Big Valley (ABC, 1965–1969), made its debut and lasted for several years. In it, Barbara Stanwyck played the matriarch of a western family; like Bonanza, the show was perhaps more of a family melodrama set in the West than a Western per se. By 1970, the major networks (NBC, CBS, and ABC) realized that their Westerns and rural comedies (shows such as The Andy Griffith Show, The Beverly Hillbillies, Petticoat Junction, and Gomer Pyle USMC) were enormously popular, but with the “wrong” audiences. In other words, these shows were watched by children and adults in rural and lower socioeconomic classes, and thus even though such audiences gave the shows very high ratings, they did not give the television advertisers high sales profits. Decisions were made at all networks (but especially at CBS) to jettison rural programming in favor of urban crime dramas and sitcoms (All in the Family, The Mary Tyler Moore Show) in order to capture a more upscale consumer audience. Many Westerns fell under the ax. Perennial favorites such as Gunsmoke and Bonanza would hold on for a few more years, but new Western hybrids such as Kung Fu (ABC, 1972–1975), which might be thought of an “Eastern,” and urban detective shows with Western heroes, such as McCloud (NBC, 1970– 1977), became the newest incarnation of the TV Western. Later manifestations of the TV Western include Little House on the Prairie
(NBC, 1974–1982), a series created by and starring Michael Landon of Bonanza fame; it too was more of a family melodrama set in the past than a Western with cowboy heroes, Indians, and gunplay. More recent shows like Dr. Quinn, Medicine Woman have exploited a certain Western flavor but little of the thematic mythology of the classical Western genre. In the 1990s, more “faithful” Western fare such as The Magnificent Seven (based on the movie of the same name) failed to find a television audience. Whether one points to growing audience sophistication, changing national demographics, or the genre’s inherent racism and sexism, the Western in its classical incarnation is for the most part dead in both contemporary American film and television. Harry M. Benshoff Sean Griffin See also Films References and further reading Barson, Michael. 1985. “The TV Western.” Pp. 57–72 in TV Genres: A Handbook and Reference Guide. Edited by Brian G. Rose. Westport, CT: Greenwood Press. Brauer, Ralph. 1975. The Horse, the Gun, and the Piece of Property: Changing Images of the TV Western. Bowling Green, OH: Popular Press. Buscombe, Edward. 1988. The BFI Companion to the Western. New York: Da Capo Press. Buscombe, Edward, and Roberta E. Pearson. 1998. Back in the Saddle Again: New Essays on the Western. London: BFI Press. Cameron, Ian, and Douglas Pye. 1996. The Book of Westerns. New York: Continuum. Cawelti, John G. 1985. The Six-Gun Mystique. Rev. ed. Bowling Green, OH: Bowling Green University Popular Press.
Coyne, Michael. 1997. The Crowded Prairie: American National Identity in the Hollywood Western. New York: St. Martin’s Press. Jackson, Ronald. 1994. Classic TV Westerns: A Pictorial History. Secaucus, NJ: Carol Publishing Group. Parks, Rita. 1982. The Western Hero in Film and Television. Ann Arbor: UMI Research Press. Schatz, Thomas. 1981. Hollywood Genres: Formulas, Filmmaking, and the Studio System. New York: Random House. Slotkin, Richard. 1992. Gunfighter Nation: The Myth of the Frontier in Twentieth-Century America. New York: Atheneum. West, Richard. 1987. Television Westerns: Major and Minor Series, 1946–1978. Jefferson, NC: McFarland.
Tennis Although Americans played forms of tennis as early as the eighteenth century, not until 1873 did a British army officer, Major Walter Clopton Wingfield, invent lawn tennis, a game played on a rectangular court by two players or two pairs of players who use rackets to hit the ball back and forth over a low net that divides the court. Socialite Mary Outerbridge, after observing British officers at play, brought this version of the racket game from Bermuda to the New York area in 1874. It gained popularity among the society set and took root in New England. By 1875 a tournament was held near Boston. The famed English tourney at Wimbledon began in 1877. Californians returning from a trip to England established the sport in the Santa Monica area by 1879. The United States Lawn Tennis Association (USLTA) was organized in 1881, representing thirty-four clubs. That August its first championship matches were held at the Casino in Newport, Rhode Island. The “L” was dropped from the USLTA in
1975 because the organization’s tournaments were held on a variety of surfaces, chiefly hard court and clay. In 1883, Ivy League boys won the first intercollegiate tourney, and they continued to dominate the sport until 1921, when Stanford’s Philip Neer broke the string. From 1883 to 1921, all the champions hailed from all-male colleges: fifteen from Harvard, eight from Yale, and five from Princeton. The Olympic Games included tennis competition from 1896 (Athens) through 1924 (Paris). Boys in the United States had new game skills to learn and sports heroes to emulate. Opportunities to master tennis fundamentals and to play the sport improved. The playground movement, begun in the 1870s with a social service purpose, led ten cities to build playgrounds between 1890 and 1900. As early as 1878, John H. Vincent introduced lawn tennis in the summer to the western New York Chautauqua, a religious, recreational, educational, and cultural program begun at Chautauqua Lake in 1874 and a movement that became popular in the United States and Canada by the early twentieth century. The Young Men’s Christian Association (YMCA) and private boys’ camps for the well-to-do provided diverse activities. A major leap forward was the formation by Luther Gulick of the Public Schools Athletic League (PSAL) in New York City in 1903. Open to all boys, and especially those of average skills, the innovative program offered a dozen sports, including tennis, by 1907. (A girls’ branch of the PSAL was formed in 1905.) President Theodore Roosevelt wholeheartedly backed the sports-for-all-boys concept, which was designed to improve discipline, sportsmanship, loyalty, and athletic ideals in competitions leading to city championships.
Young tennis players practice their strokes. (Courtesy of Harold Ray)
By the mid-1890s, boys’ sports were finally accepted as a legitimate part of school programs. Michigan and Wisconsin led the way by regulating track and field, football, baseball, and tennis. The PSAL played a significant role in popularizing interscholastic sport. By 1910 athletic leagues modeled after the PSAL had been formed in seventeen cities. Although lawn tennis was not as popular among boys as were football, baseball, and basketball, tennis was one of the top ten fastest-growing sports. By the early 1920s a National Federation of High School Athletic Associations (NFHSAA) had been established, winning prompt endorsement by the National Association of Secondary School Principals. By 1928 scholastic tennis championships existed in twenty states. When all-
weather courts were introduced in the early 1920s, tennis’s popularity grew steadily. Companies such as Kellogg in Battle Creek, Michigan, and Hawthorne in Chicago, Illinois, offered recreational tennis programs for their employees and their families. At the end of the 1930s, there were thousands of courts across the country, ranging from clay to newer compositions of many kinds. Even as tennis became a major spectator sport, boys and girls could enjoy it via intramural, extramural, interscholastic, club, and after-school recreational programs. One offshoot was platform paddle tennis, originally called “paddle tennis,” devised in New York City in 1921. Played on half-size tennis courts, this game was an exciting urban recreational activity. By 1936, with the development of 12-foot-high
wire walls, it became a popular off-season game for tennis buffs. Under the aegis of the USTA, a wide variety of tennis tournaments evolved in which boys could test their skills. The Interscholastic Boys’ 18 singles tournament began in 1891 in Cambridge, Massachusetts, with Robert Wrenn of Cambridge Latin winning the event. Disrupted by World War I and again from 1925 to 1935, it has been hosted since 1970 on college campuses. A doubles tournament was added in 1936. Boys’ 15 singles were added in 1916. Vincent (Vinnie) Richards won in 1917, added three 18-and-under titles, and then earned a gold medal in the 1924 Olympics in Paris. Richards later became an outstanding performer on the international circuit and enjoyed a stint as a broadcaster. (Because of a dispute over amateur versus professional status, tennis disappeared as an Olympic sport between 1924 and 1988.) Other age-group tournaments for boys included the National Jaycees Championships, which ran until 1966, and the National Public Parks Championships. In the late twentieth century and into the twenty-first century, USTA tournaments for youths proliferated. As of the year 2000, the adjective super was thrown into the mix, with the USTA Super National Boys 18–16 Hard Court Championships, held each August in Kalamazoo, Michigan. Others include a series of USTA Super Nationals (Winter, Spring, and Clay Court events). The rite of passage among young male tennis players in the United States is the USTA Boys’ 18–16 Championships. The tournament was initially hosted by the West Side Tennis Club in Forest Hills, New York, in 1916. The first Boys’ 18 singles winner was Harold Throckmorton. With the exception of the inaugural
tournament, the event has always been held in late summer, normally in August. After the beginning in Forest Hills, it was moved back and forth from the East Coast to the Midwest before finding a permanent home at Kalamazoo College in southwestern Michigan in 1943. This venerable liberal arts institution, founded in 1833, proved to be an excellent site blessed with enthusiastic community support. In the twenty-first century, the USTA Super National Boys’ 18–16 Hard Court Championships still use this hospitable home. Winners in eighteen-and-under singles and doubles competition in the USTA Super National Hard Court Championships receive automatic berths, as wild card entries, in the U.S. Open in Flushing Meadows, New York, an obvious example of a rite of passage. Finals in the eighteen-and-under competition are the best of five sets, as at the U.S. Open, a true test for the emerging young male tennis players. All other matches in the tournament are best of three sets. Robert Falkenburg took the 1943 singles title. A repeat winner in 1944, he won Wimbledon four years later. Dr. Allen B. Stowe directed the tournaments until he was killed in an automobile accident in 1957. A new site, constructed in 1946, was fittingly named Stowe Stadium in his honor. Today, its eleven lighted courts boast the all-weather Deco Turf II, which is also used in the U.S. Open. Typically, more than 3,000 spectators view the talented boys in the 18–16 singles finals each August. More than 100,000 have attended the tourney since its move to Kalamazoo—locally dubbed “the ’Zoo.” Sportsmanship trophies are a cherished part of the Nationals’ tradition. In 1958 the Allen B. Stowe Sportsmanship Award was introduced to honor players in eighteen-and-under singles;
Paul Palmer of Phoenix, Arizona, was the first recipient. The respect the tennis community had for the distinguished African American tennis player Arthur Ashe was evident when Kalamazoo College bestowed an honorary doctorate of humane letters on the popular athlete in 1992. Youngsters dream of earning international acclaim by winning the U.S. Open, established in 1881; the French Open (1891); the Australian Open (1905); or the prestigious Wimbledon tournament (1877). To win singles crowns in all of them in one season equals a Grand Slam. Most of the United States’ future Grand Slam champions competed as boys at Stowe Stadium. Some famous Wimbledon winners are Bob Falkenburg, Tony Trabert, Rod Laver, Chuck McKinley, Stan Smith, Jimmy Connors, Arthur Ashe, John McEnroe, Andre Agassi, and Pete Sampras. Aussie Laver, who won at “the ’Zoo” in 1956, was the first to take the Grand Slam championships twice. Top-ten rankings often include former Nationals players such as Sampras, Agassi, Michael Chang, and Jim Courier. Curiously, Sampras, Agassi, and Courier never won in singles at Kalamazoo. Just before his thirteenth birthday in 1984, Sampras played the longest three-set match (five hours, twenty-three minutes) in the history of the Nationals, losing to Texan T. J. Middleton. He also lost in 1987 to Courier, who, in turn, was defeated by Chang in the Boys’ 18 finals. African American boys are making an impact on the sport once dominated by white males. Timon Corwin, director of the USTA Super National Hard Court 18–16 Championships and a former NCAA Division III singles champion at Kalamazoo College, believes that “sport kind of transcends race and classes.”
Young tennis player in California, 2000 (Joe McBride/Corbis)
Nonetheless, tennis, like golf, is an individual sport, and both favor boys from prosperous families: success at the national and international levels requires costly private instruction, with no immediate guarantee of financial reward. Corwin emphasizes the importance of role models for blacks in tennis and golf: “MaliVai Washington, when he reached the singles finals at Wimbledon, thanked Arthur Ashe. And in golf, Tiger Woods thanked Lee Elder. The Williams sisters (Venus and Serena) really have made an impact in tennis for African-American girls, and they have thanked Althea Gibson” (personal interview with Bob Wagner, fall 2000).
The first African Americans who played tennis had to form the American Tennis Association (1916) because racial barriers then prevented them from competing in all other tournaments, which were for whites only. When Arthur Ashe was moving into the junior tennis ranks, segregation still prevailed in the South. “The difficulty was that I never qualified for Kalamazoo from my home section [in the South],” Ashe recalled. “I had to qualify through the Eastern section. I could also play in Middle Atlantic events, but I could not play in my home state. For all of us black kids we had to arrange any attempt to get to Kalamazoo outside the South.” But Ashe opened the door for other black players like Martin Blackman, a player from the Bronx who became the first African American in either age group to win a singles championship when he defeated Michael Chang in the sixteen-and-under final in 1986. Blackman also won the doubles with Chang that year, but he lost in the eighteen-and-under singles final two years later to Tommy Ho from Winter Haven, Florida. In 1991, J. J. Jackson from Henderson, North Carolina, became the second black to win the sixteen-and-under singles. MaliVai and Mashiska Washington and Lex Carrington also figured prominently in the tournament in Kalamazoo. This change in the rite of passage in tennis could be called the “Arthur Ashe legacy.” Certainly, tennis is no longer a sport exclusively for well-off white boys. Boys of various races and social backgrounds learn to play tennis in diverse ways. Julian, who grew up in a barrio in East Los Angeles, recalls learning the fundamentals of the game at age fourteen in the alleys and streets. He played without a net. Charles, who grew up in Kalamazoo,
changed from baseball to tennis at age fourteen; he developed into a university-level champion and a ranked senior competitor. Harold, a product of a rural New York community where baseball and basketball were the most popular sports, learned to play tennis in college. Today’s facilities and programs provide multiple options for boys, and many start by elementary school (personal interview with H. L. Ray, fall 2000). Harold Ray Robert Wagner References and further reading Ashe, Arthur, and Arnold Rampersad. 1993. Days of Grace. New York: Alfred A. Knopf. Ashe, Arthur, with Alexander McNabb. 1995. Arthur Ashe on Tennis. New York: Alfred A. Knopf. Betts, John R. 1974. America’s Sporting Heritage: 1850–1950. Reading, MA: Addison-Wesley. Collins, Bud. 1989. My Life with the Pros. Pp. 1–15. New York: Dutton. Feinstein, John. 1991. Hard Courts. New York: Villard Books. Galenson, David W. 1993. “The Impact of Economic and Technological Change on the Careers of American Men Tennis Players, 1960–1991.” Journal of Sport History 20, no. 2 (Summer): 127–150. Grimsley, Will. 1971. Tennis: Its History, People and Events. Englewood Cliffs, NJ: Prentice-Hall. Ladd, Wayne M., and Angela Lumpkin, eds. 1979. Sport in American Education: History and Perspective. Reston, VA: NASPE-AAHPERD. Lumpkin, Angela. 1985. A Guide to the Literature of Tennis. Westport, CT: Greenwood Press. (An excellent tool for researchers and aficionados.) Nelson, Rebecca, and Marie J. MacNee, eds. 1996. The Olympics Factbook. Detroit: Visible Ink Press. Phillips, Dennis J. 1989. Teaching, Coaching and Learning Tennis: An Annotated Bibliography. Metuchen, NJ: Scarecrow Press. Stowe, Catherine M. 1978. “The National Junior and Boys Tennis Championships
(June).” Unpublished history project, Kalamazoo, MI. U.S. Lawn Tennis Association. 1931. Fifty Years of Lawn Tennis in the United States. New York: USLTA. ———. 1972. Official Encyclopedia of Tennis. New York: Harper and Row. U.S. Tennis Association. 1995– . Tennis Yearbook. Lynn, MA: H. O. Zimman. Wagner, Bob. 1992. The Nationals and How They Grew in Kalamazoo. Kalamazoo, MI: J-B Printing. (Oral histories.)
Theatre Mimesis, or playacting, has been an ingredient of boyhood in America since the country’s earliest settlements, just as theatre has been part of an American experience of childhood, albeit a marginalized component of cultural activity. The study of theatre for young audiences, particularly in gender-specific terms, remains relatively unexplored. Theatre as an art form and a profession has struggled for acceptance in the United States in part because many early settlers’ evangelical beliefs focused on the denial of enjoyment in exchange for spiritual reward and in part because a frontier mentality eschewed any activity that smacked of aristocratic pretension. Theatre targeted toward an audience of boys and girls was considered a frivolous endeavor until the twentieth century and emerged in the contexts of education and social work rather than artistic enterprise. The goals of the earliest children’s theatres were to educate immigrant children and assist them in learning the language and to provide decent and respectable entertainment for children and families. Most pioneers of the children’s theatre movement were women, which further served to exclude the field from serious theatre scholarship.
Children’s theatre consists of two distinct fields of endeavor: children’s theatre (or theatre for young audiences, as it is now called), defined as theatre performed by adults and targeted toward an audience of children; and creative dramatics, theatrical activity for boy and girl performers that focuses on the process and experience of creating rather than on the finished production. From its inception, formal children’s theatre relied to a great extent on the adaptation of traditional folk and fairy tales for its content, but with the advent of postmodern critical readings of these stories, contemporary artists have begun writing and performing original plays in lieu of these adaptations. Although criticism of fairy tales based on gender issues has become popular and feminist critiques of adult theatre have generated controversy, little scholarship has been directed toward gender and performance in children’s theatre. Contemporary children’s theatre productions occur in every state in professional, community, or educational theatres, and although these theatres face different issues from those of the early twentieth century, the stated goals of education and entertainment remain components of nearly every group performing for or with children. The history of children’s theatre in America is not easily traced. Nellie McCaslin’s groundbreaking work provides a thorough and accepted compilation of information; other sources include dissertations written for graduate degrees and records from individual children’s theatre companies. Prior to the twentieth century, boys and girls mainly attended theatrical performances targeted toward adult or family audiences. Although there were isolated instances of theatrical productions geared specifically
toward children, such as puppet plays given on the plantations owned by both George Washington and Thomas Jefferson, traveling troupes, such as the Shakespearean actors encountered by the fictional Huck Finn in Mark Twain’s novel, were more common. These troupes targeted their plays toward family attendance. In addition to formal theatrical productions, informal entertainments encouraged family attendance, and the Wild West shows performed by Kit Carson and William Cody drew groups of rowdy gallery boys who came to applaud their heroes. The first well-documented theatre specifically for boys and girls, the Children’s Educational Theatre (CET), was founded in 1903 at the Educational Alliance in New York City. Its founder, Alice Minnie Herts, was a social worker who held strong commitments to the education of underprivileged children and to the performing arts. The first production offered by CET was Shakespeare’s The Tempest, a play that combined a large and colorful vocabulary with the excitement of an entertaining and magical plot. This production was followed by performances of adaptations of classic fairy tales. The success of CET encouraged the establishment of similar children’s theatre companies in major cities across the country. Samuel Clemens (pseudonym Mark Twain) was a strong supporter of children’s theatre and became a member of CET’s board. Although the concept of theatre targeted toward children caught on at the community level, few professional productions were mounted, and none survived for long. This lack of financial success increased professional indifference toward the field, but the Association of Junior Leagues of America, a women’s service
organization, took up the cause. Young women who had recently graduated from colleges or preparatory schools often volunteered for dramatic activities with and for children. Another association that helped the children’s theatre movement to flourish was the Drama League of America, founded in 1910 to promote the establishment of community theatres across the country. Although the Drama League focused primarily on theatre for adults, with the mission of ensuring that all citizens, not just those in major cities, had the opportunity to see theatrical performances, it provided guidance and stimulated grassroots activity that included children’s theatre. The Depression saw the establishment of the Federal Theatre Project (FTP) as a temporary relief measure for unemployed performers and technicians. These performers were mandated to provide entertainment linked with education, and FTP companies produced children’s theatre in several states across the country. The first institution of higher education to organize a children’s theatre was Emerson College in Boston in 1920. Soon educational theatre became an established field of study at several colleges and universities, and in 1936 a group of professors met to form the American Educational Theatre Association. In an attempt to identify its members as professional theatre artists more than as educators, the group later became the American Theatre Association. Rising costs and budget cuts brought about the association’s demise in 1986, but by that time it had succeeded in making educational theatre an accepted component in an arts curriculum. Early attempts to introduce dramatization into the schools, particularly as accompaniment to the study of literature,
became part of the Progressive education movement of the early twentieth century. The first person to differentiate between children’s theatre and creative dramatics was Winifred Ward, who taught at Northwestern University and worked in the public schools in Evanston, Illinois. The publication in 1930 of her suggested guidelines for creative dramatics directed attention to the possibilities for the use of theatre in the education of the whole child, and by the 1950s there was a movement to include training in creative dramatics in the curricula of several colleges and universities. The field of creative dramatics consists of informal drama, often improvisational in nature, undertaken for the benefit of the players rather than for an audience. It is difficult to measure the benefits of participation other than by observation. Participation in creative dramatics is intended to inspire and foster qualities such as creativity, social cooperation, sensitivity, physical poise and flexibility, communications skills and fluency in language, emotional stability, a sense of morality, and an appreciation of drama. Every leader of creative dramatics teaches boys and girls her or his individually developed or chosen games and exercises to address these goals. Studies have been conducted only recently in the public elementary schools to assess the impact of creative dramatics on children’s scores on standardized tests, with promising results. Although the field of creative dramatics is often associated with preschool and primary education, these activities also occur in community centers, religious institutions, correctional facilities, and recreational programs. Early children’s theatre performances were plays that targeted adults and families as well as children. Many of William
Shakespeare’s plays, particularly those that dealt with magic, were deemed suitable for children’s audiences. Another adult play that encouraged the attendance of children was the adaptation of Harriet Beecher Stowe’s Uncle Tom’s Cabin (1851–1852). As children’s theatre gained acceptance, adaptations of fairy tales provided much of the content. Since children’s theatre continued to be considered education or social work rather than serious theatre, the field remained ignored by most serious writers. One notable exception was J. M. Barrie, whose 1904 play Peter Pan, the story of a boy who refuses to grow up, continues to be performed frequently. One of the first noteworthy writers of children’s plays was Charlotte Chorpenning, director at the Goodman Theatre in Chicago and faculty member at Northwestern University during the 1930s. Many of Chorpenning’s plays are adaptations of classic fairy tales, as are those of another noted early writer of children’s plays, Aurand Harris. During the latter decades of the twentieth century, the fairy tales that formed the basis of many children’s theatre plays came under scrutiny by psychologists and literary critics, particularly those interested in feminist theory. Bruno Bettelheim’s 1976 book The Uses of Enchantment, despite later criticism, continues to be the seminal text in the Freudian interpretation of fairy tales. These interpretations focus on gender relationships, and many fairy tales have undergone fierce examination in consideration of archetypes and gender roles. Although much study has been devoted to fairy tales from feminist perspectives, few have explored fairy tales and folk stories in relation to masculinity and gender identity. Many of the more popular adaptations of fairy and folk tales, such as Cinderella, Alice in
Wonderland, The Wizard of Oz, Snow White, and Sleeping Beauty, focus on central female figures. Jack Zipes, scholar and author of numerous books on fairy tales, devotes some attention to the presentation of role models for boys in Fairy Tales and the Art of Subversion (1983). Fairy tales adapted for the stage that feature boys as central characters (for example, “Tom Thumb” and “Jack and the Beanstalk”) encourage boys to explore, take risks, go out into the world and face challenges, outwit their opponents, and take responsibility for ensuring the security of home and family. However, despite the emergence of the field of gender studies in academia, little examination of children’s theatre, performance, and gender has surfaced. The strongest factor that has influenced the choice of stories for performance has been the link between education and entertainment in children’s theatre. In the final decades of the twentieth century, fairy tales that were designed to socialize children and teach them proper behavior came under fire as gender roles came to be questioned. Fairy tales teach girls to be passive and compliant, become good homemakers and housekeepers, and wait for the handsome prince to fulfill their dreams. In contrast, boys learn to be aggressive, adventuresome, clever, and courageous from fairy tales. In conjunction with gender differences, concerns about violence and its effects on children also surfaced. Fairy tales teach children to deal with conflict, and boy characters confront ogres, witches, thieves, and robbers whose threats are physical. From these tales, boys learn to fight bravely as well as to use their intellectual abilities to overcome their foes. They also learn to be competitive, aggressive, and acquisitive in
order to gain money and power. This aggression often is directed toward an antagonist who is a fantastic, less-than-human being (ogres, giants) or toward someone who has been cast in the role of Other due to racial, ethnic, or gender differences. In a society that has come to embrace diversity and gender equality, this aggression toward other humans based on difference is no longer deemed appropriate or desirable social behavior, and reason, often portrayed in fairy tales as wit and cunning, rather than violence, has become the socially acceptable way of dealing with interpersonal conflicts. Competition, aggression, and the acquisition of money and power remain desirable characteristics and goals for young males, but in a society fearful of physical aggression, fairy tales that valorize violence as a means of conflict resolution face strong criticism. At the dawn of the twenty-first century, children’s theatres face many of the same problems encountered by the field’s founders a century ago. Chief among these is the qualified respect offered children’s theatre and its practitioners by theatre professionals. For many serious theatre artists, children’s theatre continues to belong to the realm of education rather than art. However, these attitudes are beginning to shift as original plays by respected professional playwrights are commissioned and performed and as children’s theatres focus on artistic quality in lieu of or in addition to educational messages. Those theatres that continue to focus on education as a primary goal are faced with new issues to explore artistically, such as the use of tobacco and drugs, child abuse, racial and ethnic discrimination, and the effects of divorce on children. Carol Schafer
See also Melodrama; Performers and Actors; Vaudeville References and further reading Bedard, Roger L., ed. 1984. Dramatic Literature for Children: A Century in Review. New Orleans, LA: Anchorage Press. Bedard, Roger L., and C. John Tolch, eds. 1989. Spotlight on the Child: Studies in the History of American Children’s Theatre. Westport, CT: Greenwood Press. Bettelheim, Bruno. 1976. The Uses of Enchantment: The Meaning and Importance of Fairy Tales. New York: Alfred A. Knopf. Croteau, Jan Helling. 2000. Perform It! A Complete Guide to Young People’s Theatre. Portsmouth, NH: Heinemann. McCaslin, Nellie. 1971. Theatre for Children in the United States: A History. Norman: University of Oklahoma Press. ———. 1987. Historical Guide to Children’s Theatre in America. Westport, CT: Greenwood Press. Siks, Geraldine Brain, and Hazel Brain Dunnington, eds. 1967. Children’s Theatre and Creative Dramatics. Seattle: University of Washington Press. Ward, Winifred. 1958. Theatre for Children. Anchorage, KY: Children’s Theatre Press. Youth Theatre Journal. 1986– . American Association of Theatre for Youth (formerly Children’s Theatre Review). Zipes, Jack. 1983. Fairy Tales and the Art of Subversion: The Classical Genre for Children and the Process of Civilization. New York: Wildman Press.
Toys Little has changed as much in the history of American boys as have the number and kinds of their toys. From colonization until the Civil War era, male children had relatively few playthings, especially after the toddler years. The pressures of work on farms and in trades limited the time of play. In those times of settlement and relative scarcity, parents seldom thought of toys as tools of learning or character
building. The young learned their sex roles and job skills by assisting in their father’s or a master’s daily work. And religious strictures against idleness, especially in Puritan New England, made games suspect. In many families, adults brought out toys only on special holidays. A common toy dating back to sixteenth-century Germany was Noah’s ark, a play set complete with animal figures, but parents allowed children to play with it only on Sundays, presumably to teach a Bible story. Most important, however, was simply the relative paucity of manufactured luxury goods of any kind. Many manufactured toys were expensive and until the end of the nineteenth century were often imported from Germany. This scarcity, of course, did not mean that early American boys had no toys. At least in more wealthy, free, and settled families, infants and toddlers received teething toys, rattles, hobbyhorses, jumping jacks, and building blocks. These playthings grew more elaborate in the nineteenth century, with the availability of mechanical push toys that rang bells and of toy instruments (horns, drums, and pianos). These “child quieters” were used to divert those too young to work. Even if older boys were given few toys, they found time to play and often made their own toys. They improvised, creating fantasy worlds with whittled sticks, castaway bits of cloth, stones, gourds, wheel rims, and mother’s clothespins. More often, however, they played their own, often rough, games in unsupervised groups. Particularly in rural and small-town America where the press of parents’ work and the availability of open space gave boys ample opportunity to form into small gangs, they tested each other’s courage and displayed loyalty at play without necessarily requiring toys. Finally,
Nineteenth-century boys played with miniatures of adult life, like this 1895 cast-iron horse and fire wagon made by Wilkens. (Courtesy, The Strong Museum, Rochester, NY)
toys were probably more available for boys than for girls. In American portraits taken between 1830 and 1870, 66 percent of boys were shown with toys, whereas only 20 percent of girls were depicted with any plaything, mostly dolls. Only after 1865 did American manufacturers produce toys in large numbers, and then often as a sideline. Manufacturers of wood, metal, mechanical, and print and paper goods often produced miniatures of their “adult” products or used waste materials to make modest batches of cheap children’s playthings for Christmas sales. Pennsylvania upholsterers made toy drums from scrap. Samuel Leeds Allen manufactured farm equipment but diversified with the famous Flexible Flyer sled in 1889, and the immigrant
Albert Schoenhut of Philadelphia imported German toys until he had the resources to manufacture his own line of toys in 1872. Toys remained secondary retail items, often sold by peddlers of housewares or from hardware catalogs. They were almost afterthoughts because they were relatively unimportant to parents. When cast-iron toys began replacing tin-plated toys in the 1860s, cheap, easily varied molds became possible. Improved and cheaper brass clockwork mechanisms also stimulated the production of a plethora of mechanical toys. For example, in 1868, a son of watchmakers named Edward R. Ives of Plymouth, Connecticut, began manufacturing a vast array of windup toys, often on topical themes: fiddle players, performing bears, black preachers, and General
Grant smoking. Other manufacturers copied English parlor science devices from the 1820s and 1830s (like the flashcard “moving picture” and the more sophisticated zoetrope, in which a paper strip of pictures appears to be animated when viewed through turning slits on a drum) and moralizing games of chance (like the “Mansion of Happiness”). More innovative toys from the 1860s included sand molds and cap and air guns. Roller skating also became a popular family activity in 1875, and children’s pedal toys began to appear shortly after the introduction of velocipedes and bicycles made for adults in the 1870s. The mechanical savings bank (featuring a figure that shot a bear or danced when a coin was deposited) taught the parent-approved lesson of thrift even as it amused all. Charles Crandall (1833–1905) introduced interlocking building blocks for construction play. Most of these toys were still too expensive for any but the affluent (a clockwork figure cost from $1 to about $3 when daily wages were scarcely that high). But even the poor could afford cheap “penny toys” (wooden tops, tiny toy swords, and crude animal figures, for example). Most late-nineteenth-century toys were essentially miniatures of adult tools and invited boys to anticipate adult male sex roles. Toy catalogs featured toy hammers, saws, and even garden tool sets for boys, and dolls and miniature houseware sets for girls. With notable exceptions, these toys were not designed to encourage fantasy (there were no masks or cowboy hats, no figures made in the image of boy heroes). Gradually toys were becoming substitutes for training in work. They also served as tools for more solitary play encouraged by parents desiring to isolate their sons from the influence of unsupervised gangs.
A series of changes clustering around 1900 created new toys for young boys. Manufacturers began to address boys’ imaginations rather than just parents’ concerns. Although children had little pocket money, boys’ magazines from the 1870s onward offered young readers rather expensive toy steam engines for selling subscriptions to their magazines. In the 1900s, parents also began to give children allowances, in part to teach them shopping skills. Toy makers responded by advertising heavily in boys’ magazines. This publicity was the secret of the success of such “staple” toys as Flexible Flyer sleds and Alfred C. Gilbert’s Erector sets. Increasingly, manufacturers featured toys for older boys. The percentage of American males between the ages of fourteen and nineteen who worked had decreased from 61 percent in 1890 to 40 percent around 1930 (Bureau of the Census 1965, 70). Middle-class boys of ten to even sixteen years of age could look to sophisticated construction sets as fun but practical training for modern careers in engineering and science. Such toys appealed as well to parents who wanted their older sons to spend their playtime “wisely.” Toys became even more sex-stereotyped as boys’ toys increasingly idealized technology, constant innovation, and the values of competition and teamwork. By contrast, a new generation of playthings for females featured companion and baby dolls, encouraging emotional attachments and nurturing. Of course, male children also played with dolls; part of the reason for the 1906 craze for the teddy bear was that it was a masculine image that attracted little boys. Still, the gender divide shaped the vision of the future: to boys it promised an exciting public world of mechanical progress and to
girls a personal life of warm relationships and fashion. Many boys’ toys from 1900 to 1950 closely reflected dramatic changes in transportation, science, communications, and construction. The introduction of toy cars and airplanes closely followed the real things. Chemistry and other science sets introduced boys to the secret processes of nature. Boys’ playthings also gloried in media and communications technology, including working cameras, slide projectors, phonographs, and radios. All these toys attempted to minimize the barrier between the plaything and the real thing. They taught boys to admire the technologies of the future and allowed youths to imagine themselves in control of modern power. From the 1910s to the 1960s, model electric trains were the capstone toy for many middle-class American boys. Facilitated by Joshua Lionel Cowen’s 1906 introduction of the electric current transformer, the electric locomotive made boys feel powerful. Central to the appeal were the carefully designed replicas of coal and refrigerated cars, colorful boxcars, and cabooses that gave boys a sense of being part of a real world of commerce and success; and the miniatures of roundhouses, railroad crossing signals, and other accessories completed the romance. In this period the media and educators were beginning to encourage fathers to spend more time with their sons. The pressures of work may have prevented many fathers from following this advice, and surveys showed that boys still preferred their mothers. But men did embrace the idea of fathering through play with their sons. Electric train manufacturers encouraged boys to make “the lad the pal to dad” by getting the father involved with their train play.
Another favorite toy was the construction set. The best-known examples were Tinkertoys, Lincoln Logs, and especially Alfred C. Gilbert’s Erector sets (1913). Boys were supposed to bolt together Gilbert’s metal strips to make models of modern railroad and industrial equipment. But he provided more. Gilbert gave boys a dream of play, accomplishment, and preparation for future success. In his ads and catalogs, Gilbert touted his own fun-filled but also successful life and promised in his various promotions that lads who played with his construction and science toys were bound to become engineers and business titans. Boys responded by building models of specialized machines of the rail era: jib cranes, swing bridges, pile drivers, inclined delivery chutes, and coal tip cranes. One element of contemporary boys’ toys was, however, relatively rare in early-twentieth-century toys—war play. Toy weapons were, of course, sold. Cap guns appeared as early as 1859, pop guns had their debut in the early 1870s, and the Daisy Air Rifle (BB gun) began its long success in 1888. A few cowboy suits with holsters and revolvers appeared shortly before World War I. Battleships and even machine guns arrived during that conflict. But all these items were designed to promote bonding with older brothers and fathers at war. Daisy advertised its toy rifles as essential tools in making boys into men and never glorified violence or destruction. Toy gun sales dropped off sharply after World War I. The excitement of trains, cars, commercial flight, and construction prevailed over war toys in boys’ play in the thirty years after 1900. A major shift in boys’ playthings began in the 1930s during the Great Depression. In response to reduced sales, toy
Toys makers offered cheaper toys, often sold by the piece. Ironically, this tended to make it possible for children to purchase their own toys. Thus ten-year-old boys, using their earnings from running errands, could buy single miniature cars or rubber toy soldiers rather than have to wait for an adult to purchase a complete set of metal soldiers. Boys had long collected objects that surrounded them. In the 1930s, the collecting habit began to shift from amassing shells or bottle caps to collecting the constantly expanding number of military figures sold in dime stores. Toy companies also began to license images of popular media personalities to increase sales. Talking movies, especially color cartoons, greatly increased the appeal of film to boys, and during the Depression theaters offered Saturday matinees that featured children’s fare. Westerns and space heroes like Buck Rogers, but also Mickey Mouse and other Disney cartoon personalities, attracted boys. Network radio introduced widely popular after-school adventure programs designed especially for boys. They shaped boys’ play by introducing stories and images designed specifically for children that required toys to serve as props for the reenactment of their dramatic narratives. Radio serials gave voices and sound effects to the images in the daily comic strips. For example, Chester Gould’s “Dick Tracy” (a comic strip introduced in 1931 that shortly afterward also became a radio program) featured strong images and colors, manly personalities, and striking situations that could be easily converted into boys’ toys like the Dick Tracy Jr. Click Pistol. When boys listening to the radio heard Tracy’s police car screech down the street and his gun fire as he chased crooks, they wanted
“official” Dick Tracy police cars and pistols. Buck Rogers, an American accidentally sent to the twenty-fifth century, became the hero of a long-lasting science fiction adventure program. The makers of the Daisy Air Rifle offered a Buck Rogers Space Pistol at 25 cents in 1934 to supplement stagnant sales of their $5 BB gun. In the 1930s, the hero began to replace the machine as the central prop of play. Although the construction sets of the 1910s and 1920s called the boy to imitate practical men and to imagine his future role in an orderly world of economic and technological progress, the new male fantasy toy beckoned the youth to a faraway realm where conflict dominated. No longer did technology seem to offer a future of progress and prosperity. Rather than inviting the boy to identify with the father (often unemployed in the Depression), the new toys evoked an image of strong men free from the bonds of family. The cowboy star, tough detective, boxer, spaceman, and superhero became father substitutes. In the 1930s, Tom Mix, Dick Tracy, Popeye, Buck Rogers, and Superman offered boys a wide variety of toys (both guns and windup figures). They all shared a common penchant for fighting and subduing enemies rather than constructing things or achieving goals. And they all lived in a world where a boy could forget he was a child and the fact that he may have had an unheroic father without a steady job. The 1930s saw an extraordinary growth in toy weaponry in fantasies of the Wild West, G-men against criminals, and intergalactic war. The combination of more aggressive marketing of toy guns and general anxiety about crime in the gangster-ridden 1930s produced a negative public reaction. In 1934 and 1935, Rose Simone,
GI Joe underwent many transformations, thus revealing a changing world of boys’ play. The large GI Joe is a doll that boys in the 1960s dressed and used to play-act the duties of real U.S. Navy frogmen. Underneath is the 1995 version, GI Joe Extreme, a pair of fantastic figures designed merely for combat. (Courtesy, Alexander Cross, State College, PA)
a militant opponent of weapons toys, organized a bonfire in Chicago into which guns gathered from children in sixty area schools were thrown. In certain ways, toys popular during the post-1945 baby-boom generation harked back to the toys of pre-1930 generations. In a period of new scientific advances and perhaps closer bonds between fathers and sons, many new playthings were miniatures of contemporary technology. Gilbert’s Atomic Energy Set may have been a commercial failure (because
of its cost), but it was an extravagant example of the toy as promoter of progress. Chemcraft claimed that its new science sets drew on wartime discoveries in plastics, wonder drugs, and atomic energy to inspire a new generation of children to be inventors. The new technology of jet propulsion was mirrored in the model airplanes offered by plastics manufacturers like Revell. Science fiction films and comics inspired a curious run of space toys in the early 1950s. These included plastic green “men from Mars,” space helmets, battery-operated robots, and even a Space Scout Spud Gun that shot “harmless little plugs of raw potato up to 50 feet.” More realistic were the toy miniatures that celebrated the space program and missile development in the late 1950s. The appeal was less to war than to science and its industrial applications in the future. Only in the mid-1950s did many toys appear celebrating World War II combat, and then only as a historical event commemorated along with other past heroics. Most of this celebration of men’s deeds and technology in boys’ toys was more peaceful and prosaic—model bulldozers, trucks, and service stations offered by companies like Tonka. The postwar period also produced a craze for cowboy toys that went well beyond the well-established traditions of cowboy suits, holster sets, and Lincoln Logs. Cheaply made miniature frontier towns, ranches, and especially forts let boys reenact cowboy-and-Indian dramas seen at the movies. Radio and movie cowboys, including Hopalong Cassidy, Roy Rogers, the Cisco Kid, Davy Crockett, and the Lone Ranger, graced the toy shelves of the late 1940s and early 1950s. But in the five years after 1955, the prime-time westerns that were designed for the whole family and that attracted fathers as
Toys much as sons made western toys ever more popular. Both romantic settings— space travel in the future and western heroes in the past—were imaginative worlds that fathers and sons could share. Although the 1950s seemed to be a throwback, the decade did mark the beginning of mass advertising of toys on television programs directly to children, a change that eventually revolutionized boys’ play. Although the Mickey Mouse Club was not the first children’s show to promote toys when it first appeared on television in 1955, its advertising was aggressively designed to appeal to the child’s imagination rather than to the parents’ values. Mattel toys proved that year-round advertising featuring child actors could create mass rushes to buy “burp guns” and “Fanner 50 smoking cap guns” even outside the Christmas gift season. Increasingly, boys pressured their parents into buying “must-have” toys after seeing them on television. Playthings began to represent the world of boys’ fantasy as presented in the contemporary media. Instead of toys being sold to parents (and thus designed to please them), playthings increasingly were sold that appealed to boys’ imagination. The 1960s and 1970s witnessed a transition to another phase in the history of toys, dominated by the action figure and video game in which playthings became props or electronic means for reenacting fantasy stories. For the most part, these toys were divorced from the memories and expectations of parents. Not only did parents find these toys increasingly alien from their recollections of their own childhoods, but these playthings increasingly had no connection to the boys’ future. The most revealing example of this change is found in the history of Hasbro’s GI Joe. When this figure first appeared in
1964, it was a boy’s dress-up doll, realistically representing the average soldier. Unlike those cheap and impersonal plastic soldiers of the 1950s, GI Joe had movable limbs and was 1 foot tall; thus, he could be posed and equipped with the latest military clothing and weaponry. Boys could play war the way their fathers might have fought in Europe in World War II or in Korea. And they could dress their Joes in battle gear similar to that worn by conscripted uncles or older brothers serving their two-year stints in the army of the mid-1960s. GI Joe still connected fathers with sons. In the late 1960s, however, GI Joe suffered major changes. By 1967, as the Vietnam War heated up and adults (like pediatrician Benjamin Spock) attacked war toys, sales decreased. Beginning in 1970, Hasbro responded to a growing hostility to war toys among adults by transforming the “fighting” Joes into an “Adventure Team” in which the hero searched for sunken treasure and captured wild animals. As the Vietnam War wound down to its bitter end in 1975, it was awkward to sell military toys glorifying contemporary jungle warfare. In 1976, with the Vietnam War in the past, GI Joe once again became a fighter. Although the new “Super Joe” had shrunk to 8 inches (because of higher costs for plastic) and no longer could be dressed, he was even more exciting to boys as a high-tech warrior. GI Joe did not rejoin the ranks of enlisted men and was no longer part of a world that fathers, uncles, or older brothers had ever experienced. Instead, his laser beams and rocket command vehicles helped him fight off aliens. The object of play was to pit good guys against bad guys, not to imitate real military life. Play no longer had anything to do with the experience of fathers and their hopes for their sons’ future.
These action figures set the stage for the craze stimulated by the movie trilogy Star Wars from 1977 to 1983. During these years and beyond, American boys were inundated with toy figures, vehicles, and play sets built around reenacting the onscreen rivalry of Darth Vader and Luke Skywalker. The theme of violence was very pronounced in these action figures and their many imitators. Still, because the violence was so unrealistic, it was easy for boys not to take it seriously. Action figures used wildly imaginary weapons, and conflict was reduced to the scale of the play sets. Parents no longer dressed their sons like soldiers or even gunslinging cowboys. Instead of turning their backyards into play battlefields or dueling with cap gun and holster in family dens, boys were allowed to collect tiny warriors that reminded no adult of any war they had ever known. The conflicting feelings of adults toward the military were avoided, allowing most parents to ignore the war play of their young. At the same time, war play became detached from whatever historical or moral purpose that military toys had earlier embodied. Star Wars and GI Joe toys were just the beginning. Mattel’s He-Man and Masters of the Universe, appearing in 1982, closely paralleled the Star Wars formula. The youthful, blond, and muscular He-Man and his team of good guys fought the aged, bony, and evil Skeletor and his horde. A major feature of the Mattel line was Castle Grayskull, which was shaped like a mountain. The figures in effect played “king of the mountain” at Grayskull, the center of the fray in a “fantastic universe beyond all time.” There were many imitations in the 1980s and 1990s: the Transformers, Dino-Riders, Teenage Mutant Ninja Turtles,
Power Rangers, and Pokémon, for example. These action figures and their accompanying play sets were all linked to fantastic stories. Many were products of television cartoon series shown on Saturday mornings and in after-school hours that featured action-figure images. Indeed, following the liberalization of rules on children’s television in 1982, many of these cartoons were developed for the toy companies specifically to promote their product lines. Thus, they were called “program-length commercials.” Beginning in 1972, action figures competed with and often paralleled video games. Simple games like electronic Ping-Pong played in arcades were quickly supplemented with video action available on home game consoles for television and handheld electronic toys. These products introduced younger boys to electronic interactive play. Although this craze died in the early 1980s with the collapse of Atari and other manufacturers of video games, the much improved graphics and action of Nintendo and other video systems from 1988 onward brought the video game back. These interactive electronic adventures heavily emphasized fantasy violence and brought criticism for their increasing intensity, addictive attraction, and tendency to isolate boys from others. While action figures, video games, and other toy fads sparked repeated concern among parents and educators, traditional toys like marbles, yo-yos, and even imaginative construction toys like Lego found it difficult to compete. By the end of the twentieth century, American boys’ toys were drawn primarily from a never-ending and always changing world of media fantasy. Gary Cross
See also Comic Books; Films; Superheroes; Television: Cartoons; Video Games References and further reading Bruegman, Bill. 1992. Toys of the Sixties. Akron, OH: Cap’n Penny Productions. Bureau of the Census. 1965. The Statistical History of the United States. Washington, DC: Government Printing Office. Calvert, Karin. 1992. Children in the House: The Material Culture of Early Childhood, 1600–1900. Boston: Northeastern University Press. Cross, Gary. 1997. Kids’ Stuff: Toys and the Changing World of American Childhood. Cambridge, MA: Harvard University Press. Gilbert, Alfred C., with Marshall McClintock. 1953. The Man Who Lives in Paradise. New York: Rinehart. Greenfield, Laurence. 1991. “Toys, Children, and the Toy Industry in a Culture of Consumption, 1890–1991.” Ph.D. diss., Ohio State University. Hewitt, Karen, and Louis Roomet. 1979. Educational Toys in America: 1800 to the Present. Burlington, VT: Robert Hull Fleming Museum. Kline, Stephen. 1993. Out of the Garden: Toys and Children’s Culture in the Age of TV Marketing. New York: Verso. O’Brien, Richard. 1990. The Story of American Toys. London: New Cavendish Books. Payton, Crystal. 1982. Space Toys. Sedalia, MO: Collectors Compass. West, Elliott, and Paula Petrik, eds. 1992. Small Worlds: Children and Adolescents in America, 1850–1950. Lawrence: University Press of Kansas. Whitton, Blair. 1981. American Clockwork Toys, 1862–1900. Exton, PA: Schiffer Publishing.
Transitions (through Adolescence) During adolescence, boys undergo many transitions as they move from childhood to adulthood. Although these transitions are challenging for many, if not all, boys, most will meet these challenges successfully. Entry into adolescence involves puberty, enrollment in a new type of school
(that is, middle or junior high school), new cognitive skills, changing relationships with peers that will start to include sexual experiences, and changing relationships with parents that often include more frequent arguments as boys seek greater independence. The exit from adolescence or entry into adulthood also involves many important transitions, including enrolling in advanced education, entering military service, or beginning a first full-time job; being able to support oneself financially; and engaging in relationships with romantic partners that may be longer lasting and more stable than the relationships of adolescence. In between these major transitions are other important developmental landmarks as boys grow from children to adults. Part of becoming an adult is not just looking more like an adult but also taking on the roles of adults (Graber and Brooks-Gunn 1996). Exploration of the roles of adulthood, or figuring out what it means to be an adult, is one of the challenges of adolescence, with new roles emerging at each transition. In nearly every culture or society, there is a period between childhood and adulthood during which individuals are expected to learn the roles of an adult (Schlegel and Barry 1991). The length of the adolescent period may vary; some individuals will need to make transitions at younger or older ages depending on their opportunities and experiences. Adolescence is often defined as the entire second decade of a child’s life. In addition, it is usually divided into three main transitional periods: entry into or early adolescence, middle adolescence, and late adolescence. There are no exact ages for each period, but each one usually matches a time when a boy makes a school change, at least in the
United States. During the entire decade of adolescence, cognitive development increases boys’ abilities to understand future outcomes, make plans and decisions, and think in abstract terms. These developmental advances interconnect with changes in feelings and behaviors as boys develop their identities and intimate relationships. The entry into adolescence, or early adolescence, typically stretches from ages eleven to fourteen. During this period, most boys will experience the transition of puberty, the school transition from elementary to middle school, and relationship and role changes with parents and peers. Puberty is not a single event—boys do not go to bed one night with the body of a child and then wake up the next morning with the body of an adult. Instead, most boys will take four to five years to pass from the beginning to the end of puberty. Boys may begin to show outward signs of puberty anywhere from age nine to thirteen and a half and still be in the range of normal development. Thus, some boys may go through puberty mainly during the middle school years, if they start in late elementary school and finish by age thirteen or fourteen, whereas others may experience most of puberty during high school, if they start in middle school and finish by age eighteen. On average, boys’ puberty begins with the appearance of pubic hair and changes in the genitals between eleven and eleven and a half years of age (Marshall and Tanner 1970). These changes may be noticeable mainly to the boy himself. In contrast, many of the other changes of puberty are noticeable to boys and those around them. For example, vocal changes and the appearance of facial hair are noticed by others as well as by the boys themselves. The growth
spurt in height begins around eleven and a half years of age. On average, the most rapid changes in height occur around age fourteen for boys. There is also a spurt in strength after the growth spurt, with muscle coming to account for a substantial share of body weight (54 percent) by the end of puberty. These changes in physical appearance have been associated with behavior and adjustment changes for boys. Nearly all boys have periods of time when they are uncomfortable with their changing bodies. Although growing in height and muscle may make an adolescent boy feel more like an adult, different parts of the body tend to grow at different times, leading to that “gangly” feeling; acne is common throughout puberty; and many boys may be self-conscious about their changing bodies, especially in the locker room. With increasing focus in the media—television, magazines—on a lean, muscular body type for men, more boys are starting to have problems with body image as they go through puberty. These types of experiences and external pressures make puberty a challenging transition for boys. In addition, some boys may be particularly sensitive about their developing bodies (Graber, Petersen, and Brooks-Gunn 1996). For example, boys who mature later than their peers may be sensitive to being shorter and less muscular when other boys have already grown. Boys who mature earlier than their peers seem to have an advantage in sports and physical activities, especially in the middle school years, as they are bigger and stronger than other boys. By the end of the pubertal transition, boys will have gone from having the body of a child to one that looks (and functions) much more like an adult’s body. Most boys in the United States will also
make a school transition around this same time. This transition is not merely to a new school but usually to a very different type of school (Graber and Brooks-Gunn 1996). For example, most middle or junior high schools are much larger than the elementary schools that boys attended in childhood. Students frequently change classrooms and teachers every period, a situation that often prevents them from making personal connections with teachers. In addition, grading standards are often higher, workload usually increases, and students are expected to work independently more of the time. Taken together, these changes in school context can be a difficult transition for boys. Most boys will find these changes somewhat stressful but eventually get used to the new expectations and situation by the second semester or the next year. However, it is important to remember that boys in general have difficulties at school more often than girls do, as evidenced by their higher rates of learning and reading disabilities (Sommers 2000). Boys who have struggled with schoolwork in the elementary years are at increased risk for problems at this time and throughout adolescence. The entry into adolescence also includes transitions in the family environment. As most parents will confirm, arguing between young adolescents and parents is a common experience (Steinberg 1990). As adolescents mature, so does their capacity to think critically and make decisions on their own. As boys take on more responsibility at school and begin to look older, they often want to make more of their own decisions and question rules established by parents. In the big picture, arguments are over the struggle for independence or autonomy, but in the reality of day-to-day experience,
arguments are usually over chores, curfew, and similar issues. The frequency of arguments seems to peak during midpuberty (Steinberg 1990). Over time, the rate of arguing declines as parents begin to give their adolescent sons more control over or input into rules and regulations. For most families, maintaining warm and supportive relationships during this time helps the adolescent feel secure while testing the waters of independence. Some research has found that it is mothers rather than sons who seem to be most upset by the arguing (Steinberg 1990). Mothers and daughters seem to maintain more closeness during early adolescence, whereas mothers often feel that their sons are more rejecting of their concerns and affection. Finally, boys’ relationships with their peers also make a transition at this time. Perhaps in part because of the larger school context, young adolescent boys more often make friends based on shared interests rather than just hanging out with the kids who live in their neighborhoods. As boys begin to think about who they are and who they want to be as adults, they may seek out friends whom they perceive to have similar interests and goals. Friendships in early adolescence may seem transitory as individuals explore different groups with whom they want to identify. At the same time, boys begin to explore sexual and dating situations. Usually, young adolescents start going out in mixed-gender groups and may engage in kissing or exploration games at parties. Making the transition to intercourse is not common for young adolescent boys, but about one in five boys will have intercourse by age fourteen (Alan Guttmacher Institute 1994). The transitions of early adolescence are not made quickly but set boys on the
course of continuing adolescent challenges. Middle adolescence (ages fifteen to seventeen or eighteen) most often encompasses the high school years. The transitions of this period are more varied: some boys make them, but others may not. Most boys become licensed drivers during middle adolescence. Little is known about how this transition is experienced by boys, but it seems likely that most feel that they are one step closer to adulthood via this achievement. In particular, driving often allows more freedom and independence from parents to pursue social activities. At this time, dating is more common, and more and more boys make serious sexual transitions; by age seventeen, nearly 60 percent of boys have had intercourse (Alan Guttmacher Institute 1994). During the course of high school, many boys will also become workers. Certainly, some boys will have had part-time jobs at earlier ages, but it is common for boys to engage in more formal part-time employment in middle adolescence. Becoming a worker is a positive experience for many boys because a job brings them additional challenges (e.g., balancing school and work, making new friends, being responsible) while also giving them more money to spend on activities that they choose for themselves. However, boys who spend too much time working (more than twenty hours per week) during the school year often are more disengaged from school and may engage in more problem behaviors (e.g., drinking or smoking) (Steinberg and Avenevoli 1998). By late adolescence, boys again face numerous simultaneous transitions as they begin to move closer to taking on adult roles. Although the entry into adolescence is defined by many transitions
that typically are experienced by nearly every boy (e.g., puberty, school transition), late adolescence (ages eighteen to twenty) and the transition to adulthood are perhaps more complicated because these transitions depend more on the opportunities available to young men as well as their own desires. Across adolescence, boys form their ideas about what they want to be as adults—workers, spouses, fathers, and so on. Accordingly, they prepare to take on adult roles and define these roles based on what they would like to be, along with the opportunities that are available to them. The transition to adulthood may be long or short, abrupt or gradual. For example, boys typically begin living away from home during late adolescence and the transition to adulthood. Initially, they may not be financially independent from parents and family but are often finishing high school, continuing their education (e.g., college or trade school), entering the military, or starting their first full-time job. Some young men will not finish the level of education or training that they want and may find themselves redefining the type of worker they will be. Some adolescents who do not go on to college may struggle to find regular, consistent work, making it difficult to become financially self-sufficient or to go on to support a family. Young men who do go on to college frequently do not become financially self-sufficient or start a family until they have finished college and have started working. In addition, a few young men may never marry or have children, whereas others become parents while they are still adolescents. Information from the Census Bureau indicates that the median age of first marriage for men is currently just under twenty-seven
years of age (Lugaila 1998). Thus, the transitions to stable marital and family roles occur later for boys than they did in the 1960s and 1970s. It has been suggested that the entire period from ages eighteen to twenty-five is an extended transitional period (Arnett 2000). In general, most boys navigate the transitions of adolescence successfully. That is not to say that they will not experience some stress and a lot of challenges. Rather, most adapt to the changes and develop the skills they need to manage the challenges of each transition. Certainly, boys who enter adolescence with better coping skills, supportive family relationships, positive friendships, and success in school usually fare well during this time period. However, even for these boys, a particular transition may upset the system. In order to keep boys on a healthy pathway to adulthood and to move boys from less healthy pathways to healthy ones, continued attention to the challenges of each transition is needed. Targeting programs, community resources, and family supports for the times when transitions are most common will help a greater number of boys successfully take on adult roles.
Julia A. Graber
See also Adolescence; Bodies; Fathers, Adolescent; Learning Disabilities; Mothers; Same-Sex Relationships; Sexuality; Smoking and Drinking
References and further reading
Alan Guttmacher Institute. 1994. Sex and America’s Teenagers. New York: Alan Guttmacher Institute.
Arnett, Jeffrey J. 2000. “Emerging Adulthood: A Theory of Development from the Late Teens through the Twenties.” American Psychologist 55: 469–480.
Graber, Julia A., and Jeanne Brooks-Gunn. 1996. “Transitions and Turning Points: Navigating the Passage from Childhood through Adolescence.” Developmental Psychology 32: 768–776.
Graber, Julia A., Anne C. Petersen, and Jeanne Brooks-Gunn. 1996. “Pubertal Processes: Methods, Measures, and Models.” Pp. 23–53 in Transitions through Adolescence: Interpersonal Domains and Context. Edited by Julia A. Graber, Jeanne Brooks-Gunn, and Anne C. Petersen. Mahwah, NJ: Erlbaum.
Lugaila, Terry A. 1998. “Marital Status and Living Arrangements: March 1998 (Update).” Current Population Reports. U.S. Bureau of the Census Publication no. P20-514. Washington, DC: U.S. Department of Commerce.
Marshall, William A., and James M. Tanner. 1970. “Variations in the Pattern of Pubertal Changes in Boys.” Archives of Disease in Childhood 45: 13–23.
Schlegel, Alice, and Herbert Barry III. 1991. Adolescence: An Anthropological Inquiry. New York: Free Press.
Sommers, Christina H. 2000. “The War against Boys.” The Atlantic Monthly (May): 59–74.
Steinberg, Laurence. 1990. “Autonomy, Conflict, and Harmony in the Family Relationship.” Pp. 255–276 in At the Threshold: The Developing Adolescent. Edited by Shirley Feldman and Glen R. Elliott. Cambridge, MA: Harvard University Press.
Steinberg, Laurence, and Shelli Avenevoli. 1998. “Disengagement from School and Problem Behaviors in Adolescence: A Developmental-Contextual Analysis of the Influence of Family and Part-time Work.” Pp. 392–424 in New Perspectives on Adolescent Risk Behavior. Edited by Richard Jessor. New York: Cambridge University Press.
V

Vaudeville
A form of live, popular entertainment in the United States from the late nineteenth century through the early twentieth century, vaudeville consisted of a variety of diverse short acts that included music, dance, trained animals, and eccentric feats and was designed to appeal to the entire family. In the mid-1800s, New York City theatrical managers such as P. T. Barnum (1810–1891) and Tony Pastor (1837–1908) sought to provide wholesome entertainment that would attract workingmen, their wives, and their children. In creating the variety format, Pastor drew from traditions such as the minstrel show, which used blackface song-and-dance numbers to grotesquely parody life on southern plantations; the concert saloon, where male comedians might be accompanied by scantily clad chorus girls; and the English music hall, which also interspersed comical sketches with musical numbers. These earlier forms were not appropriate for family viewing, since they contained sexually suggestive songs and jokes. Furthermore, these performances were presented in drinking houses, where the entertainment encouraged the purchase and consumption of liquor and patrons quickly became rowdy and even violent. In response to these conditions, Pastor and other vaudeville managers who followed him instituted strict regulations about the language, content, and costumes of the vaudeville acts in order to develop and maintain a theatrical experience appropriate for women and children.
Throughout the history of theatrical entertainment, societies around the world have wrestled with issues of propriety in public performances and the right of a government to censor artists. This struggle has been particularly fierce in the United States, a nation founded by religious refugees from Europe who lived by strict moral codes. Their deep sense of morality (periodically expressed by governmental prohibitions against acting) prevented professional theater from fully developing as a commercial venture until the industrialization of the East Coast in the 1820s. Even when permanent theaters and touring companies did develop, much of the public feared the corruption of audiences by the new ideas, behaviors, and costumes presented on stage. By 1865, theaters that presented complete plays, whether written by Americans or imported from Europe, were known as “legitimate” theaters and appealed to the more highly educated, wealthier classes of society. The music, comedy sketches, animal acts, and dancing preferred by many working-class men were most prominent in the concert saloons notorious for drinking, brawling, and connections to prostitution. In particular,
the presence of female performers in flesh-colored tights that displayed the shape of their legs, who performed high kicks while singing sexually suggestive lyrics, contributed to the reputation of moral corruption in the concert saloons. It was this environment that Tony Pastor sought to change by providing the songs, dance, and comedy of the variety format in a clean, well-furnished auditorium that evoked the glamour of the “legitimate” theaters. In appealing to workingmen to bring their wives and children to this new type of “clean” theater, Pastor targeted a huge new audience—complete families. Pastor encouraged mothers to take charge of their families’ leisure hours by providing a “wholesome” venue where all members of the family could feel safe and comfortable. His success soon inspired the proliferation of many variety theaters, not only in New York but across the country. The further efforts of producers B. F. Keith (1846–1914) and Edward F. Albee (1857–1930) resulted in a huge touring network: variety performers traveled by train from New York to San Francisco and back again, presenting the same acts to audiences everywhere. Although urban families were more likely to experience vaudeville than those in remote rural areas, this network was the beginning of a national popular culture. Nearly every class of people in every part of the country could laugh at the same jokes and learn the same songs. It was Keith who insisted on calling the form “vaudeville,” a word that comes from the French vaux-de-Vire, a type of satirical song from the town of Vire in Normandy. He believed that the exotic French word lent sophistication to the variety show. Indeed, working- and middle-class audiences enjoyed paying mere pennies to visit the elegant vaudeville
palaces of Keith and Albee, which resembled the fancy legitimate theaters where drama and opera were presented. Several aspects of the vaudeville format were particularly appealing to women and children. First of all, lyric sheets, sheet music, and whole booklets of music called “songsters” were available at performances for a few pennies. A child who purchased a lyric sheet on the way into the theater could sing along with the musical numbers. Families who owned a piano or guitar could then play the music at home and remember the acts they had seen at the theater. In the days before radio, this kind of public ownership of current music was a popular diversion. Second, the vaudeville theaters generally presented several performances of all the acts on a bill in one afternoon. For the price of a ticket, a child could enter the theater in the middle of a bill and stay until he had seen everything. Although young girls were generally kept at home with their mothers during the workweek, afternoon shows were ideal for a boy who had time after school before dinner or who spent his days selling newspapers or slinging water buckets for the fire department. The boys generally sat in the cheapest seats up in the highest balcony, which was called the “gallery.” They would buy bags of peanuts and their song sheets and settle in for a rowdy session of singing, cheering, and throwing peanut shells down upon the audience and stage below. Although this sounds like extremely unruly behavior today, the vaudeville theater of that era involved much more give-and-take between actors and spectators, and much of this behavior was expected, if not entirely condoned. Managers often gauged the success of particular acts
according to the applause, foot stomping, cheers, and insults raining down from the gallery during a performance. Third, there were quite a few young people to be seen on the stage, which must have been exciting for boys in the audience. Children performed as tap and ballet dancers; singers of both popular tunes and operatic arias; dramatic interpreters of Shakespearean monologues; high-flying acrobats and contortionists; snappy-patter comedians; and animal tamers, sharing the stage with dogs, cats, all manner of rodents, birds, goats, and the occasional horse. Irish comedians Edward Harrigan (1845–1911) and Tony Hart (1855–1891), who opened their own theater in 1876, began touring as young children. Harrigan played guitar and sang original compositions, and Hart dressed as a young girl and sang heart-wrenching melodramatic ballads. As adults, Harrigan and Hart used choruses of children—including Harrigan’s own sons—in some of their most popular shows. Lotta Crabtree of San Francisco (1847–1924) gained fame by dancing jigs and polkas. She won such a devoted following that she erected a public drinking fountain to thank her audiences. James Cagney (1899–1986), later a successful film star, began his career as a tap dancer, or “hoofer,” and learned many of his athletic moves from the other dancers and acrobats he met in vaudeville. Fred Astaire (1899–1987) began ballroom dancing on the vaudeville circuit, and Milton Berle (b. 1908) appeared with his young partner, Elizabeth Kennedy (b. 1909), in wisecracking comedy skits. Many acrobatic and musical teams featured siblings or families, such as Ruth and Giles Budd, who performed in the 1910s. As the older of the two, Ruth carried and tossed her little brother both on the ground and while hanging from the
trapeze. The Marx Brothers, Harpo (1893–1964), Gummo (1894–1977), Chico (1891–1961), and Groucho (1890–1977), slept all four in one bed as they toured the country throughout their adolescence—before hitting the big time as movie stars. Gummo, who joined the army in 1918, was replaced by his youngest brother Zeppo (1901–1979) but later returned to show business as a talent agent, managing his brothers’ act for many years. Buster Keaton (1895–1966) began performing with his family at the age of five, in a type of comedy act known as “slap-bang.” Slap-bang revolved around Three Stooges–type misunderstandings that would result in wild fights, including comical punches and eye gouges and the use of special props to hit, poke, and slap. Buster’s name actually came from his ability to withstand an enormous amount of physical abuse by training himself to twist, roll, and fall away from the blows. “Slap-bang” in particular attracted the attention of reformers who were concerned about the physical and moral health of children in vaudeville. Unfortunately, many of the youngest performers were exploited as cheap labor, educated poorly and improperly fed, and abandoned once their youthful appeal wore off. Such conditions mirrored the abuse of working children in industrial jobs and were not unique to the theater world. But one man, Elbridge T. Gerry, was particularly disturbed by the long hours, lack of schooling, and poor health of child actors. In 1874, he founded the New York Society for the Prevention of Cruelty to Children and wrote many articles publicizing and denouncing the working conditions in theaters. Many vaudevillians recall the disruptive practices of the “Gerry society,” which worked to “save” the children by protesting at performances and
bringing legal suits against parents, guardians, and theater managers. Sadly, these children sometimes provided the only livelihood for their families, and the interruption in their careers only worsened their situation. Of course, such controversy did not prevent boys from attending the theater. The content of songs and comedy sketches seen in vaudeville usually reflected current concerns, including the trials and tribulations of the growing immigrant populations arriving in the coastal cities. Young spectators might recognize aspects of themselves or their friends during performances of Irish, German, Jewish, Italian, and Chinese impersonators. Although many of the performers were in fact immigrants, the characters they portrayed were so exaggerated—emphasizing differences of dress, speech, or mannerism—that they could all be called “impersonators.” For instance, the “stage Jew” character had a grotesque face with blackened teeth and a hooked nose enhanced by makeup. He spoke strongly accented English peppered with ridiculous exclamations such as “Und I vish dot I vas dead!” (“And I wish that I was dead!”). The blackface character featured enormous pink lips and wide eyes in contrast to the burnt cork used to darken his face. He, too, spoke an exaggerated dialect intended to mock the inability of African Americans to speak English “correctly.” He would either shuffle around the stage in rags or dart about in fancy, bright-colored clothing he wore to imitate wealthy white people. Such characters usually tried to better their position in life through some scheme, such as finding a tree that grows ham on it, only to be knocked back down by their own stupidity or the superiority of the white society they tried to join. Many historians believe
that vaudeville humor provided a release for ethnic tensions in crowded urban areas, where minorities could laugh at themselves and others as an alternative to violent confrontation. Others describe such ethnic performances as detrimental to the reputation and actual economic conditions of poor immigrants in this country. In either case, vaudeville attracted many ethnic groups as spectators. Therefore, a trip to the theater exposed young people to various cultures—and stereotypes of cultures—both on and off the stage. This may have provided boys with a sense of diverse community, but it also may have contributed to anxieties about life in the increasingly crowded and complicated city. By the late 1920s, the rise of moving pictures, especially the new talkies, threatened vaudeville’s popularity. Furthermore, the Great Depression closed most remaining theaters, as people had lost the extra income to devote to entertainment. When the economy recovered, live performance continued in the forms of musical theater, which presented one long play with musical interludes, and legitimate drama. The song-and-dance girls and cruder comedians of vaudeville were absorbed into the disreputable genre of burlesque, which turned the “leg shows” of the early concert saloons into outright striptease acts. But many of the most successful, family-oriented entertainers, such as the Marx Brothers and George Burns (1896–1996), gave up live performance and went on to great acclaim in radio, film, and television. Television became the vaudeville of the late twentieth century. Its format provides a wide variety of distractions, including music, dance, drama, and comedy. Like vaudeville, its schedule is flexible, and its ticket price is fairly low. Also
like vaudeville, television continues to struggle with standards of propriety, issues of censorship, and the demands of a competitive marketplace. This is particularly true because of television’s role as entertainment for families and children. Boys can now identify with the action heroes, rap stars, and young comedians of television as they once did with Buster Keaton or James Cagney. But this targeting of young audiences by television draws the attention of governmental agencies and parental “watchdog” groups who follow in the footsteps of Elbridge Gerry in trying to protect young people from morally corrupt images and ideas. Ultimately, morality is interpreted by each individual, and the relationship of entertainment to that interpretation is unclear. With the ascendancy of the personal computer, boys can make even more individualized choices about what entertainment to view and how to interpret it. Television may follow vaudeville into the darkness offstage, to be replaced by the enormous online variety show.
Leslie Pasternack
See also Melodrama; Performers and Actors; Theatre
References and further reading
Brockett, Oscar G., and Frank Hildy. 1999. History of the Theatre. 8th ed. Boston: Allyn and Bacon.
Erdman, Harley. 1997. Staging the Jew: The Performance of an American Ethnicity, 1860–1920. New Brunswick: Rutgers University Press.
Gilbert, Douglas. 1940. American Vaudeville, Its Life and Times. New York: McGraw-Hill.
Green, Abel, and Joe Laurie, Jr. 1951. Show Biz: From Vaude to Video. New York: Henry Holt.
Kibler, M. Alison. 1999. Rank Ladies: Gender and Cultural Hierarchy in American Vaudeville. Chapel Hill: University of North Carolina Press.
Levine, Lawrence W. 1997. Highbrow/Lowbrow: The Emergence of Cultural Hierarchy in America. Cambridge, MA: Harvard University Press.
Moody, Richard. 1980. Ned Harrigan: From Corlear’s Hook to Herald Square. Chicago: Nelson-Hall.
Slide, Anthony. 1994. The Encyclopedia of Vaudeville. Westport, CT: Greenwood Press.
Snyder, Robert W. 1989. The Voice of the City: Vaudeville and Popular Culture in New York. New York: Oxford University Press.
Stein, Charles W., ed. 1984. American Vaudeville as Seen by Its Contemporaries. New York: Alfred A. Knopf.
Toll, Robert. 1976. On with the Show! The First Century of Show Business in America. New York: Oxford University Press.
Video Games
Video games comprise an array of computer-based entertainment products whose form combines an animated graphical user interface and the real-time interpretation of user input, applied to a fictional, playful, or nonutilitarian goal. First conceived as an experiment in the possibilities of representation and user interaction in cybernetic systems—an oscilloscope display of simple animation running on a room-sized computer at New York’s Brookhaven National Laboratory in 1958—video games have evolved in their approximately forty years of existence into contemporary three-dimensional virtual-reality systems sold as consumer electronics for the home, growing into a mass-market medium rivaling the film industry in terms of revenue. Current revenues for video games are estimated at $10 billion yearly for the domestic American market. Within recent media culture, video games tend to be characterized in one of two ways.
[Photograph: A boy playing “Sonic the Hedgehog,” 1990s (Bill Varie/Corbis)]
First, video game mastery is thought to constitute a masculine rite of passage of sorts, in which boys bond with each other and pass into adulthood with an upper hand over girls by having attained superior familiarity with the computer skills critical in an information society. Alternatively, video games have been characterized as an addictive, infantile form of fantasy activity through which males act out antisocial tendencies in the abstract world of “cyberspace.” When the knowledge became public that the “school shooters” responsible for the 1999 deaths of fourteen teenagers (including themselves) and one teacher at Columbine High School in Littleton, Colorado, had been avid players of
violent video games, whose competitive goals revolved around causing murder and mayhem to gain points, media hysteria about video games’ contribution to youth violence reached fever pitch. Several themes can be discerned within this intersection of technology, violence, and masculinity. Video games are a technology-intensive product. Although the success of a hit game may rival that of a hit movie, musical recording, or television show, video games are, in functional terms, software. User interaction with a programmed computer is central to the ways consumers create meaningful experiences with these products. The popularity of video games and hardware has overturned a key paradigm in the technology sector. In the 1970s and 1980s, video games constituted a manufacturing sector that inherited technologies from more advanced sectors, such as graphics rendering for scientific visualization or military applications, once the technologies became affordable through increased economies of scale. Currently, however, research and development of a variety of technologies, such as three-dimensional graphics rendering, artificial intelligence, and physical modeling, is directed toward primary deployment in the consumer-oriented, mass-market, high-value sector of video games. Consumers of video games have been primarily, but not exclusively, young males. In recent years, the average age range targeted by video game marketers has increased from the teenage male demographic to the eighteen- to thirty-five-year-old adult male, and increasing numbers of females have appeared as consumers. In the early years of home computing, the primary audience for video games was perceived to be teenage
computer hobbyists oriented toward gaming and programming. The more recent, graphics-intensive 32- and 64-bit machines such as Sony Corporation’s very successful PlayStation console have established a mainstay audience with an older demographic, targeting nightclub, dance, and other young adult cultures. In their evolution from hobbyist machine to mass-market entertainment, the demographics of video game consumers have trended upward. Following the development of an adult market for video games and amid concern among adults and parents that children are being exposed to harmful and violent influences through the medium, a voluntary ratings system was adopted by a consortium of video game developers in 1994, modeled on the ratings system used by the motion picture industry. The ratings adopted by the computer gaming industry indicate its broadly profitable sectors; significantly, they cover an even broader spectrum of specific age ranges than the ratings system for films. Video game ratings are EC, for “early childhood” gamers from three years old and up; E, for “everyone,” or general audiences older than six years; T, for “teen” audiences older than thirteen years; M, for “mature” audiences older than seventeen years; AO, for “adults only,” meaning consumers older than eighteen years; and RP, “rating pending,” meaning not yet rated by the Entertainment Software Rating Board (ESRB). Content descriptions may also appear on packaging, indicating what one of these generic ratings might mean for specific products. They fall into categories such as “violent,” which may range from cartoon violence to “realistic” bloody gore; “language,” from mild to strong, including profanities;
“sexual themes,” including “strong sexual content”; “comic mischief,” including “gross vulgar humor”; and glorified “use of drugs or alcohol.” Since the establishment of the ratings system, more than 7,000 titles have been reviewed, with participation in the system comprising up to 85 percent of video game publishers, according to the ESRB in 2000 (Gudmundsen 2000). The ratings systems for video games have been paralleled by ratings systems for websites and software filtering devices for television and Internet content, indicating a broad expansion of concern by public-interest groups and lawmakers with regard to digital media content in direct proportion to the growing reach of the distribution of these products and the profitability of their markets. In addition, arcade operators such as GameWorks, a joint venture of DreamWorks SKG, Universal Studios, and Sega Corporation, have responded to pressure from federal lawmakers by banning consumers younger than sixteen from playing video games considered violent in their arcades. The film industry’s involvement in the video game market reaches back to the first heady days of runaway profitability, when Warner Communications acquired Pong pioneer Atari Corporation in 1976. The television industry cemented a longer-standing interest in the possibility of delivering interactive television to consumers in 1972, with Magnavox offering the first home video game system, the Odyssey. Even as industry response to public concerns of youth exposure to game violence has increased under pressure from lawmakers, the concerns themselves are not new. Protests were reportedly voiced with regard to the 1976 arcade game Death Race, based on the exploitation
movie Death Race 2000. In spite of the primitive graphics of early games, the gratuitous violence of Death Race, in which arcade game drivers scored points by running over fleeing pedestrians, was enough to spark an outcry. In the early and mid-1990s, advanced graphics techniques meant that games like Doom, a “first-person shooter” in which gamers raced down three-dimensional corridors and blasted all comers to gory bits in order to stay alive and win points, and Mortal Kombat, in which victorious characters were pictured graphically ripping out the bloody spines of their fallen opponents, appeared particularly threatening to impressionable youth playing in the privacy of their own homes—or worse yet, alone in their own rooms. These debates on the negative values of video games have occurred in the midst of changing conditions in the video game market. The 1983 film WarGames dramatized the dangers of a young boy who was both hacker and gamer, but it also suggested that only computer-literate youth with a keen sense of the value of play over the value of conflict could avoid an apocalyptic Cold War conflagration invited by the proposed Star Wars missile defense system. Hackers and gamers today seem to be separate camps, largely because video games now are so complex that an amateur user able to program a game single-handedly would be extremely rare (early game machines like the beloved Atari did offer users the capability of programming their own games). With the sophistication of graphical user interfaces, real-time three-dimensional graphics, and complex gestural input sequences controlling onscreen action, gaming and programming have fallen into overlapping but distinct domains of reception and production, respectively.
Still, although the text-oriented adventures of early home gaming might be found now only in less profitable or noncommercial domains of networked multiuser dungeons (MUDs) or object-oriented MUDs (MOOs), the broad genres of video game play have largely remained static over the years, with hybrids between genres arising to take advantage of advances in computing power. Genres include computerized versions of traditional games, such as chess and solitaire; role-playing fantasies derived from games like Dungeons and Dragons; simulations of realistic action situations such as jet flight and drag racing; war games such as the classics Tank and Space Invaders; first-person shooters such as Doom; adventure explorations such as Myst; competition games such as Bust a Groove, in which players show off their best “dance moves”; simulations of science, history, or culture such as Sim City; cybernetic versions of earlier arcade games such as the shooting gallery; sports games based on major-league athletics such as football and basketball; and “edutainment” such as “Where in the World Is Carmen Sandiego?” As graphics, sound, and interaction capabilities have increased, games have hybridized genres, with adventure taking place in three-dimensional worlds where action sequences lead to riddle-solving activities, rewarded by full-motion video playback that advances the narrative by revealing more details of character or background. In recent years, the perceived computer advantage held by boys by virtue of their being the primary marketing target for video games has led to research, development, and design of a new genre: girl games. Girl games have been oriented toward attracting girls who, as a group, are observed
to turn away from computers as they enter the teen years, becoming more interested in social activities oriented toward communication and sharing, as opposed to activities dominated by boys, such as competitive sports and video games, or technical activities such as computer programming. The genre of girl games has been put forward as a way of bringing young girls closer to computers, enhancing girls’ familiarity with common hardware and software platforms and paradigms, and broadening computer-based play to include less competitive, nonviolent activities. Approaches taken to achieve these goals have been controversial, even as the games produced toward these ends have been various. Games such as Barbie Fashion Designer have sold in massive quantities, qualifying as bona fide hits in the software marketplace and demonstrating that the market for girls can be as large and as profitable as that aimed at boys. However, critics have responded to such products in positive and negative ways. Some girl games have been criticized for reducing female identity and pleasure to traditional norms of beauty and bodily appearance. In this sense, competition is not absent from girl games, which turn out not simply to emphasize communication and sharing but also to restage an older problem of women competing against each other for men’s attention. From this perspective, the video game industry is held to allow room for games for girls so long as the games perpetuate traditional roles for women. Feminist entrepreneurs in the field have characterized purportedly male-oriented video games as uninteresting and have attempted to design games that move beyond traditional girls’ play activities. These entrepreneurs’ goal is to engage and encourage the positive
values of nurturing and communicating that girls are thought to be good at and interested in. Yet some female gamers have responded by pointing out that there is nothing at all “boring” about exploring unknown planets, fighting off alien invaders, and building new worlds—traditionally male fantasies made accessible to girls and women through gaming. The relationship between boyhood and computer play has shifted as consumer products and recreational environments have changed. Where once, in a less technological culture, boys might have formed their responses and engagements with masculinity in clubs and groups outside the home, out of sight of fathers, mothers, and sisters, today fewer open spaces outside the home exist to play in, and fewer noncommercial, safe settings are available for play, especially in large cities. Traditional American roles for boys are undergoing stress, and the environments in which these roles are established are increasingly urban and domestic. Even though the computer makes a virtual setting for play possible, if gaming is meant to transition boys into the workplace, the passage from boyhood to manhood may be less clear than previously, since the setting of the play approximates the setting of work. Yet although gaming seems certain to bring youth closer to computers in general, the specific value of commercial video game play for developing work skills seems less apparent than in earlier years, when gamers might program their own play activities. Amid these tensions, the violent video game might signify a certain loss: the “first-person shooter” might emblematize an increasing lack of meaningful play activities available to growing boys as cultural values shift.
James Tobias
See also Computers
References and further reading
AtariWorld.com. “The Atari Timeline,” http://www.atariworld.com/AtariTimeline.html (accessed December 27, 2000).
Bennahum, David S. 1998. Extra Life: Coming of Age in Cyberspace. New York: Basic Books.
Buckingham, David. 1993. Children Talking Television: The Making of Television Literacy. London: Falmer Press.
Formanek-Brunell, Miriam. 1993. Made to Play House: Dolls and the Commercialization of American Girlhood 1830–1930. New Haven: Yale University Press.
GeekComix.com. “A Brief History of Home Video Games,” http://www.geekcomix.com/vgh/main.shtml (accessed December 27, 2000).
Graetz, J. M. 1981. “The Origin of SpaceWar.” Creative Computing (August).
Greenfield, Patricia Marks. 1984. Mind and Media: The Effects of Television, Video Games, and Computers. Cambridge: Harvard University Press.
Gudmundsen, Jinny. 2000. “Strategy for Parents: Use Ratings, Be Involved. Choosing Titles by the Letters.” Los Angeles Times, October 26, T8.
Herz, J. C. 1997. Joystick Nation: How Computer Games Ate Our Quarters, Won Our Hearts and Rewired Our Minds. New York: Little, Brown.
Huffstutter, P. J., and Claudia Eller. 2000. “GameWorks to Restrict Youngsters at Arcades.” Los Angeles Times, October 6, C1.
Hunter, William. 2000. “The Dot Eaters: Videogame History 101,” http://www.emuunlim.com/doteaters/index.html (accessed December 27, 2000).
Jenkins, Henry. 1998. “‘Complete Freedom of Movement’: Video Games as Gendered Play Spaces.” Pp. 262–297 in From Barbie to Mortal Kombat: Gender and Computer Games. Edited by Henry Jenkins and Justine Cassell. Cambridge, MA: MIT Press.
Kafai, Yasmin B. 1998. “Video Game Designs by Girls and Boys: Variability and Consistency of Gender Differences.” Pp. 90–117 in From Barbie to Mortal Kombat: Gender and Computer Games. Edited by Henry Jenkins and Justine Cassell. Cambridge, MA: MIT Press.
Kinder, Marsha, ed. 1999. Kids’ Media Culture. Durham: Duke University Press.
Kline, Stephen. 1993. Out of the Garden: Toys, TV and Children’s Culture in the Age of Marketing. London: Verso.
Rushkoff, Douglas. 1996. Playing the Future: How Kids’ Culture Can Teach Us to Thrive in an Age of Chaos. New York: HarperCollins.
Sheff, David. 1993. Game Over: How Nintendo Zapped an American Industry, Captured Your Dollars, and Enslaved Your Children. New York: Random House.
Videogames.com. “The History of Video Games,” http://www.videogames.com/features/universal/hov/ (accessed December 27, 2000).
“Voices from the Combat Zone: Game Grrrlz Talk Back.” Pp. 328–341 in From Barbie to Mortal Kombat: Gender and Computer Games. Edited by Henry Jenkins and Justine Cassell. Cambridge, MA: MIT Press.
Violence, History of
American boys encounter violence in each of the three major settings of their lives—their homes, their schools, and their communities. Abusive parents and siblings, bullying students, teachers who inflict physical punishment, and threatening communities and gang members all expose young boys to violence and potential harm. It stands to reason that boys who fear victimization at the hands of others will develop their own violent tendencies as a means of defense and protection. In this way, a cycle of violence is initiated. This cycle is far from new; its roots can be traced through centuries of history. But in a modern world with modern weapons, aggression can result in much more serious consequences for both the victims and perpetrators of violence. In order to stop the cycle, violence in the lives of American
boys must be understood as an ongoing tradition that has a long history and an undetermined future. Violence against boys in their families is far from new. Colonial religious leaders taught parents that children were born tainted with sin and that a child’s disobedience was the manifestation of this sin. Parents were told to stifle expressions of sin (e.g., resistance and willfulness) with stern and often harsh disciplinary practices. Fathers were the unchallenged rulers of children and the primary family disciplinarians until the late eighteenth and early nineteenth centuries. However, as the mode of production shifted from the family to the market, men left their primary role as day-to-day heads of families to work outside the home, and women became the primary keepers of children. It is thought that severe corporal punishment and child abuse decreased with this shift to maternal discipline. Today, this pattern remains. Mothers are overwhelmingly the primary caregivers of American children, and by extension, mothers are their primary disciplinarians as well. National data indicate that women are more likely than men to physically abuse their children (Sedlak and Broadhurst 1996), but this finding could merely be a result of the fact that children spend much more time with their mothers than with their fathers. Even though most children spend more time with their mothers, 89 percent of children who are sexually abused by one of their birth parents are sexually abused by their fathers (Sedlak and Broadhurst 1996). In addition, 23 percent of physically abused children, 14 percent of emotionally abused children, and 46 percent of sexually abused children are abused by adoptive, step-, or foster parents (Sedlak and Broadhurst 1996). A significantly
higher number of sexual abuse cases are perpetrated by stepfathers than by biological fathers, biological mothers, or nonbiological mothers. In 1994, one-third of all child murders in the United States for which there was a known perpetrator involved a child being killed by a family member (Greenfeld 1996). By their second birthday, more than 90 percent of American children sustain at least one act of physical aggression at the hands of their parents (Straus and Gelles 1990). Physical abuse in childhood has been linked to many forms of maladjustment, including insecure primary attachments, posttraumatic stress disorder, academic difficulty, diminished self-esteem, delinquency, and violence (see Wekerle and Wolfe 1996 for a review). The Third National Incidence Study of Child Abuse and Neglect (NIS-3) estimated that boys from infancy to age seventeen sustain physical abuse at a rate of more than 9 per 1,000, a rate that does not differ significantly from the corresponding rate for girls (Sedlak and Broadhurst 1996). The NIS-3 data also indicate that, although boys are less likely than girls to be sexually abused, they are more likely to be emotionally neglected. Also, boys tend to be more seriously injured by physical abuse than girls. The U.S. Department of Health and Human Services reported in 1996 that child maltreatment appears to be on the rise. Children are not only victimized directly in their families but also indirectly through exposure to violence between their parents. As with corporal punishment of children, wife beating has been common throughout history. Laws supported a husband’s use of physical aggression as a means of getting his wife to conform to his demands. For example, the
English “Rule of Thumb Law” allowed husbands to beat their wives with objects as long as the objects were no thicker than the man’s thumb. Historical reviews of colonial life portray men as the primary—if not sole—perpetrators of marital violence. This is no longer the case, as evidence suggests that men and women perpetrate marital violence at similar rates and frequencies. This changing trend can be explained largely by the fact that marital violence is less about physical aggression and more about inequality and the struggle for marital power. As women gained status and economic independence in American society, they became more resistant to their husbands’ attempts to assert power over them. Surprisingly, national data reveal that women are as likely as men to initiate marital violence, refuting the notion that women only engage in marital violence defensively in response to attacks initiated by their husbands. However, marital violence perpetrated by men tends to result in more physical harm than does marital violence perpetrated by women (Stets and Straus 1990). Today, millions of American children are exposed to marital violence each year. Specific estimates range from 3.3 million (Carlson 1984) to more than 10 million (Straus 1992). Research indicates that children exposed to marital violence exhibit less social competence and more aggression, anxiety, depression, and withdrawal than do their nonexposed counterparts (for reviews, see Edelson 1999; Fantuzzo and Lindquist 1989). Analyses of national data suggest that boys are more likely than girls to witness violence between their parents (Vorrasi, Eckenrode, and Izzo 2000). Since the 1990s, violence perpetrated by children against their parents has received increased attention due, in part, to
a series of highly publicized instances of parricide (the killing of one’s parent or parents). For example, in Los Angeles in 1989, brothers Eric and Lyle Menendez, who complained of years of physical, psychological, and sexual abuse at the hands of their parents, shot and killed their mother and father. In 1997 in Pearl, Mississippi, sixteen-year-old Luke Woodham savagely killed his mother and hours later opened fire on his classmates, killing two. Woodham cited a desire for peer acceptance as his motive for killing his mother. In Springfield, Oregon, in 1998, fifteen-year-old Kip Kinkel, citing a desire to save his parents from the embarrassment they would have endured having to tell their friends that their son was charged with illegal possession of a stolen firearm, shot and killed his mother and father. Despite recent interest, parricide remains an uncommon occurrence. National estimates indicate that parricides account for approximately 2 percent of all murders in this country (Dawson and Langan 1994). Several risk factors are common to most instances of parricide, including patterns of child abuse and neglect, alcohol abuse, social isolation, mental illness, suicidal ideation, and access to firearms. A study by Adam Weisman and Kaushal Sharma (1997) found that approximately 75 percent of parricide offenders have criminal records prior to killing their parent(s), but the best predictor of parricide remains the gender of the assailant. With only a handful of exceptions, parricide offenders are overwhelmingly male. Not only in the home but also in schools, boys have been the victims and the perpetrators of violence. At the outset of colonization in the seventeenth century, American schools were set up
by their Puritan founders to ensure that everyone could read the Bible. Acting in loco parentis, teachers, who were almost all male until the nineteenth century, used the same harsh punishments as parents. Children were required to submit to the will of the teacher just as they would with their own parents. Violence against children in the schools had the potential to be even harsher than that in the home because parents had fewer children to manage and a greater stake in youngsters’ well-being than did teachers. Furthermore, the school setting was conducive to neither learning nor good behavior, especially for younger children. Pupils of all ages and abilities learned together. Lessons were dull and repetitive. Generally, the only reading material was the Bible, and teachers were poorly trained, often having little more education than their pupils and no formal training in teaching. Although whipping and flogging were the most common punishments, other more unusual forms of violence were inflicted upon students. For example, children were forced to hold an inkstand or book at arm’s length or made to bend over before a hot stove and remain in this position with one finger on a peg in the floor (Bybee and Gee 1982). Other reported punishments included wedging children’s mouths open with a wooden block, tying them to a chair leg for an hour, and locking them in closets (Crews and Counts 1997). The Reverend Warren Burton explained that “such methods of correcting offenders have been in use time out of mind” (Bybee and Gee 1982). In the middle of the eighteenth century, reports appeared depicting school discipline problems, student rebellions, and attacks on teachers. One schoolmaster, for example, was set upon by three
large boys after he hit the younger brother of one of them, drawing blood. The boys wrested his weapon from him, dragged him out of the schoolhouse, and threw him down a steep incline (Bybee and Gee 1982). As the population of the United States grew and diversified with the influx of immigrants, a need for universal education was perceived. The 1830s saw the rise of the common school movement, the main thrust of which was to instill a uniform set of social and moral values in the country’s youth. At this time, the control of schools began shifting from the community to the state. Although one of the tenets of the common school movement was that harsh punishments be avoided, physical violence against children remained common in the schools. As state regulation of education spread, parents had less influence over how their children were treated in school. Several court rulings in the latter half of the nineteenth century overrode parental authority in favor of teacher authority, including the authority to use violent means to control student behavior (Rovetta and Rovetta 1968). Nevertheless, court records of this period show evidence of a growing concern from parents regarding the use of violence in maintaining school discipline. Another shift in education at this time was the movement toward training and hiring women as teachers, especially of younger children. Women teachers were less likely than their male counterparts to use physical force to control children’s behavior. This, along with the public outcry against corporal punishment, resulted in restrictions on its practice in many states beginning in the middle of the nineteenth century and continuing until recently (Rovetta and Rovetta 1968). Schools became more child-centered.
Curriculum became more relevant and interesting, and discipline became gentler. A period of relative calm reigned in the nation’s schools in the first half of the twentieth century (Crews and Counts 1997). A 1949 survey revealed that school principals considered student lying and disrespect to be the biggest problems in their schools. Though corporal punishment is no longer endorsed in most schools, teacher violence against children has persisted to the present time. By 1989, twenty-one states had abolished corporal punishment, but thirteen states still authorized its use by teachers or other school personnel (Hyman 1990). There is little evidence, however, that a return to more violent means of controlling student behavior would result in a reduction of student violence against peers or teachers. Children who are hit by teachers when they are too small to fight back may retaliate when they have grown (Welsh 1976). That schools in the past were described as battlegrounds between students and teachers suggests that corporal punishment was never very effective even when it was the normative form of discipline. Teachers who used it successfully appear to have done so because they were bigger and stronger than their pupils and could beat them in hand-to-hand combat. With the increased availability of handguns, however, even the smallest children have the potential to win a battle against the teacher. Although interpersonal violence among students is mentioned before the middle of the twentieth century, it is difficult to determine its prevalence and severity. Where references to bullying or fighting are made, it is in the context of other discipline problems and not portrayed as any more severe than other rule violations. As
severe forms of physical punishment were still sanctioned, however, it is possible that physical violence per se was not frowned upon, but only the disruption that fighting caused to classroom routine. By 1956, however, school violence, especially student attacks on teachers, had become a general concern. Desegregation brought a new form of violence to the schools—threats to the students by the general public. Early attempts at desegregation in the South required the National Guard to protect African American children not only from other students and teachers but from white parents and community members who were opposed to integrated schools (Crews and Counts 1997). Interracial violence is still a problem in many schools, but the phenomenon is usually associated with gang activity and therefore seen as part of the more general issue of gang violence in schools. It is often unclear when violence should be defined as “bullying,” a phenomenon common to almost every school. However, when violence is defined as “gang activity,” it generally implies a more serious problem than does “bullying.” The use of weapons and symbols, as well as attacks targeted against students of certain races or ethnic groups, usually indicates gang-related violence. The 1970s saw a tremendous increase in student-perpetrated violence against teachers and fellow students (Warner, Weist, and Krulak 1999). Moreover, students began carrying and using weapons more frequently, resulting in more severe consequences of this violence. In the last two decades of the twentieth century, many students felt that they needed to bring a weapon to school in order to protect themselves from bullies and gangs. A Harvard School of Public Health survey reported that 12 percent of children,
kindergarten through twelfth grade, reported carrying weapons to school in 1995, with handguns accounting for one-third of these reports (Kopka 1997). A rash of “school shootings” in the 1980s and 1990s raised public alarm about guns in schools, especially because these shootings were not committed by gang members but rather by upper-middle-class white boys who had been victimized by bullies. These acts of “retaliatory violence” spurred many school administrators to adopt a “no tolerance” position regarding guns, racism, and violence in their schools. Nevertheless, in 1993, 82 percent of school principals reported an increase in violence in their schools during the previous five years, 60 percent of schools had at least one weapons incident, and 13 percent reported at least one knifing or shooting (Warner, Weist, and Krulak 1999). Some sources indicate a leveling-off of interstudent violence in the mid- to late 1990s, but others report an increase up to the present time (Warner, Weist, and Krulak 1999). Not only do boys encounter violence in their homes and schools, but they also experience it in their communities. Youth gangs have existed in America since the late eighteenth century. Gang members have typically been young urban males from poor neighborhoods. The gang phenomenon grew in size as massive waves of immigrants poured through cities such as New York, Chicago, Boston, and Philadelphia in the mid- to late 1800s. Anti-Catholic riots and immigrant lynchings in urban areas forced many Irish, Jewish, and Italian immigrants to form protective gangs due to constant fear of harassment and brutality at the hands of prejudiced men and police officers who blamed the immigrants for all of society’s ills (Klein 1995).
Slum neighborhoods such as New York City’s Five Points and Hell’s Kitchen were the breeding ground for many of the early youth gangs. These groups were generally set up along racial or ethnic lines. Some gangs, like the Irish Molly Maguires and the Sicilian Cosa Nostra, were formed as American branches of Old World gangs. Other gangs such as the Irish Bowery Boys in New York City and the white southern Ku Klux Klan were formed as distinctly American organizations (Klein 1996). The members of the first American gangs were impoverished. Illicit economies offered poor boys a way to make a lot of money very quickly. Gang crimes soon escalated from petty theft, fighting, and burglary to extortion, strikebreaking, and other organized rackets such as “numbers running,” prostitution, and gambling. By the end of the nineteenth century, the template for American gangs was set. Gangs were groups of young men of the same race and ethnicity who engaged in criminal activities, used symbols in communications, controlled specific territories, and ran illicit economies within those territories (Curry and Decker 1998). Because these gangs functioned outside the law, conflicts between gangs were settled by the gangs themselves—typically violently. The immigration boom in the early part of the twentieth century, coupled with the advent of Prohibition in 1920, changed the face of gang crime completely. Gangsters now had millions of potential customers who would pay top dollar for illegal alcohol. The distilling, bottling, transportation, distribution, and selling of illegal alcohol became a billion-dollar industry. Organized crime became a profession run by adult men rather than teenage delinquents, but when Prohibition ended, the
mob had to branch out into other areas in order to remain profitable. Youth gang members became the primary salesmen of organized crime’s new cash crop—illegal drugs. In the latter half of the twentieth century, the face of urban youth gangs changed from primarily European groups to Hispanic and African American groups. The main source of income for these gangs was the sale and distribution of illegal drugs, such as marijuana, cocaine, amphetamines, and heroin. Hispanic gangs such as the Latin Kings and Latin Disciples and African American gangs such as the Vice Lords and Black Disciple Nation took control of the drug trade in large territories of major cities like New York and Chicago. In southern California, Mexican American gangs, or cholos, dominated most drug trafficking between the United States and Mexico. In Los Angeles, the rival African American gangs, the Bloods and the Crips, became so large and prosperous that they crossed state lines and are now nationwide organizations with activity reported in forty-two states (Skolnick et al. 1988). The estimated number of gangs grew exponentially, from 2,000 gangs with 100,000 members in 1980 to over 31,000 gangs with 846,000 members in 1996 (Moore and Terrett 1998). The use of intense stimulants, coupled with an increased access to high-powered rapid-fire weapons and a heightened demand for illegal drugs, resulted in a resurgence of gang violence (Klein and Maxson 1989). Deadly gunfights between rival gangs known as “gangbangs” and planned ambushes in the form of “drive-by” shootings became commonplace in many American cities. The aggressive defense of gang territories and the use of
identifying “colors” such as blue for the Crips and red for the Bloods led to frequent street violence that often resulted in the death of innocent bystanders. The introduction of crack cocaine in the 1980s only increased the violence. Crack is a less expensive yet highly addictive form of cocaine that provides the user with an intense but quick high, resulting in a heightened demand for the drug. The “crack epidemic” led to an unprecedented government focus on gang violence and drug use during the Reagan era. Currently, the typical gang member is a male between the ages of twelve and twenty-four; his average age is seventeen to eighteen. Traditional territorial gangs average about 180 members, but gangs that specialize in a certain area of drug trafficking average about 25 members. However, some large city gangs number in the thousands, and the largest nationally affiliated gangs number in the tens of thousands (Howell 1998). According to a nationwide survey, about 90 percent of all gang members are either African American or Hispanic; the other 10 percent are either Asian or white (Curry and Decker 1998). Though other researchers have found slightly different percentages, the overrepresentation of African Americans and Hispanics in gangs is a stable finding across studies. Although gang violence among African Americans is typically related to the drug trade, Hispanic gang violence is more often the result of disputes over gang territory or “barrio” (Block, Christakos, and Przybylski 1996). Though gang members still constituted a very small minority of American youths, gangs were more prevalent in the 1990s than in the 1970s and 1980s (Klein 1995). Recent trends in gang activity include elevated drug use and violence, increased
female membership, and the migration of gang affiliations across city and state lines. Nevertheless, the reasons for joining a gang remain the same. These reasons include protection from other gangs, access to money from illicit economies, increased prestige, excitement, and a strengthened sense of identity and belonging. Some of the corresponding risk factors for gang membership are the presence of gangs in the neighborhood, poverty, lack of economic or social opportunity, high levels of crime, lack of adult male role models, academic failure, friends or family members in gangs, delinquency, aggression, victimization, and alcohol or drug use. Intervention and gang prevention programs have focused on these risk factors, but consistent positive results for any one type of program have not yet been found (Howell 1998). Gang violence is usually directed at other gangs. A longitudinal study in Chicago found that 75 percent of gang-related homicides were intergang killings, 11 percent were intragang, and only 14 percent involved the killing of a nongang victim (Block, Christakos, and Przybylski 1996). Gang members are sixty times more likely to be killed than nongang members, and gang homicides are unique in that each killing creates an extremely high potential for deadly retaliation (Morales 1992). Though drug trafficking is common among inner-city gangs, it is not the prime cause of gang violence. Most gang homicides are the result of intergang conflict, territorial disputes, acts of retaliation, and defending gang “honor” or reputation (Block, Christakos, and Przybylski 1996). Major cities such as New York, Boston, Chicago, and Los Angeles are currently working to reduce gang violence by establishing community-based programs
grounded in social prevention, vocational rehabilitation, and strict gun control (Howell 1998). Violence has been a significant part of American life throughout the nation’s history. The country was founded by a bloody revolution, reconstructed after a brutal civil war, and defended through a succession of fierce military conflicts. It must be remembered that many of America’s soldiers were young boys—barely eighteen years old or even younger—who were taken out of school or off the farm and expected to kill and die for their country. The violence demanded of these boys by their nation was not invented in army training camps. American boys learned to fight at home, in school, and in the streets of their community. Many of them learned how to take punches from abusive parents, teachers, schoolyard bullies, and street gangs. Many of them also learned how to throw punches from the same people. The double-edged dilemma of violence in the lives of American boys should be understood as a product of both the violence expected of them and the violence inflicted upon them by American society.
William Indick
Joseph A. Vorrasi
Faith Markle
See also Abuse; Bullying; Discipline; Gangs; Schools, Public; Violence, Theories of
References and further reading
Block, Carolyn Rebecca, Antigone Christakos, and R. Przybylski. 1996. “Street Gangs and Crime: Patterns and Trends in Chicago.” Research Bulletin. Chicago: Criminal Justice Information Authority.
Bybee, Rodger W., and E. Gordon Gee. 1982. Violence, Values, and Justice in the Schools. Boston: Allyn and Bacon.
Carlson, Eve B. 1984. “Children’s Observations of Interparental Violence.” In Battered Women and Their Families. Edited by A. R. Roberts. New York: Springer Publishing.
Crews, Gordon A., and M. Reid Counts. 1997. The Evolution of School Disturbance in America: Colonial Times to Modern Day. Westport, CT: Praeger.
Curry, G. David, and Scott H. Decker. 1998. Confronting Gangs: Crime and Community. Los Angeles: Roxbury.
Dawson, John M., and Patrick A. Langan. 1994. Murder in Families. Washington, DC: U.S. Department of Justice, Bureau of Justice Statistics.
Edelson, Jeffery L. 1999. “Children’s Witnessing of Adult Domestic Violence.” Journal of Interpersonal Violence 14, no. 8: 839–870.
Fantuzzo, John W., and Carrol U. Lindquist. 1989. “The Effects of Observing Conjugal Violence on Children: A Review and Analysis of Research Methodology.” Journal of Family Violence 4, no. 1: 77–93.
Garbarino, James. 1998. Lost Boys: Why Our Sons Turn Violent and How We Can Save Them. New York: Free Press.
Greenfeld, Lawrence A. 1996. Child Victimizers: Violent Offenders and Their Victims. Washington, DC: Office of Juvenile Justice and Delinquency Prevention.
Howell, James C. 1998. “Youth Gangs: An Overview.” Juvenile Justice Bulletin. Washington, DC: U.S. Department of Justice, Office of Juvenile Justice and Delinquency Prevention.
Hyman, Irwin A. 1990. Reading, Writing and the Hickory Stick: The Appalling Story of Physical and Psychological Abuse in American Schools. Lexington, MA: Lexington Books.
Hyman, Irwin A., and James H. Wise. 1979. Corporal Punishment in American Education: Readings in History, Practice, and Alternatives. Philadelphia, PA: Temple University Press.
Klein, Malcolm W. 1995. The American Street Gang. New York: Oxford University Press.
———. 1996. “Gangs in the United States and Europe.” European Journal on Criminal Policy and Research (special issue): 63–80.
Klein, Malcolm W., and Cheryl Lee Maxson. 1989. “Street Gang Violence.” Pp. 198–234 in Violent Crime, Violent Criminals. Edited by M. E. Wolfgang and M. A. Weiner. Newbury Park, CA: Sage.
Kopka, Deborah L. 1997. School Violence: A Reference Handbook. Santa Barbara, CA: ABC-CLIO.
Moore, John P., and Craig P. Terrett. 1998. Highlights of the 1996 National Youth Gang Survey. Washington, DC: U.S. Department of Justice, Office of Juvenile Justice and Delinquency Prevention.
Morales, Armando. 1992. “A Clinical Model for the Prevention of Gang Violence and Homicide.” Pp. 105–118 in Substance Abuse and Gang Violence. Edited by R. C. Cervantes. Newbury Park, CA: Sage.
Rovetta, Catherine Humbargar, and Leon Rovetta. 1968. Teacher Spanks Johnny: A Handbook for Teachers. Stockton, CA: Willow House Publishers.
Sedlak, Andrea J., and Debra D. Broadhurst. 1996. Third National Incidence Study of Child Abuse and Neglect: Final Report. Washington, DC: U.S. Department of Health and Human Services.
Skolnick, Jerome H., Theodore Correl, Elizabeth Navarro, and Roger Rabb. 1988. The Social Structure of Street Drug Dealing. Unpublished report to the Office of the Attorney General of the State of California. Berkeley: University of California at Berkeley.
Stets, Joan E., and Murray A. Straus. 1990. “Gender Differences in Reporting Marital Violence and Its Medical and Psychological Consequences.” Pp. 151–165 in Physical Violence in American Families: Risk Factors and Adaptations to Violence in 8,145 Families. Edited by M. A. Straus and R. J. Gelles. New Brunswick, NJ: Transaction.
Straus, Murray A. 1992. Children as Witnesses to Marital Violence: A Risk Factor for Lifelong Problems among a Nationally Representative Sample of American Men and Women. Report of the 23rd Ross Roundtable. Columbus, OH: Ross Laboratories.
Straus, Murray A., and Richard J. Gelles, eds. 1990. Physical Violence in American Families: Risk Factors and
Adaptations to Violence in 8,145 Families. New Brunswick, NJ: Transaction.
U.S. Department of Education, National Center for Education Statistics. 1998. Violence and Discipline Problems in U.S. Public Schools: 1996–1997. NCES 98-030. Washington, DC: U.S. Government Printing Office.
U.S. Departments of Education and Justice. 2000. Indicators of School Crime and Safety, 2000. NCES 2001-017/NCJ-184176. Washington, DC: U.S. Government Printing Office.
Vorrasi, Joseph A., John J. Eckenrode, and Charles V. Izzo. 2000. Intergenerational Transmission of Marital Violence: A Gender-Similarity Hypothesis. Paper presented at the Fifth International Conference on the Victimization of Children and Youth, Durham, NH.
Warner, Beth S., Mark D. Weist, and Amy Krulak. 1999. “Risk Factors for School Violence.” Urban Education 34: 52–68.
Weisman, Adam M., and Kaushal K. Sharma. 1997. “Parricide and Attempted Parricide: Forensic Data and Psychological Results.” In The Nature of Homicide: Trends and Changes. Washington, DC: U.S. Department of Justice, Office of Justice Programs, National Institute of Justice.
Wekerle, Christine, and David A. Wolfe. 1996. “Child Maltreatment.” In Child Psychopathology. Edited by E. J. Mash and R. A. Barkley. New York: Guilford Press.
Welsh, Ralph S. 1976. “Severe Parental Punishment and Delinquency: A Developmental Theory.” Journal of Clinical Child Psychology 5, no. 1: 17–21.
Violence, Theories of
Theories about the origin of violence can be classified on the basis of whether their authors locate the causes of violence in the organism itself, in its environment, or in both places. The biological purist spots the cause of violence in the organism itself. Sarnoff Mednick (1977) has provided an explanation of this type. According to him, punishment is both the most practical and efficient means for teaching children to behave as adults wish. He believes that children become violent because they inherited a dull central nervous system that prevents them from learning from punishment. His entire theory can be boiled down to three simple propositions: (1) children’s genes determine the sensitivity of their central nervous systems; (2) the sensitivity of children’s central nervous systems determines their level of fearfulness; and (3) children’s fearfulness determines whether they can learn from punishment. For example, a boy who inherits a dull central nervous system feels the urge to react violently when provoked. Because past punishment has failed to make him fearful of taking violent action, he acts on rather than inhibits this urge. No matter how repeatedly or severely such boys are punished, they are incapable of learning to inhibit their violent urges. A boy who inherits a sensitive central nervous system will also feel the urge to react violently when provoked, but because of past punishment, this boy will become fearful and inhibit his violent urge. The inhibition of this urge reduces this boy’s fearfulness, which rewards him for his nonviolent reaction. According to this theory, boys more often than girls become violent because the genes that wire the central nervous system are sex-linked. Unlike the biological purists, the environmental purist spots the cause of violence in boys’ living habitats rather than in their bodies. Marvin Wolfgang and Franco Ferracuti (1967, 143) provide an environmental theory of violence. They equate the environment with a culture, the principal components of which are norms and values. However, the cultural environment is heterogeneous. The norms and
values that prevail in one environmental niche may not prevail in another. Their basic underlying assumption is that children absorb the prevailing norms and values of their environmental niche like a dry sponge dropped into a large pool of water (Sutherland 1973, 43). According to them, boys become violent from living in a “subculture of violence,” an environmental niche in which “pro-” rather than “antiviolent” norms and values prevail. The subculture of violence explanation is also based on three simple propositions: (1) children are exposed to different environmental niches or subcultures; (2) children who have been exposed more to violent rather than nonviolent subcultures absorb “pro-” rather than “antiviolent” norms and values; and (3) the absorption of proviolent norms and values creates a violence-prone personality. Thus, it is the relative amount of exposure that children have had to violent and nonviolent subcultures that accounts for their violent behavior. For example, when someone provokes a boy who has had greater exposure to a violent subculture, his norms would dictate that he act violently rather than nonviolently, and his values would cast his taking violent action in a positive light and his taking nonviolent action in a negative one. In contrast, a boy who has had greater exposure to a nonviolent subculture would have the opposite reactions. According to this theory, boys more often than girls become violent, not because their central nervous systems are wired differently but because boys have more contact with violent subcultures. Unlike the biological and environmental purists, eclectics spot the causes of violence in multiple factors located in the organism and the environment rather than in one or the other place alone.
Dorothy Lewis (1992, 1998) prefers an eclectic theory. According to her, at least five biological and environmental factors must come together for children to become violent. The first factor is the “XY syndrome,” from which all males suffer, Lewis believes (1998, 287–288). Because, according to her, the XY syndrome is a genetically determined condition that has two defining characteristics, high androgen production and a “masculinized brain,” she locates this syndrome wholly in the body. Like the XY syndrome, brain damage, the second factor, is also located in the body and therefore is an organic factor. If children suffer injuries that disrupt the pathways between their frontal lobes and the brain’s reptilian base buried beneath it, then they cannot control their primitive urges. Lewis (1998, 288) describes this condition as analogous to driving a truck with worn-out brakes. The third organic factor is an overly sensitive amygdala, the portion of the brain “hidden within each temporal lobe” that is primarily responsible for “our sense of fear.” Regarding this brain disorder, Lewis (1998, 288) observes: “We cannot do without the amygdala. But fear is often the nidus for paranoia. A certain amount of fear is necessary for survival. On the other hand, too much can make us dangerous.” The concentration level of neurotransmitters in the brain is the fourth organic factor. Lowered levels of neurotransmitters such as serotonin cause children to be irritable and prone to anger (1998, 289). Unlike the first four factors, the fifth and final factor, “violent abuse,” is the only environmental factor. It can result from children being either the actual victim of a physical attack or an eyewitness to one. In either case, the violent abuser provides them with a role model for their own future violent behavior.
In short, the presence of two or three of these factors in a child’s life is not enough to make him or her become violent. Instead, all five factors must coalesce for this to happen (1992, 387–389). Brain damage (factor two), amygdala disorder (factor three), and neurotransmitter depression (factor four) only make children susceptible to becoming violent. In order for them to succumb to this susceptibility, they must also be violently abused (factor five). More boys than girls become violent, not because girls suffer significantly less often from brain damage, amygdala disorders, depressed neurotransmitter levels, or violent abuse but because of their immunity to the XY syndrome (factor one). The chief weakness of the violence theories of biological purists, environmental purists, and eclectics is that they do not take into account the essential character of “human experience.” The term human experience refers to the outer physical reactions together with the inner thoughts and emotions that occur when human beings interact with their environment at a particular point in time (Dewey 1929). Both a higher organism—one with a mind—and an environment are needed for a human experience to occur. However, a human experience results from the interaction between an environmental niche and the human organism as a whole, not some special organ of it, such as the brain alone. Moreover, human organisms, their environmental niche, and their ongoing experiences exist in an interdependent relationship to one another (Mead 1934, 129). The experience produced from the interaction between a human being and the environment changes, however slightly, not only the human organism but also the environmental niche, such as a neighborhood.
The newly changed organism and environmental niche, in turn, change all the human organism’s subsequent experiences, which in turn change both the human organism and its environmental niche even further. Thus, the relationship between human organisms and environmental niches is not only interdependent but also developmental (Lewontin, Rose, and Kamin 1984, 265–290; Montagu 1985; Lewontin 2000). Because of the interdependent and developmental nature of this relationship, children, like adults, always play an active rather than merely passive role in their own violent transformations (Athens 1997, 22–27, 115–120; Blumer 1997, 3–6). Lonnie Athens (1992) develops another theory to explain violent behavior in boys and girls. The name of the theory is “violentization,” which he formed from combining the words “violent” and “socialization.” Although first published more than a decade ago (Athens 1989), this theory remained relatively unknown until Richard Rhodes (1999) popularized it. In this entry, he revises the theory in two ways. First, he renames two of the four stages to make more explicit the ubiquitous role dominance plays (Athens 1998). Second, he explains why more boys than girls become violent. Violentization is composed of both unitary and composite experiences. A unitary experience is a distinct, elemental experience that cannot blend with other elemental experiences any more than oil can mix with water, whereas a composite experience is composed of distinct, elemental experiences that coalesce. Whether unitary or composite, the experiences comprising violentization do not occur all at once but occur over a process with four separate stages that build on each other like the layers of a cake.
The first stage is “brutalization,” a composite experience made up of three distinct elemental experiences: violent subjugation, personal horrification, and violent coaching. During “violent subjugation,” authentic or would-be subjugators, such as fathers, stepmothers, older siblings, neighbors, or schoolmates, use or threaten to use physical force to make a perceived subordinate accept their domination. Violent subjugation can be practiced in one of two ways. It is practiced coercively when a subjugator seeks to make a perceived subordinate comply with a specific command and uses only enough force to achieve this limited goal. In contrast, when a person seeks to teach perceived subordinates a lasting lesson about his or her dominance over them and uses more than enough force to achieve their promise of future submission, that person is practicing retaliatory subjugation. Although both forms are brutal, coercive subjugation is relatively merciful. During coercive subjugation, subordinates can immediately stop getting battered by complying with their subjugator’s present command, whereas during retaliatory subjugation, a subordinate is not afforded this precious luxury. During “personal horrification,” the second elemental experience that comprises brutalization, perceived subordinates do not undergo violent subjugation themselves, but they witness someone close to them, such as a mother, brother, close friend, neighbor, or schoolmate, undergoing it. Although not as physically traumatic as violent subjugation, this experience can be even more psychologically damaging. Moreover, after undergoing personal horrification, perceived subordinates can be effectively subjugated for a while, at least by physical intimidation alone.
“Violent coaching” is the final elemental experience that comprises brutalization. During this experience, a superordinate takes on the role of coach and assigns a perceived subordinate the role of novice. The coach instructs novices that they should not try to avoid, appease, ignore, or run from their would-be subjugators but instead physically attack them. Thus, the coach’s goal is to prompt violent conduct on the part of the novice, which, ironically, the novice could later direct against the coach. In a West Baltimore neighborhood, violent coaching is known as “crimping ’em up,” which the inhabitants define as “the process by which older kids toughen up younger ones” (Simon and Burns 1997, 205–206). Coaches have a variety of techniques at their disposal for prompting novices to take violent action against would-be subjugators. One technique is “vain glorification.” Here, coaches regale novices with personal anecdotes about their own or their cronies’ violent actions in which they portray themselves as heroes, or at least antiheroes, and their would-be subjugators as villains. The pleasure that novices derive from hearing their coaches’ stories makes them long for the day when they can finally have their own violent feats to brag about. “Ridicule” is a second technique that coaches use to provoke violence on the part of novices. The coach belittles the novice for his reluctance or refusal to physically attack people who try to subjugate him. The coach continuously mocks the novice until the realization sinks in that it is better for him to physically attack a potential subjugator than to suffer any more derision from the coach. Coaches who prefer a less subtle technique than either vain glorification or ridicule can always use “coercion,” a special case of violent subjugation (described earlier) in which superordinates either threaten or actually harm a novice for refusing to obey their instructions to physically attack some would-be subjugator. Novices quickly get the message that it would be smarter for them to physically attack some other subjugator than to get physically harmed by their coach. “Haranguing” is still another technique. Here, the coach relentlessly rants and raves about hurting would-be subjugators without ever belittling, physically threatening, or appealing to novices’ vanity, as the other techniques do. Novices are repeatedly told the same thing in the hope that it will eventually sink into their heads. A final technique that a coach can use for prompting novices to take violent action is “besiegement.” If a single technique will not prompt a novice to take violent action against potential subjugators, then coaches can always resort to a combination of techniques. Because besiegement combines all the techniques described previously, except for haranguing, a coach can make novices endure the pain and anxiety of ridicule and coercion if they refuse to physically attack a would-be subjugator while assuring them of certain relief from this pain and anxiety, as well as the added enjoyment of vain glorification, if they do succeed in harming him or her physically. Boys’ and girls’ passage through the brutalization stage may differ. Although girls may undergo violent subjugation and personal horrification as often as boys, boys undergo violent coaching more often than girls. Because violent coaches suffer from the same gender bias as many other members of society, usually they expect girls to follow the traditionally female subordinate role rather than the customarily male superordinate role. Coaches
less frequently encourage violent behavior in a girl than in a boy because they find it more acceptable for girls to rely on charm and guile rather than brute force to settle dominance disputes. Thus, girls may just as often as boys enter the brutalization stage, but boys much more often complete this stage. The second stage in the violentization process is “defiance” (formerly labeled “belligerency”). Unlike brutalization, defiance is a unitary yet nuanced experience. During this experience, subordinates seek to resolve the crisis into which their brutalization has thrown them. While agonizing over their brutalization, they repeatedly ask themselves why they are being brutalized and what, if anything, they can do about it. In a desperate search for answers, they revisit episodes of their past violent subjugation, personal horrification, and violent coaching. Reliving these experiences, which consumes them with hostility toward themselves and other people, produces an epiphany. They realize belatedly that their violent coaches may have had a point after all: the only real way that they can put a stop to their brutalization is to become violent themselves. If, in the wake of this epiphany, subordinates decide finally to heed their violent coaches’ instructions, then they make a “mitigated violent resolution”—they resolve from that moment on to kill or gravely harm anyone who attempts to violently subjugate them. The making of this resolution marks not only the graduation from the second stage but also the birth of a potential violent criminal. In contrast to their pointedly different passages through the brutalization stage, boys and girls traverse the defiance stage in much the same way. Girls who enter this stage are no less apt than
boys to complete it. Although more boys than girls probably experience violent coaching, those girls who have undergone the same violent coaching as boys are just as likely as their male counterparts to have an epiphany during this stage about the necessity for taking grievous violent action against future subjugators. Cases 14 and 6 provide examples of boys who had attained this plateau in their violence development: Case 14: I wanted to stay away from everybody and wanted everybody to stay away from me. I didn’t want to be fooled around with by people. I told myself that if anybody fools around with me bad anymore, I am going to go off on them. I was ready to kill people, who fooled and fooled around with me and wouldn’t stop. (Athens 1992, 60) Case 6: People had messed with me long enough. If anybody ever messed with me again, I was going to go up against them. I was going to stop them from messing bad with me. If I had to, I would use a gun, knife, or anything. I didn’t mess with other people, and I wasn’t letting them mess with me anymore. My days of being a chump who was too frightened and scared to hurt people for messing with him were over. (Athens 1992, 61) “Dominance engagement” (formerly labeled “violent performance”) is the third stage in the violentization process. Unlike brutalization but like defiance, dominance engagement is also a unitary yet nuanced experience. During this stage perceived subordinates test the mitigated violent resolution that they formed earlier during the defiance stage. Of course, the circumstances must be
just right: some would-be superordinate must threaten or actually use violence in an attempt to subjugate the perceived subordinate, and the perceived subordinate must think that he has at least a fighting chance. The would-be superordinate must remain undeterred by the perceived subordinate’s likely physical resistance to violent subjugation. Finally, no third party can intervene to prevent the perceived subordinate from putting his resolve to the full test. As important as the circumstances surrounding a dominance engagement is its immediate outcome. There are several possible outcomes: a major or minor victory, a major or minor defeat, and a draw or no decision. In a major victory, the perceived subordinate scores a clear-cut win and in the process inflicts serious injuries upon the would-be superordinate. A major defeat is simply the reverse. A minor victory or defeat is the same as a major one, except that no one is seriously injured. A “no decision” occurs when the engagement never progresses to the point that a winner or loser can be declared: it ends before either combatant can inflict serious injuries upon the other. In contrast, a draw is an engagement that does progress beyond that point, but still no clear winner or loser can be determined. Here, the combatants inflict equally grievous injuries upon one another. The most common outcomes of dominance engagements are minor victories and defeats as well as draws and no decisions, whereas the least common are major defeats and victories. Before putative subordinates can move on to the next stage, they must achieve at least one, and usually more, major victories. Boys and girls differ dramatically in how they fare in dominance engagements.
Girls usually fare worse than boys for a variety of reasons. Because more boys than girls reach this stage of violence development, girls are more likely to confront boys than other girls during physical dominance engagements. The greater physical size of adolescent boys, their greater participation in physical contact sports and delinquent gangs, and their greater knowledge of and access to lethal weapons all give them a decided edge in winning dominance engagements against girls. In fact, girls can provide a ready source of “cheap” major or minor victories that budding violent males can use to advance their stalled violence development (Rhodes 1999, 286–312). Thus, girls who reach the dominance engagement stage are much less likely than boys to pass through it. The fourth and final stage of violentization is “virulency.” Unlike defiance and dominance engagement but like brutalization, virulency is a composite experience composed of three elemental experiences. “Violent notoriety,” the first elemental experience, refers to the recognition that former subordinates suddenly acquire from their major victory over a would-be or former subjugator during a dominance engagement. Although previously spoken of as being incapable or only possibly capable of violence during dominance engagements, these former subordinates are now spoken about as if they are not only capable of violence but proficient in it. The second elemental experience, “social trepidation,” flows directly from the first. However, unlike violent notoriety, social trepidation does not refer to how people talk about a boy in his absence but how they act toward him in his presence. In contrast to the past, people now act more deferentially and cautiously toward
the former subordinate. Moreover, they now take special pains not to challenge or slight him in any way, because they fear igniting a dominance engagement that they could lose. If this newly ordained superordinate decides to embrace rather than reject his violent notoriety and the social trepidation that it generates, then he will undergo the final elemental experience, “malevolency.” Overly impressed with his sudden rise from a lowly subordinate to a lofty superordinate, he becomes arrogant. He now resolves to gravely harm or even kill someone for any provocation, however slight. After making this new violent resolution, he is transformed from a person who would only resort to violence to resist his or an intimate’s violent subjugation to a person who relishes any opportunity to violently subjugate others. Undergoing the malevolency experience marks the completion not only of the virulency stage but also of the entire violentization process. At the end of this stage, a “violent” criminal becomes an “ultraviolent” one (see, for example, Shakur 1993). As is true of the dominance engagement stage, more boys than girls pass through the virulency stage. Because in American society cold, ruthless acts of violence are more closely associated with males, Americans are much more apt to consider males more dangerous and to fear them more than they fear females. That same gender bias makes it much easier for young men than women to gain violent notoriety, engender social trepidation, and accept a malevolent identity. Thus, at least in the case of creating barriers to violence development, sexism seems to work to the distinct disadvantage of boys and to the distinct advantage of girls (Kipnis 1999, ix–xi). Cases 9 and 33 below
provide examples of boys who have finished the entire violentization process and become ultraviolent criminals: Case 9: I became a go-getter. I would go after people’s asses for pissing me off in any fucking way at all. I meant what I said to people and said what I meant to them. They better listen to what I said because I wasn’t playing games any more, but for keeps. I was ready to kill anybody who walked the streets. (Athens 1992, 77) Case 33: I was ready to throw down with everything that I had. If a motherfucker loses his teeth, then he lost some teeth. If he loses his eye, then he lost an eye, and he loses his life, then he lost a life. It didn’t matter to me. The way I looked at it was that is just one less motherfucker this world will have to put up with. (Athens 1992, 79) It is important to keep in mind that human beings, their environmental niches, and their experiences are interdependent and exist in a developmental relationship. As people progress through the stages of the violentization process, the role that they play in their environmental niches changes dramatically. During the brutalization stage, they merely play the role of convenient victims on which the other, more violent occupants of their niche can practice violent subjugation, personal horrification, and violent coaching. At this early point in people’s violence development, their environmental niche molds them more than they mold it. Until they enter the defiance stage, form a mitigated violent resolution, and become violent themselves, they are not yet mentally prepared to reject playing the role of a victim of
brutalization and start playing the role of a physical resister against it. Moreover, they do not actually put their new role into action until after they enter the dominance engagement stage and score some major or minor victories or defeats against their would-be brutalizers. However, after they finally graduate to the virulency stage, form an unmitigated violent resolution, and become ultraviolent, they change from playing the role of physical resisters against their would-be brutalizers to that of ruthless brutalizers themselves. The irony is that complete progression through the violentization process makes the roles that people play in their environmental niches go full circle from those of hapless victims to vicious victimizers. With the addition of every new ultraviolent person to an environmental niche, it becomes that much more dangerous to everyone who occupies it. Thus, by the end of the last stage of people’s violence development, they mold their environmental niche more than it molds them. Although the process of violentization usually takes several years to complete, it sometimes can be completed in only a few months. If the latter happens, then the process becomes a “cataclysmic experience.” The completion of violentization is always contingent upon a person undergoing all the experiences of each stage and all the stages in the process. Thus, fortunately, only a few of the boys and girls who begin the process of violentization ever finish it. Nevertheless, a greater proportion of the boys who start the process complete it because, at this point in the evolution of American society, girls have a harder time completing all the stages, except for defiance. Boys and girls can start and finish the violentization process at almost any age, but
boys usually start and finish it at a younger age than girls. Unfortunately, however much earlier males finish violentization, the females who finally complete the process can be every bit as deadly as their male counterparts. Once boys or girls start the violentization process, how can they be prevented from completing it and becoming ultraviolent adults? The key to prevention lies in stopping them from entering as many stages of the violentization process as possible. If children have not entered the brutalization stage, then they must be kept from ever starting it. Home and neighborhood monitoring programs might prevent children’s violent subjugation, personal horrification, and violent coaching, but only if they are vigorously implemented. Specially designed educational programs directed against the use of violence to achieve dominance could no doubt prove effective, if properly integrated into and administered across the entire adult and juvenile community. Once children have entered the defiance stage, the goal of intervention must be to stop them from making a mitigated violent resolution and thereby completing this stage. Individual counseling aimed at helping children to draw an insight from their brutalization other than the need to act violently toward other people who seek to violently subjugate them could prove extremely helpful here. However, once children graduate from the defiance stage and enter into the dominance engagement stage, individual counseling alone usually proves to be ineffective, no matter how intense or prolonged. At this stage in their violence development, children need to undergo antiviolent, primary group resocialization. Drawing on many of the same techniques used during violent coaching but now directed at a new goal, new mentors who were once themselves violent individuals could teach these children nonviolent means of waging and winning dominance engagements and supervise them as they practice their newly learned techniques. Unfortunately, after children have progressed to the virulency stage of their violence development, few effective countermeasures now exist to stop them from replacing their mitigated violent resolution with an unmitigated one and becoming ultraviolent adolescents. In fact, at this late stage in a child’s violence development, most interventions will have the reverse effect of the one intended. Instead of diminishing the child’s violent notoriety and the social trepidation that it engenders, belated attempts at intervention will only make it more rather than less inevitable that the child will undergo the culminating experience of malevolency. Thus, the earlier in the violentization process one intervenes with the appropriate measures, the more likely it is that the intervention will succeed at the least risk to the community.
Lonnie Athens
See also Gangs; Guns; Suicide
References and further reading
Athens, Lonnie. 1989. The Creation of Dangerous Violent Criminals. London: Routledge.
———. 1992. The Creation of Dangerous Violent Criminals. Urbana: University of Illinois Press.
———. 1997. Violent Criminal Acts and Actors Revisited. Urbana: University of Illinois Press.
———. 1998. “Dominance, Ghettoes, and Violent Crime.” The Sociological Quarterly 39 (Fall): 673–691.
Blumer, Herbert. 1997. “Foreword.” Pp. 3–6 in Violent Criminal Acts and Actors Revisited by Lonnie Athens. Urbana: University of Illinois Press.
Dewey, John. 1929. Experience and Nature. La Salle: Open Court.
Kipnis, Aaron. 1999. Angry Young Men. San Francisco: Jossey-Bass.
Lewis, Dorothy. 1992. “From Abuse to Violence: Psychophysiological Consequences of Maltreatment.” Journal of the American Academy of Child and Adolescent Psychiatry 31 (May): 383–391.
———. 1998. Guilty by Reason of Insanity. New York: Fawcett Columbine.
Lewontin, Richard. 2000. The Triple Helix: Gene, Organism, and Environment. Cambridge: Harvard University Press.
Lewontin, Richard, Steven Rose, and Leon Kamin. 1984. Not in Our Genes: Biology, Ideology, and Human Nature. New York: Pantheon.
Mead, George. 1934. Mind, Self and Society. Chicago: University of Chicago Press.
Mednick, Sarnoff. 1977. “A Biosocial Theory of Learning Law-abiding Behavior.” Pp. 1–8 in Biosocial Bases of Criminal Behavior. Edited by S. Mednick and K. Christiansen. New York: Garner.
Mednick, Sarnoff, Vicki Pollock, Jan Volavka, and William Gabriella. 1982. “Biology and Violence.” Pp. 21–80 in Criminal Violence. Edited by Marvin Wolfgang and Neil Weiner. Beverly Hills: Sage.
Montagu, Ashley. 1985. “The Sociobiology Debate: An Introduction.” Pp. 24–33 in Biology, Crime and Ethics: A Study of Biological Explanations for Criminal Behavior. Edited by Frank Marsh and Janet Katz. Cincinnati: Anderson.
Rhodes, Richard. 1999. Why They Kill: The Discoveries of a Maverick Criminologist. New York: Alfred A. Knopf.
Shakur, Sanyika. 1993. Monster: The Autobiography of an LA Gang Member. New York: Penguin.
Simon, David, and Edward Burns. 1997. The Corner: A Year in the Life of an Inner City Neighborhood. New York: Broadway.
Sutherland, Edwin. 1973. “Susceptibility and Differential Association.” Pp. 42–43 in Edwin H. Sutherland on Analyzing Crime. Edited by K. Schuessler. Chicago: University of Chicago Press.
Wolfgang, Marvin, and Franco Ferracuti. 1967. The Subculture of Violence: Toward an Integrated Theory in Criminology. London: Tavistock.
Vocational Education
Vocational education is formal schooling that prepares a young person for a job. It has perennially been the object of contentious debate, typically centered on issues of race and class (e.g., Oakes 1985). Gender equity is also an issue because over the years, vocational education for boys has prepared them better than it has girls for a wide range of comparatively high-paying jobs. In a 1994 report to the U.S. Congress on the status of vocational education, a blue-ribbon panel found that there were gender differences and evidence of sex stereotyping in the course-taking patterns of secondary school students. One important difference was that girls were significantly more likely than boys to earn their vocational credits in consumer and home economics. Boys were more likely than girls to take courses in agriculture and trade and industry. The statistics revealed that 91 percent of students who concentrated in trade and industry courses (e.g., welding, machine shop) were boys, whereas 87 percent of those who concentrated in health courses were girls (National Assessment of Vocational Education [NAVE] Independent Advisory Panel 1994). These gender differences, evidenced in the last decade of the twentieth century, represented accumulated school practice spanning several decades, whose origins antedated the formalization of vocational education as a school subject. Perhaps more than any other subject in the American high school curriculum, vocational education has been the one most prone to
gender stereotyping, with girls and boys being deliberately exposed to quite different forms of vocational and career knowledge. The peculiar sociology of the subject results from the fact that the school curriculum mirrors normative societal practice. Vocational education has traditionally taken its cue directly from labor markets, selecting as vocational subjects replicas of actual jobs in the economy. Historically, labor markets have been segmented by gender, with some types of work assigned to women and other types to men. These work patterns sometimes reflect deliberate choices by boys and girls, but more often than not they reflect societal expectations of each gender. Labor markets also evidence gender inequities in pay and in opportunities for advancement.
Young men receiving instruction in aviation mechanics, South Charleston, West Virginia, ca. 1935–1943 (Library of Congress)
For most of the twentieth century, vocational education in the schools merely reproduced the gender stereotypes and inequities of labor markets. The vocational education and career guidance literature of the early decades of the twentieth century shows that it was commonplace to project careers in terms of gender. For example, in a book on career choice, Lewis Smith and Gideon L. Blough (1929) delineated careers for men and for women. Men’s careers were in the realms of manufacturing, transportation and communication, the professions, public service, extraction of minerals, oil refining, and agriculture and animal husbandry. Women’s careers were in the commercial field, homemaking, personal service, and a few professions. Similar distinctions are evident in a work titled “Vocations for
Girls,” in which Mary Lingenfelter and Harry Kitson (1939) offered nursing, home economics, cosmetology, and office work as careers for girls while setting forth medicine, dentistry, engineering, and science as careers primarily for boys. These stereotypes were openly acknowledged by persons like college professor David Hill, who wrote: “In the vocational education of women the opportunities for work and for happiness in the home should be promoted at every step” (1920, 355). Gender roles in the acquisition of occupational skill were present in colonial America, not just in the kinds of trades available to boys as opposed to girls but also in the terms of their respective contracts (Mays 1952). Boys could not complete their apprenticeships until they reached age twenty-one, whereas girls completed theirs when they reached age eighteen or were married. Society expected boys to become masters of small shops, but girls were expected to engage in housewifery. In the nineteenth century, with the decline of the craft age and the rise of the machine age, apprenticeship became somewhat outmoded. Vocational education became the way to learn how to operate machines. Courses were offered initially in skill-oriented mechanic schools and later in after-work continuation schools that were attached to factories. Although manual training became an elementary school subject for both sexes, it ultimately made its way into later grades of schooling, solidifying as a subject essentially for boys. Over the decades, manual training evolved into industrial arts (“shop”) and, more recently, into technology education. A new pedagogy suited to the mass transmission of skills was needed, and manual training became that new peda-
At the 1876 Centennial Exposition held in Philadelphia, mathematics professor Calvin Woodward of Washington University in St. Louis saw the Russian exhibit, which featured a display of a formalized way of dissecting and teaching skills. This system simplified the way in which occupations could be taught. Woodward set the gender tone for the subject by establishing a manual training school for boys ages fourteen to eighteen in 1880 in St. Louis. The curriculum included theory and practice relating to hand and machine tool use on woods, metals, and plastics, along with mathematics, science, and literature. Woodward reported (1889) that a follow-up of early graduates of the school showed that large numbers of the boys had opted for higher education; they chose professions such as medicine, dentistry, architecture, and engineering. He promoted the career possibilities of manual training, arguing that although it was essentially in the realm of general education, it was but a step to a trade. The results from his school showed the subject to be a precursor not just of trades but of professions. Manual training was the one subject in the curriculum that could respond directly to the changing economic landscape, that is, the shift from an agrarian economy to a machine economy. But the primary beneficiaries of this new dimension of knowledge were boys. They learned drafting, which was the language of industry and the basis of communication across trades and professions such as architecture and engineering. They also learned to operate machines and use tools. At the turn of the twentieth century, with pressure building to increase the pool of skilled workers in the country, industrial lobbyists made a strong push for the
Vocational Education creation of separate and autonomously run vocational schools funded by federal dollars. One of the strongest opponents of this push was John Dewey, noted American philosopher and educator, whose argument was that promotion by schools of early career choice would be inherently undemocratic, since such a process would limit the future possibilities of children. In any case, he argued, vocations were not purely economic; they extended into the family and into community life (Dewey 1916). Vocationalist advocates led by David Snedden, then Massachusetts commissioner of education, won the day. Their lobbying culminated with the passage in 1917 of the Smith-Hughes Act, which established federal funding for vocational education on a gendered basis, that is, home economics for girls and agriculture and industrial programs for boys. This basic funding approach dictated the nature of programs for almost all of the twentieth century, and its effects are reflected in the findings of the NAVE report to Congress, as described at the beginning of this entry. As the vocational education movement gathered energy late in the nineteenth century, some questioned whether such training applied to women, saying: “After all, they were not going to be the ‘captains of industry’ and they were not going to furnish labor for the industrial machines that would compete with Germany’s growing industrial strength” (Powers 1992, 9). Jane Powers contended that male leaders of the vocational education movement were opposed to women’s role in the workforce. She documented growth among the ranks of women in factories in the early decades of the twentieth century, not just in clerical jobs but in manufacturing and mechanical jobs. Opposition to women in these jobs grew
among male workers, who felt that women and children in the workforce were depressing men’s wages and decreasing their bargaining power. Women were indeed an exploited class of labor, performing low-paying, low-skill jobs. Unfortunately, whether through vocational education in the upper grades or through industrial arts in the middle grades, schools reproduced the gender stereotypes about which Powers wrote. The same gender strictures that confronted women in the factories of the 1890s through the 1910s applied in the offering of vocational subjects in the schools. The typical American adult male looking back on his school experience would remember industrial arts and could plausibly attribute his skill in operating hand and machine tools or in performing mechanical and electrical chores around the home to knowledge acquired in such classes. That would not be the case for the typical American woman. The whole culture of tool and machine use and the three-dimensional world of making, building, and constructing have been treated as the preserve of males. In the last two decades of the twentieth century, important changes occurred that offered hope for meaningful reform of vocational subjects, not just in terms of content but in terms of access. Industrial arts has evolved into technology education. The old content of woodworking, metalworking, and drafting was discarded and replaced with broader themes such as power and energy, construction, communication, and bio-related technologies. Correspondingly, the pedagogic focus has shifted from acquiring skills with tools to learning design and problem solving. New modular laboratory designs have replaced the old
tool shops, and machine interfaces with computers are now the norm. These new laboratories are a far cry from the old factory-type ones that excluded girls. With the hard industrial edge of vocational subjects removed, there is anecdotal evidence that girls are finding these classes much more to their liking. Standards for vocational subjects have been developed and published, thanks to grants from the National Aeronautics and Space Administration and the National Science Foundation. The new democratic goal for vocational education is “technology for all Americans” throughout grades K–12 (International Technology Education Association 2000). Federal laws, especially the Carl D. Perkins Acts of 1984 and 1990, have also affected vocational education. Informed by contemporary concerns for fairness and equal opportunity, these pieces of legislation directed the states to establish gender equity as a guiding criterion for program funding. The acts also established integration of academic and vocational education as a fundamental curriculum principle. Thus, traditional vocational education has been unraveling, to be replaced by a “new vocationalism” (e.g., Grubb 1996; Lewis 1997). Advocates of the new vocationalism draw heavily on the democratic thought and vocationalist ideals of John Dewey, particularly his view that vocations could form the context for teaching children about work rather than for narrow job preparation. The reforms in technology education and vocationalism draw common inspiration from the new economy. New emphases on knowledge work and information mean that the nature of work has been fundamentally transformed. Technology has significantly reshaped
work and jobs, eliminating many traditional crafts (see especially Zuboff 1988). Skill itself is being redefined. Indeed, many now believe that so-called soft skills, such as solving problems, thinking critically, learning how to learn, communicating, and working as a team, are more critical now than technical skills (e.g., Secretary’s Commission on Achieving Necessary Skills 1991; Gray and Herr 1995). The U.S. Department of Education has backed away from old categorizations such as trade and industrial education, agriculture, and home economics and instead proposed sixteen “career clusters,” including information technology, manufacturing, health science, financial services, construction, business and administrative services, legal and protective services, human services, hospitality and tourism, audiovisual technology and communication services, public administration and government, retail-wholesale, scientific research, engineering and technical services, agricultural and natural resources, and transportation and distributive services (U.S. Department of Education 2000). In contrast to narrow job categories, clusters provide for exploration. Specific vocational choices can be postponed until the postsecondary years, and careers can then be pursued in two-year technical colleges and community colleges. Some of those who support the new vocationalism offer a critical science perspective, meaning that they think schools should question inappropriate societal practice rather than accede to it passively. Accordingly, they want the new vocational curriculum to deal squarely with continuing workplace inequities, such as gender stereotyping, racism, sexual harassment, and the glass ceiling. Joe
Kincheloe (1999) speaks of a gendered workplace where patriarchy continues to be the predominant ideology and where women are kept in their place. Penny L. Burge and Steven M. Culver (1994) speak of a “gendered economy” that has the effect of expanding the career aspirations of boys and curtailing those of girls. These authors want vocational education deliberately to adopt strategies that reject common workplace practice and teach more ideal practices, such as boys and girls working collaboratively rather than competitively. Such sentiments are shared by Patricia Carter (1994), who calls on vocational education to respond to the need for workplace equity—for an end to sexual harassment and gender segregation so that women and girls can step out of traditional roles into nontraditional ones. The recent changes in technology education and vocational education are altering the sociology of these subjects. They have the potential to level the playing field for girls and boys, giving both the same breadth of exposure in the school curriculum. Consequently, boys and girls alike can get early glimpses of the whole world of work, glimpses that can help them make important career and life decisions and bring them closer to realizing their fullest potential.

Theodore Lewis

See also Apprenticeship; Computers

References and further reading
Burge, Penny L., and Steven M. Culver. 1994. “Gender Equity and Empowerment in Vocational Education.” Pp. 51–63 in Critical Education for Work: Multidisciplinary Approaches. Edited by Richard D. Lakes. Norwood, NJ: Ablex.
Carter, Patricia A. 1994. “Women’s Workplace Equity: A Feminist View.” Pp. 67–81 in Critical Education for Work: Multidisciplinary Approaches. Edited by Richard D. Lakes. Norwood, NJ: Ablex.
Dewey, John. 1916. Democracy and Education. New York: Macmillan.
Gray, Kenneth C., and Edwin L. Herr. 1995. Other Ways to Win: Creating Alternatives for High School Graduates. Thousand Oaks, CA: Corwin Press.
Grubb, W. Norton. 1996. “The New Vocationalism: What It Is, What It Could Be.” Phi Delta Kappan 77, no. 8: 533–546.
Hill, David S. 1920. Introduction to Vocational Education: A Statement of Facts and Principles Related to the Vocational Aspects of Education below College Grade. New York: Macmillan.
International Technology Education Association. 2000. Standards for Technological Literacy: Content for the Study of Technology. Reston, VA: ITEA.
Kincheloe, Joe L. 1999. How Do We Tell the Worker? The Socioeconomic Foundations of Work and Vocational Education. Boulder, CO: Westview Press.
Lewis, Theodore. 1997. “Toward a Liberal Vocational Education.” Journal of Philosophy of Education 31, no. 3: 477–489.
Lingenfelter, Mary R., and Harry D. Kitson. 1939. Vocations for Girls. New York: Harcourt Brace.
Mays, Arthur B. 1952. Essentials of Industrial Education. New York: McGraw-Hill.
National Assessment of Vocational Education, Independent Advisory Panel. 1994. Interim Report to Congress. Washington, DC: U.S. Department of Education.
Oakes, Jeannie. 1985. Keeping Track: How Schools Structure Inequality. New Haven, CT: Yale University Press.
Powers, Jane B. 1992. The “Girl Question” in Education: Vocational Education for Young Women in the Progressive Era. Washington, DC: Falmer Press.
Secretary’s Commission on Achieving Necessary Skills (SCANS). 1991. What Work Requires of Schools: A SCANS Report for America 2000. Washington, DC: U.S. Department of Labor.
Smith, Lewis W., and Gideon L. Blough. 1929. Planning a Career: A Vocational Civics. New York: American Book Company.
U.S. Department of Education. 2000. Career Clusters: Adding Relevancy to Education. Pamphlet. Washington, DC: U.S. Department of Education.
Woodward, Calvin M. 1889. “The Results of the St. Louis Manual Training School.” Journal of Proceedings and Addresses. Session of the year 1889, held in Nashville, TN, National Education Association.
Zuboff, Shoshana. 1988. In the Age of the Smart Machine. New York: Basic Books.
W

Washington, Booker T., and W. E. B. Du Bois
The childhoods of Booker T. Washington and W. E. B. Du Bois, the two most significant leaders of the African American community in the late nineteenth and early twentieth centuries, illustrate that although young African American men faced common problems, their childhoods could differ significantly. In spite of racism and other restrictions that limited their freedom in the second half of the nineteenth century, the range of experiences that African American males faced during childhood and youth was extensive. Booker T. Washington and W. E. B. Du Bois faced very different experiences as boys, but their childhoods also contained significant common elements. One was born and raised in the South, the other in New England; one was born into servitude, the other into a family that had been free for generations; and one made his own way with little family or community support, whereas the other found considerable community and some family support. Both, of course, encountered race and racism but did so in greatly different contexts and settings and consequently responded differently; both also confronted poverty and economic deprivation and grew up in families that were buffeted by social, cultural, economic, and racial stress. Out of these circumstances both turned to education as the way to overcome the limits they faced. Finally, both achieved remarkable success as educators and as leaders.

Booker Taliaferro Washington was born on a farm near Hale’s Ford in the foothills of the Blue Ridge Mountains in Franklin County, Virginia. In his autobiographical writings, Washington gives his birth date as 1857, 1858, or 1859; it was more likely 1856. Washington’s mother Jane was a slave who worked on James Burroughs’s farm. His father, a white man, has never been identified. The Burroughs farm was midsized and in 1860 employed ten slaves—four adults and six children. Among the ten were Jane, Booker’s mother; Sophie, his aunt; Munroe, who may have been his uncle; and another adult male. The children included Booker himself; John, his older brother; and Amanda, his younger sister. The other three children may have been Sophie’s. By 1860 Jane had married Washington, a slave belonging to the farmer who lived across the road from the Burroughs place. Slavery was not conducive to a comfortable family life. Jane lived with her children in a dirt-floor, one-room cabin, the distinguishing feature of which was a large hole in the floor, which Burroughs used in the winter to store sweet potatoes. Her chores as cook for Burroughs left little time for her own children. Booker recalled that during this
period, he never sat at a table and shared a meal with his family. He also recalled that the rigors of slavery robbed his mother of her health and vitality.

William Edward Burghardt Du Bois was born on February 23, 1868, in Great Barrington, Massachusetts. His mother, Mary Burghardt Du Bois, was a member of the Burghardt clan that had lived in the Great Barrington region since the colonial period. By the end of the Civil War they were a fixture in the community, working as housemaids, waiters, farmers, and small shopkeepers. His father, Alfred Du Bois, was born in Haiti, the grandson of a prominent planter and physician who divided his time between the mainland and the islands. Alfred Du Bois arrived in Great Barrington as a drifter with no job, few prospects, and likely a wife back in New York. He married Mary Burghardt over the objections of her family in early February 1868, a few weeks before the birth of their son. Alfred Du Bois remained with Mary only about a year. After her husband left, Mary lived for a time on the farm of her parents and then moved into Great Barrington. She suffered from depression and then in 1875 or 1876 from a stroke, which left her partially crippled. Part-time work as a housemaid and frequent assistance from her sisters and brothers enabled her to maintain a life on the edge of poverty for herself and her sons. William and his older half-brother Adelbert Burghardt also took odd jobs and contributed to the support of the family. In spite of these hardships, Du Bois remembered that his mother made sure that his schooling proceeded without interruption.

The defining element in the early childhood of Booker T. Washington was slavery and its accompanying poverty and family stress. Simple issues such as food and clothing stood out in Washington’s memory. Although his mother’s job as cook for the farm ensured that food of some sort would always be available, meals remained a hit-or-miss affair, “a piece of bread here and a scrap of meat there . . . a cup of milk at one time and some potatoes at another” (Washington 1901, 219). When this system left the children hungry, Washington recalled that his mother would awaken them late at night and provide them with eggs or a chicken that she had somehow secured. Clothing was also limited. His sole garment during early childhood was a coarse shirt made of rough flax, which when new was so painful to wear that he often went without clothing. His first shoes were a pair of wood-soled clogs that he received for Christmas when he was eight years old. As a slave he was expected to work, even as a young child. His tasks included operating the fans that kept the flies away from the Burroughs’ dining table at mealtime, carrying water to workers, and performing other light chores around the farm. Even though slavery ended when he was only nine years old, the young Washington had already recognized the barriers that race and slavery placed between blacks and whites. Although the Burroughs children were his playmates with stick horses, marbles, or games of tag or on fishing excursions, there was an invisible barrier at the schoolroom door. Education was for white children only. Work, too, was segregated, reserved for blacks. Finally, there was the incident that indelibly defined slavery in young Washington’s mind. As he recalled, the vision of his uncle, stripped naked, tied to a tree, and whipped across his bare back with a cowhide strip as he cried for mercy “made an impression on my boyish heart that I shall carry with me to my grave” (Washington 1900, 12).
The Civil War and emancipation radically altered Washington’s childhood. In 1864 his stepfather escaped from Lynchburg, where he had been hired out to a tobacco factory, and fled to Malden, West Virginia, where he found employment in the salt mines. In August 1865 he sent for his now-emancipated wife and stepchildren. Economically, life in Malden was not a great improvement over life on the Burroughs farm. They lived in a dilapidated cabin, in a crowded neighborhood of similar dwellings, amid garbage, raw sewage, throngs of poor blacks and even poorer whites, and violence and vice—an environment that shocked and offended the young Washington. Booker and his brother John were immediately put to work packing salt. Their stepfather confiscated all their wages, which soured any relationship that might have developed between stepfather and stepsons. The one advantage that Washington found in Malden was school. Over the objections of his stepfather but with the support of his mother, Washington began attending the school for blacks that opened in late 1865, first at night and then during the day between early-morning and late-afternoon stints in the salt mines. This first effort at education was frequently interrupted by the work demands that his stepfather imposed, first in the salt mines and then in the coal mines. In 1867 he escaped the control of his stepfather when he took a job as a domestic in the home of General Lewis Ruffner, one of the wealthiest citizens of Malden. There Washington served as houseboy, companion, and eventually protégé for Viola Ruffner, General Ruffner’s New England wife. From Viola Ruffner, the young Washington not only found refuge from the salt mines and his stepfather’s home, but he also was imbued with the
values of hard work, honesty, cleanliness, books, and education, as well as the example of gentility that the Ruffners’ home represented. Washington would later credit her with much of his early education and especially with preparing him for college.

Du Bois’s childhood in Great Barrington was shielded from many of the problems of race and Reconstruction, but it was affected by his mother’s declining health and economic condition. Following the departure of her husband in 1869, Mary Du Bois and her two sons lived with her parents, Othello and Sarah Burghardt, until her father’s death in 1874 forced the sale of the family farm and a move into the city. After her mother’s death a year later, Mary Du Bois and her two sons relocated again, to a dilapidated house they shared with an even poorer white family on Railroad Street in the heart of Great Barrington’s saloon, gambling, and prostitution district. In this setting Mary soon suffered her debilitating stroke, further limiting her economic prospects. The family’s income became increasingly dependent on the earnings of Adelbert and the part-time jobs of William. As the family’s prospects sank, Mary Du Bois put all of her energy into the development of her son William. She used the obvious social lessons of the world outside their door to teach him the dangers of alcohol, gambling, and loose women. More important, she emphasized education. The young Du Bois began regular education at age five or six, after the family moved back to Great Barrington. From the beginning he excelled as a student and attracted the attention of his teachers as well as some of the prominent members of the community. Du Bois’s academic ability brought rapid promotion
and also prompted local intervention in the family’s economic situation. As Du Bois prepared to enter high school, local citizens arranged for the family to move to a more suitable home. Du Bois was the only African American student in the local high school. The principal, Frank Hosmer, arranged for Du Bois to take the college preparatory curriculum, made sure that he had the expensive books and other materials needed for that course of study, and started mother and child thinking about college. Du Bois’s social contacts during this period were not with Great Barrington’s small African American community but almost exclusively with his white classmates and with the children of the families who employed his mother. He was conscious of the economic differences that separated him from his classmates but initially seemed to think that race was of no significance. As he grew older, this idealistic vision of American democracy was undermined by a series of events that marked the racial boundaries, even in the relatively liberal atmosphere of Great Barrington. The first, probably early in his high school days, was the refusal of a young girl in one of his classes to accept an exchange of greeting cards; it was followed by other small rejections that forced him to recognize the extent of racial feeling in the United States.

As their childhoods came to an end, both Booker T. Washington and W. E. B. Du Bois left the homes of their youth in pursuit of higher education. While working in the salt mines, Washington had learned from another worker that Hampton Institute in Hampton, Virginia, would allow impoverished blacks to work to pay the costs of their education. In 1872, with 50 cents in his pocket, collected in nickels and dimes from family
and friends and with the blessing of his mother, the sixteen-year-old set out on the 500-mile trek to seek admission to Hampton. Three years later he graduated as one of its top students. Du Bois’s plans for college developed more traditionally. By his senior year he had selected his college of choice, Harvard University. His plans hit a snag, though, when his mother died shortly after his March 1885 graduation. Once again, however, benefactors came to his rescue. The African American community took him in, made sure that he had food and shelter in the months following his mother’s death, and provided him with a well-paying summer job. Principal Hosmer and three other white citizens collected funds from local churches to pay for Du Bois’s college education, but they stipulated that he attend Fisk University, not Harvard. Du Bois did not object. In September 1885 the seventeen-year-old Du Bois arrived in Nashville, Tennessee, prepared to commence his studies. Three years later, with his Fisk B.A. in hand, Du Bois entered Harvard University.

Cary D. Wintz

See also African American Boys; Civil War; Jobs in the Nineteenth Century; Slavery

References and further reading
Aptheker, Herbert, ed. 1997. The Correspondence of W. E. B. Du Bois. Vol. 1, Selections 1877–1934. Amherst: University of Massachusetts Press.
Du Bois, W. E. B. 1920. Darkwater: Voices from within the Veil. New York: Harcourt, Brace, and Howe.
———. 1940. Dusk of Dawn: An Essay toward an Autobiography of a Race Concept. In W. E. B. Du Bois: Writings. Edited by Nathan Huggins. New York: Harcourt, Brace, and Company. Reprint, New York: Library of America, 1986.
———. 1968. Autobiography of W. E. B. Du Bois: A Soliloquy on Viewing My Life from the Last Decade of Its First Century. New York: International Publishers.
Harlan, Louis R. 1972. Booker T. Washington: The Making of a Black Leader, 1856–1901. New York: Oxford University Press.
Lewis, David Levering. 1993. W. E. B. Du Bois: Biography of a Race. New York: Henry Holt.
Marable, Manning. 1986. W. E. B. Du Bois: Black Radical Democrat. Boston: Twayne Publishers.
Rudwick, Elliott M. 1969. W. E. B. Du Bois: Propagandist of the Negro Protest. New York: Atheneum.
Washington, Booker T. 1900. The Story of My Life and Work. In The Booker T. Washington Papers. Vol. 1, The Autobiographical Writings. Edited by Louis R. Harlan. Chicago: J. L. Nichols. Reprint, Urbana: University of Illinois Press, 1972.
———. 1901. Up from Slavery: An Autobiography. In The Booker T. Washington Papers. Vol. 1, The Autobiographical Writings. Edited by Louis R. Harlan. New York: Doubleday, Page. Reprint, Urbana: University of Illinois Press, 1972.
Washington, George
See Manners and Gentility
Work
See Jobs in the Seventeenth and Eighteenth Centuries; Jobs in the Nineteenth Century; Jobs in the Twentieth Century
World War II

Just as World War II had an enormous impact on the lives of adults in the United States, so too did it profoundly shape the experiences of American boys. Like their adult counterparts, boys (and girls) were mobilized to help with the war effort, and they performed important services such as collecting scrap materials, planting
victory gardens, and buying war bonds and stamps. The war also influenced the games and leisure activities of boys of all ages. Virtually all of the toys, comic books, magazines, radio programs, and movies boys encountered dealt with either combat or patriotic themes. Viewed as future soldiers, boys between the ages of fourteen and seventeen were considered to be of exceptional importance to the U.S. war effort, and schools and government programs created specialized courses to promote their physical fitness and premilitary training. Even as national propaganda campaigns, bond drives, advertising, and shared popular culture materials encouraged commonalities within American boys’ wartime activities, differences in racial, class, and ethnic background often resulted in divergent experiences—especially for thousands of Japanese American boys, who spent much of the war behind the barbed-wire walls of internment camps.

With millions of adults entering the military and taking war industry jobs, government agencies increasingly called upon children to perform essential home-front tasks. While high school–age boys and girls served the war effort through part-time employment, children between the ages of six and thirteen completed their wartime jobs on a completely voluntary basis. Despite expanded roles for women in the workforce, the tasks boys and girls performed often conformed to traditional gender stereotypes. As historians William Tuttle (1993) and Robert Kirk (1994) have noted, girls were more likely to engage in “nurturing” activities such as caring for small children, knitting socks and blankets, canning produce, and rolling bandages, whereas boys took on more “masculine” duties like collecting salvage, building model airplanes for military and civilian training sessions, and serving as junior air raid wardens.

A group of schoolboys gathering scrap metal during World War II (Archive Photos)

Of all the tasks young boys across the country performed, scrap collection was probably the most crucial to the war effort. With serious rubber, paper, and metal shortages during the war, the tin cans, old raincoats, newspapers, copper pans, and other materials boys collected in their wagons helped conserve valuable resources and saved countless hours of adult labor. Although boys could participate in a variety of school and community salvage campaigns, the Boy Scouts of America were especially effective in mobilizing boys as “scrap troopers.” In the summer of 1941, for example, after learning of the nation’s aluminum shortage, the Boy Scouts collected 11 out of the 12 million pounds of aluminum brought in during a nationwide pots and pans drive—
enough, the Army estimated, to make 1,700 planes. Throughout the war, the Boy Scouts continued to receive special recognition for their additional contributions to salvage drives. By the war’s end, they had gathered more than 23 million pounds of tin, 109 million pounds of rubber, 370 million pounds of scrap metal, and 3 million books. In addition to collecting scrap metals, paper, and rubber, boys in rural areas were called upon to fill other special roles. Young boys did extra chores around the farm and gathered milkweed pods for the stuffing in life preservers, and boys fourteen and older joined the Future Farmers of America. These young agricultural workers not only grew crops and raised livestock, but they also learned to repair farm machinery and increase food production. Children’s agricultural work,
however, was not confined to rural areas; boys and girls throughout the country planted and tended “victory gardens.” Even in cities where open land was scarce, park officials set aside plots for elementary school children to plant vegetable gardens. Another important way that boys contributed to the war on the home front was through purchasing war bonds and stamps. Boys and girls alike would bring their nickels, dimes, and quarters to school once a week to buy stamps for their war stamp books. Once a child filled his book with $18.75 in stamps, he could trade it in for a war bond at the post office. Many schools charted their students’ fund-raising efforts with classroom posters that translated stamp and bond sales into military purchases. Children in smaller schools might watch their savings turn into equipment for a single soldier, but pupils in larger schools could count the number of bombs, tanks, planes, or ammunition rounds they helped build with their money. Although gender did not affect children’s participation in the war stamp and bond drives, class differences often did. Poorer children were often ashamed of their inability to purchase as many defense stamps as their affluent peers. The embarrassment was so great for one ten-year-old boy in Kokomo, Indiana, that he “swallowed a worm on a dare in order to win 25 cents to buy a war stamp and be, as he said, ‘like the rest of the kids’” (Tuttle 1993, 125). The United States’ full-scale mobilization for war not only fostered youth participation in home-front programs but also encouraged boys to think of themselves as potential servicemen. Boys as young as three could be found pretending to be “airplane men” shooting at
Japanese Zeros with wooden boards as guns. As they got older, though, school-age boys required greater verisimilitude in their war games, and they often went to elaborate lengths to fashion uniforms, first-aid kits, guns, and other weapons that looked as real as possible. Although the combat scenarios that boys played out varied from neighborhood to neighborhood, two axioms generally held true: the Americans were ultimately always victorious, and no one wanted to be the enemy. Recruiting younger boys or, occasionally, girls solved the latter problem. Once boys reached adolescence, their interest in combat and war games could be channeled into more formal outlets. The Young Men’s Christian Association (YMCA), for example, created “Boymandos” programs throughout the nation to provide premilitary fitness classes and general athletic training. Teenage boys could also join the High School Victory Corps, a nationwide organization designed to prepare high school students for war service—military and civilian. Freshmen and sophomores could serve only as general members, but junior and senior boys could join the air, land, or sea branches, which provided their members with uniforms, insignia, specialized coursework, and military drill and calisthenics. Although membership in the corps was voluntary, participation in school physical education programs was not. Because the War Department estimated that more than 80 percent of sixteen- and seventeen-year-olds would enter military service during the war, most high school physical education programs tried to cultivate the types of skills and physiques boys would need in basic training and combat. Ultimately, millions of boys did serve in the military, often patriotically enlisting as soon as they turned seventeen.
A young boy with a helmet lies on the floor playing with a toy army, 1940s (Archive Photos)
If boys did not get their fill of the war by participating in school, YMCA, Boy Scout, or other programs, they could immerse themselves in war news and action through popular culture. Although they often contained fantastic story lines and superhuman heroes, comic books, radio programs, and movies were generally saturated with war topics and appeals for patriotism. The messages of these popular culture materials were so influential, in fact, that Robert Kirk dubbed them the “unofficial instruments of national policy” for children (1994, 36). Even before World War II began, boys were introduced to the “Horrors of War” via 1-cent bubble gum cards manufactured by Gum Inc. of Philadelphia. The
cards graphically depicted Japanese atrocities in China and offered explanatory captions in order to teach children “the importance of peace.” When World War II started, Gum Inc. created a “War News Pictures” series of cards, which children could collect to learn about the war in Europe. Boys and girls could also keep abreast of war information and practice for civilian defense jobs by amassing Coca-Cola plane identification cards, Wonder Bread warship guides, and Junior Air Raid Warden games. Reading comic books was a far more popular activity for boys and girls—especially during the later years of the war, when shortages curtailed chewing gum sales. Between the ages of six and twelve, boys and girls alike were avid comic book readers, buying roughly 12 million copies a month. Wartime studies revealed that boys were more likely to follow a greater number of serials and were generally drawn to more bellicose comics. Nevertheless, children of both sexes and of all ethnic backgrounds ran to comic book dealers each month to purchase the latest adventures of Batman, Captain Marvel, Superman, Blackhawk, and Wonder Woman. Radio programs were also popular forms of entertainment for boys during the war, and on average they listened fourteen hours a week. Boys and girls tuned in after school primarily to adventure shows such as Jack Armstrong—That All American Boy, Dick Tracy, Little Orphan Annie, The Shadow, Gangbusters, The Lone Ranger, and Superman. Like comic books, these programs reinforced patriotic feelings and exhorted children to do their share in the home-front battle. Radio also provided young listeners with much of their war news. Some children in larger cities even had their own interactive news programs,
in which stations provided listeners with maps, tiny flags, and pins so they could chart overseas campaigns at home. Other news and entertainment came from the Saturday matinee movie, an enduring 1940s ritual for children between the ages of seven and thirteen. Boys and girls spent most of Saturday afternoon at the movie theater, watching a double feature (which often included a battle or spy film), public-service cartoons, and sometimes a government-funded documentary. Although movies, radio shows, and comics inspired patriotism and provided countless hours of entertainment for boys during the war, they also reinforced existing racial and ethnic prejudices. Popular representations of Japanese as bestial or subhuman, Italians as bumbling, and Germans as vicious regularly produced wartime hatred and cruelties among boys. In Detroit, for example, neighborhood boys frequently called seven-year-old Rick Caesar “Mussolini” and chased him because of his Italian heritage. German American boys often masked their ethnic identity, masquerading as Swiss or Polish to avoid jeers and physical abuse. Ultimately, though, Japanese American boys experienced the most dramatic and long-term persecution. Although only a handful of Italian- and German-born children were temporarily excluded from “sensitive areas” of the United States, more than 30,000 Japanese American youths were imprisoned in internment camps under Executive Order 9066. Evacuated from their homes in March 1942, these children and their families first went to assembly camps and then to detention centers in remote parts of the country. At most centers, children lived with their families in one-room apartments, each measuring 8 by 20 feet or 12 by 20 feet in size. Orphans and foster
children, however, were placed in the children’s village at the Manzanar Relocation Center in southern California. Regardless of the location, children spent much of their time in class or after-school recreational programs. Nevertheless, Japanese American boys in the camps still found time to play sandlot baseball, marbles, and the same war games that other boys throughout the country were enjoying. Despite their loss of civil liberties, Japanese American boys were generally quite patriotic, and they actively participated in war bond and salvage collection drives. Italian, German, and Japanese American boys, however, were not the only ones to experience ethnic prejudice and racial violence during the war. Despite public images of and appeals for national unity, existing racial tensions came to a head in many wartime communities. In the summer of 1943 alone, more than 242 race riots erupted in 47 different cities, involving thousands of men and teenage boys. African, Mexican, and Native American boys migrating with their families to new cities for war-related work often encountered tremendous hostility and suspicion from their white peers. When faced with the contradictions between national self-representations of American unity, equality, and democracy and the realities of segregation and discrimination, some youths of color had trouble determining who the real enemy was. After being prevented by whites from moving into his new home in Detroit, one African American boy declared, “I’m a Jap from now on” (quoted in Tuttle 1993, 165). Another problem exacerbated by the war was juvenile delinquency. Teacher shortages, an insufficient number of daycare facilities, an increase in parents’ out-of-home working hours, and expanded
employment opportunities for youths all contributed to more juvenile crime. During the war, juvenile arrests as a whole rose by more than 20 percent, and in some cities like San Diego, boys’ incarcerations increased by 55 percent or more. Although boys and girls under seventeen were both committing more crimes, the infractions that youths carried out generally varied by sex. Police arrested teenage girls for prostitution or being “V-girls” (juveniles who had sex with servicemen), whereas they apprehended boys for theft, vandalism, or fighting. In response to the rise in juvenile delinquency and the public outcry over the “decline in youth values,” communities instituted ten o’clock curfews and constructed teen centers and canteens where youths could dance, play cards, shoot pool, and socialize. Programs such as these reinforced popular sentiment that boys and girls were important to the war effort and to America’s future.

Christina S. Jarvis
See also Comic Books

References and further reading
Baruch, Dorothy W. 1942. You, Your Children and War. New York: D. Appleton-Century Company.
Kirk, Robert William. 1994. Earning Their Stripes: The Mobilization of American Children in the Second World War. New York: Peter Lang.
Lingeman, Richard R. 1976. Don’t You Know There’s a War On? The American Home Front 1941–1945. New York: Capricorn Books.
Skoloff, Gary, et al. 1995. To Win the War: Home Front Memorabilia of World War II. Missoula, MT: Pictorial Publishing.
Spencer, Lyle M., and Robert K. Burns. 1943. Youth Goes to War. Chicago: Science Research Associates.
Tuttle, William M. 1993. “Daddy’s Gone to War”: The Second World War in the Lives of America’s Children. New York: Oxford University Press.
Werner, Emmy E. 2000. Through the Eyes of Innocents: Children Witnesses of World War II. Boulder, CO: Westview Press.
Wrestling
See Superheroes
Y

Young Life
See Parachurch Ministry

Young Men’s Christian Association

The Young Men’s Christian Association (YMCA) originated in the mid-nineteenth century as a voluntary association to bolster the faith and mold the character of Protestant men and boys and then evolved into a fee-based membership organization run by paid staff that provided varied services, including physical training, to both sexes and all ages. Early YMCAs accepted boys in their early teens as full members but began separate programs for those under age sixteen during the 1870s. The YMCA espoused balanced religious, intellectual, social, and physical development, but by the 1890s gymnasiums (later swimming pools as well) achieved lasting prominence as the most widely used YMCA facilities. YMCA boys’ work grew rapidly in the early twentieth century, exceeding 200,000 American boys enrolled by 1920, as staff sought to make themselves specialists in adolescent boyhood. YMCA men supervised the start of Boy Scouting in the United States but saw it outgrow YMCA boys’ work. Starting around the 1920s, the focus of YMCA programming grew more diffuse as local associations enrolled non-Protestants, women and girls, and young boys in addition to the teenage boys and the men of all ages who formed the YMCA’s more traditional constituency. Critics in the late twentieth century regretted the YMCA’s loss of religious focus and its concentration on providing services, especially physical exercise, for fee payers. But YMCAs also furnished varied social services and enrolled millions of boys and girls under age eighteen.

George Williams formed the first Young Men’s Christian Association in 1844 in London, England, banding his fellow clerks together to preserve their faith amid urban temptations. During the early 1850s similar associations sprang up in North American cities, offering prayer meetings, Bible classes, libraries, and literary societies. Although the associations federated at the state and international (U.S. and Canada) levels, each level raised its own budget and remained independent, permitting the wide diversity of local programming that has characterized YMCAs to this day. Since in the nineteenth century even youths in white-collar jobs commonly worked at ages when their twentieth-century counterparts would have been schoolboys, early YMCAs admitted boys in their early teens as full members. By the 1870s, however, as the original members grew older, many YMCAs required a minimum age, usually sixteen. At first predominantly evangelistic in their approach, YMCA men soon
compromised between revivalist and antirevivalist Protestants, continuing to seek conversions but stressing the gradual growth in character preferred by antirevivalists. YMCAs of the 1860s and 1870s broadened their programs, erected buildings to house them, hired professional staff (known as “secretaries”), and defined their mission as “the improvement of the spiritual, mental, social and physical condition of young men” (Hopkins 1951, 107).

A boxing exhibition at the YMCA, ca. 1920 (Library of Congress)

As a cult of muscular Christianity took hold among white-collar workers who wanted to be good, respectable, and yet manly, by the 1890s gymnasiums became the main attraction. There young clerks could work off troubling energies
and strengthen a sense of masculinity diminished by sedentary work, submission to a boss’s orders, and the advent of female coworkers. In search of group activities less regimented than calisthenics, YMCA men invented two gymnasium games, basketball in 1891 and volleyball in 1895. Basketball quickly took hold among the YMCA’s younger members and by the early 1900s became a mainstay of public school and church athletics as well. YMCA swimming pools, though still uncommon before 1900, foreshadowed the prominence of aquatics in twentieth-century YMCAs. Luther Gulick, the international YMCA’s first secretary for physical education, proposed an inverted
Young Men’s Christian Association triangle, symbolizing the physical, mental, and spiritual sides of human nature, to publicize YMCA determination to train the whole man. The red triangle became the YMCA’s ubiquitous emblem in the 1890s. Separate programs for boys, usually defined as ages ten to sixteen, got under way in the 1870s but grew slowly until the 1890s. Early YMCA workers held weekly religious meetings and Bible study, often judging success by the numbers of boys who accepted Christ. On the theory that character required control of all errant impulses, YMCA men put the boys through long gymnastic drills. In the 1890s, though, with the advent of team sports and swimming, recreation began to outweigh disciplinary training. Though both remained important into the 1920s, the balance shifted from conversion toward character building. This developmental approach gained strength around 1900 as boys’ work specialists discovered adolescence. The new term, popularized by students of the psychologist G. Stanley Hall (among them Luther Gulick), gave concerns about youth a scientific-sounding rationale. Though a time of crisis when sexual and other new instincts flooded in upon teenagers, adolescence in Hall’s view was also an era of great promise when religious conversion and idealistic enthusiasm were natural. But teenagers needed supervision and guidance at this vulnerable stage. Under Gulick’s tutelage, Edgar M. Robinson (1867–1951), the international YMCA’s first secretary for boys’ work, sought to make the guidance of adolescents the raison d’être of YMCA boys’ work and persuaded most junior departments to raise their age limits to twelve through seventeen. Most YMCA junior members were sons of the middle class. They could afford fees
to use the gymnasium and pool, and most were promising recruits for the parent association, since the YMCA’s founding fathers had restricted full membership to members of evangelical churches, defined in 1869 to mean orthodox Protestant denominations. Except in smaller northern cities where African Americans were few, YMCAs were racially segregated, with separate branches operating under black leadership. Despite their anger at white racism, college-educated elites commonly supported black YMCAs as safe public spaces free of white control and devoted to advancement of the young. Middle-class boys needed supervision and religious nurture, YMCA workers believed, since short school days left ample free time, and most boys quit Sunday school in their early teens. At the same time, YMCA men expressed extravagant fears that such boys were losing their masculinity—freed of physical labor, raised by mothers while fathers worked, and taught by female schoolteachers, they were going soft. Robinson denounced the boy who has been “kept so carefully wrapped up in the ‘pink cotton wool’ of an overindulgent home, [that] he is more effeminate than his sister, and his flabby muscles are less flabby than his character” (Macleod 1983, 48). Widespread suspicion that religiosity was unmanly exacerbated the panic. In response, the YMCA offered building-centered activities throughout the school year and summer camping, both focused on gymnastics, sports, swimming, and hobbies, with further religious commitments for a minority. In 1909–1910, 78 percent of YMCA juniors enrolled for gymnastics and sports, 37 percent joined Bible classes, and just under 5 percent took “decisions for Christian life” (Macleod 1983, 252, 265). Starting around 1910, YMCA boys’ workers sought to embed these decisions
in a sequence of conventional moral development by inviting boys to take a “Forward Step”: deciding for Christ, joining the church, giving regularly, doing committee work, and especially giving up habits such as smoking and masturbation. Since YMCA boys’ work lacked a core program as tightly defined as scouting, junior members could decide which activities to join and which to avoid. Early in the century secretaries were already voicing what later became a persistent criticism of the YMCA, worrying that they were merely selling privileges for a fee. Efforts at community outreach beyond the building-centered, fee-based activities accentuated the diffuseness of YMCA boys’ work, however. In 1910 Robinson supervised the transfer of Boy Scouting to the United States but had to let the Boy Scouts of America organize separately. After an enthusiastic start, YMCAs sponsored fewer and fewer Boy Scout troops. Led by David Porter, a Rhodes scholar who later headed the YMCA’s college division, YMCA men took the Forward Step into high schools and organized Hi-Y clubs. By the 1920s these were the most successful religious clubs in many high schools, although they could not match the prestige of independent fraternities. During the 1920s, boys’ workers also sought to reorient their programs around small-group methods centered on the boys’ varied interests. Compared with scouting, the YMCA enjoyed considerable success among high school and other older boys; the median age of YMCA junior members from 1900 through the early 1920s was about fifteen. Yet the YMCA could not match the Boy Scouts of America’s explosive growth, from none before 1910 to 377,000 boys in 1920. By comparison, YMCA junior departments
enrolled 31,000 boys in 1900 and 219,000 by early 1921, plus 41,000 in Hi-Y. Further diffusion followed as YMCAs recruited all ages and broke the gender barrier in their quest for members. The late 1920s saw the creation of a “Friendly Indian” (as opposed to hostile?) program for boys under age twelve. In 1930 city YMCA members included 83,000 boys under age twelve and 232,000 of ages twelve through seventeen. By 1963 there were at least 557,000 and 479,000, respectively, plus 181,000 girls under age twelve and 246,000 girls ages twelve through seventeen. Together, they comprised just over half the YMCA’s total membership. As early as 1933, women gained full membership, though their numbers grew only gradually. By 1995 the YMCAs claimed 14 million members in the United States, of whom almost half were female and fully half were still under age eighteen. Although the national YMCA (Canada had organized separately) declared against segregation in 1946, southern white YMCAs resisted fiercely; racists could not imagine sharing a pool with African Americans. By 1967, however, the national board voted to require pledges of nondiscrimination from all YMCAs. Religious requirements relaxed in ways that alarmed YMCA traditionalists. In 1936 the National Council’s program services committee commissioned a guiding statement that defined the YMCA as “a world-wide fellowship of men and boys, united by a common loyalty to Jesus, for the purpose of developing Christian personality and building a Christian society” (Hopkins 1951, 524). Already conversion and Protestant orthodoxy were losing ground to personal development and social values. Since by 1951 nearly two-fifths of
members were Roman Catholic or Jewish, YMCAs had to accommodate religious diversity. Thus by 1969 at the YMCA’s Camp Becket in Massachusetts, the old ritual “wherein boys had taken Jesus to be their personal chum had . . . vanished. In its place was an evening candlelight service, at the height of which boys rose to pledge their commitment to building a just society” (Putney 1997, 237). The postwar YMCA followed the middle class to the suburbs; a building boom erected 338 family YMCAs by 1956, offering a wide variety of recreation to fee payers of all ages. Youth soccer and youth basketball for both sexes blossomed in the 1970s. Indicative of the YMCA’s abandonment of dogmatic moral instruction was the claim that “values education was the central and pervasive theme” of the basketball program, “articulated in terms of participation, good sportsmanship, skills development, and competition” (Johnson 1979, 392). Troubled by the social turmoil of the late 1960s, some associations accepted government funds and developed fairly successful programs for inner-city youths. But secretaries and laypeople loyal to the suburban branches often resisted change, and social concern waned by 1975, though a residue of job counseling, tutoring, and other community uplift programs remained in center-city branches. Family-oriented activities and service to individuals through values clarification and personal fitness programs predominated. As fitness and athletics became the main attraction, YMCAs of the 1980s and 1990s invested heavily in upscale facilities to rival private health clubs. Some critics see in YMCA history a story of declension, although their concerns differ. John Gustav-Wrathall laments the loss of the “intense friendship”
fused with religious conversion and devotion to the YMCA that inspired young men’s “ardent loyalty” to the nineteenth-century YMCA (Gustav-Wrathall 1998, 46). Pioneer boys’ workers likewise were often bachelors or as-yet-unmarried men devoted to the welfare of boys. But as early as the 1880s, Gustav-Wrathall believes, the addition of “physical culture and vigilance against sexual immorality” brought trouble (Gustav-Wrathall 1998, 46). The new emphasis on physical development and availability of public spaces potentially sexualized relationships. How much this involved boys is uncertain. Critics who exposed scandals in 1887 at the Chicago YMCA and in 1912 in Portland, Oregon, both mentioned “men and boys” (Gustav-Wrathall 1998, 164), but they were hostile observers, and the actual ages involved went unreported. Alerted by Hall’s ideas to the power of sexual instincts in adolescence and the need for sublimation as well as by the evident anxieties of conscientious YMCA juniors, YMCA boys’ workers of the early twentieth century put considerable energy into getting boys to pledge abstinence from masturbation. By the 1920s, expectations that secretaries would marry and increasing recruitment of women and girls undermined the remaining single-sex ethos of the YMCA. Denial of more acceptable emotional and religious outlets, Gustav-Wrathall suggests, may have helped to foster active gay cruising in downtown YMCAs of the post–World War II decades. Some historians have made a habit of denouncing American culture for shifting from sturdy, inner-directed character in the Victorian era to therapeutic and narcissistic values during the twentieth century. Clifford Putney argues similarly that
the post–World War II YMCA drifted, unsure of its mission, and then, after an interlude of social reform in the late 1960s and early 1970s, it turned to the gratification of individual needs through values clarification and physical fitness, making sure that members “felt good.” Quoting Christopher Lasch, Putney describes “the contemporary climate [as] therapeutic, not religious” and criticizes the YMCA for abandoning its earlier religiously based character building (Putney 1997, 243, 244).

As a membership organization heavily dependent on fees, however, the YMCA was weakly equipped to challenge American culture once the reformism of the Great Society era waned. And even in its heyday, despite the intense religious commitments it induced among a minority, YMCA character building for boys inculcated mainly a forceful conventionality. This was evident in the late 1910s when boys’ workers tried to formalize the YMCA ideal of fourfold character development in a Christian Citizenship Training Program. Seeking symmetrical development, group leaders were to score each boy’s physical, intellectual, religious, and social “efficiency,” plotting the four resulting numbers on a chart with crossed axes: “When the four points were joined, the lines formed a quadrilateral—more or less lopsided according to how badly, for instance, the boy’s religious score fell short of his physical one. Thus the perfect boy was a big square” (Macleod 1983, 125). In practice, many boys had to be pressured to enroll for more than sports and gymnastics.

David I. Macleod

See also Adolescence; Basketball; Boy Scouts; Camping; Muscular Christianity; Young Men’s Hebrew Association
References and further reading
Gustav-Wrathall, John Donald. 1998. Take the Young Stranger by the Hand: Same-Sex Relations and the YMCA. Chicago: University of Chicago Press.
Hopkins, C. Howard. 1951. History of the Y.M.C.A. in North America. New York: Association Press.
Johnson, Elmer L. 1979. The History of YMCA Physical Education. Chicago: Association Press.
Macleod, David I. 1983. Building Character in the American Boy: The Boy Scouts, YMCA, and Their Forerunners, 1870–1920. Madison: University of Wisconsin Press.
Mjagkij, Nina. 1994. Light in the Darkness: African Americans and the YMCA, 1852–1946. Lexington: University Press of Kentucky.
Putney, Clifford. 1997. “From Character to Body Building: The YMCA and the Suburban Metropolis, 1950–1980.” Pp. 231–249 in Men and Women Adrift: The YMCA and YWCA in the City. Edited by Nina Mjagkij and Margaret Spratt. New York: New York University Press.
Zald, Mayer N. 1970. Organizational Change: The Political Economy of the YMCA. Chicago: University of Chicago Press.
Young Men’s Hebrew Association

In 1854 in Baltimore, Maryland, a group of upper-class German Jewish immigrants established the first Young Men’s Hebrew Association (YMHA), marking the beginning of that organization’s crusade to promote literary, social, moral, and athletic activities for Jewish youth in the United States. In other urban areas, prominent German Jews organized YMHAs (originally named Young Men’s Hebrew Literary Associations) to provide social, literary, recreational, and religious activities for Jewish young men often excluded from Protestant social clubs because of anti-Semitism. The YMHAs patterned themselves after the Young Men’s Christian Association, established in the United States in 1851, offering facilities for reading and recreation and promoting spiritual values for Jewish youth.

Sabbath blessing at the nursery school of the YMHA and YWHA (Shirley Zeiberg)

Following the Civil War, the YMHA movement expanded greatly. YMHAs in late-nineteenth-century cities offered educational classes, athletics, lectures, and social programs in an effort to assimilate numerous Jewish immigrant young men and boys into American life. As the number of YMHAs increased and a national governing association emerged, plans developed to merge Young Men’s Hebrew Associations and Young Women’s Hebrew Associations (YWHAs), serving as the forerunner of the Jewish Community Center movement in the twentieth century.
The influx of European Jewish immigrants in the last decades of the nineteenth century prompted Jewish civic and religious leaders to build YMHAs to aid new immigrants with Americanization programs of English classes, civics, and physical education and sports within a Jewish environment. By 1900 about 100 Jewish YMHAs were in the United States, serving German Jews and newer European immigrants and their children. The organization of the New York City YMHA in 1874—the 92nd Street YMHA still in existence—and the Philadelphia YMHA in 1875 initiated the growth of “Jewish Ys” and expanded programs for young men and boys. YMHAs in cities
like New York, Philadelphia, Louisville, and New Orleans developed permanent facilities, attracting new members and advancing athletics for young men. Many YMHAs expanded from rooms for libraries and social clubs to include gymnasiums, swimming pools, bowling alleys, billiard rooms, and other recreational facilities. The New York YMHA provided gymnasium equipment in 1875 and then opened a full gymnasium in 1877 to serve the athletic interests of young men and boys. By integrating physical fitness with spiritual values, YMHAs sought to promote “muscular Judaism,” paralleling the “muscular Christianity” that the Young Men’s Christian Association encouraged among Protestant male youth, and used athletics and religion to counter stereotypes of Jews as weak. By 1910 more than 100 YMHAs existed with a membership of 20,000, and other YMHAs formed to serve Jewish communities.

At the Jewish Ys, however, the issue of participating in sports on the Sabbath required the attention of YMHA directors. The New York City YMHA faced the Sabbath issue by keeping the gymnasium open on the Sabbath only for “lighter exercises” and stipulating that “members were not allowed to practice on the trapeze and horizontal bars” (Rabinowitz 1948, 51). The YMHA of Louisville, organized primarily for the assistance of eastern European immigrants by well-established Jews like Isaac W. Bernheim, a Kentucky distiller and philanthropist, completed its gymnasium in 1890. Boys’ and men’s gymnasium classes were held on Monday, Wednesday, Friday, and Saturday, but the Saturday session met in the evening, from 8:00 to 9:30 P.M., to avoid religious conflicts on the Sabbath. YMHA administrators also
emphasized the need to prohibit gambling and drinking by young men in the YMHA facilities.

As other YMHAs built athletic facilities, the sports programs for boys increased, and competitive teams and champions emerged from some of these Jewish Ys. Jewish young men and boys participated in basketball, boxing, swimming, wrestling, track and field, baseball, volleyball, bowling, handball, tennis, Ping-Pong, and other sports. In particular, YMHA basketball teams competed against other YMHA teams and Jewish settlements in leagues, as well as against YMCA teams, independent teams, and Amateur Athletic Union teams for various age groups; YMHA boys’ and young men’s basketball players frequently competed in regional as well as national athletic competitions in the early to mid-1900s. The 92nd Street YMHA basketball teams gained national recognition under the guidance of their excellent physical education staff, led by outstanding basketball player and coach Nat Holman. Several YMHAs hosted competitive swimming meets sanctioned by the Amateur Athletic Union, drawing boys as both athletes and spectators.

YMHAs often sponsored outdoor summer camps for boys of various ages that focused on sporting activities and Jewish cultural life in rural settings. Some camps followed Jewish laws regarding kosher food and Sabbath observance, but others offered a Reform Judaism context as boys left urban areas for outdoor experiences. YMHA-affiliated boys’ summer camps included Louisville’s YMHA Camp, the 92nd Street YMHA’s Surprise Lake Camp, Philadelphia YMHA’s Camp Arthur, and St. Louis’s “Y” Camp at the Lake of the Ozarks. Information about summer camps, as well as announcements about
Young Men’s Hebrew Association happenings in music, drama, debate, physical education, various clubs, and Jewish holiday celebrations for members, appeared in YMHA publications. House organs like the Y.M.H.A. Bulletin of the 92nd Street YMHA, the Y Journal of the St. Louis YMHA-YWHA, the Chronicler of the Louisville YMHA, and the Criterion of the Paterson, New Jersey, YMHAYWHA publicized the boys’ athletic results and physical education and social programs. Jewish population centers changed in several cities in the first decades of the twentieth century, as Jews relocated from old ethnic centers to other parts of the same cities or to other cities, and YMHAs responded by reorganizing and relocating to reach Jewish youth. To facilitate cooperation and advice between YMHAs and YWHAs, in 1921 the National Jewish Welfare Board (JWB) was organized. The JWB became the national governing body for YMHAs and YWHAs and the National Council of Young Men’s Hebrew and Kindred Associations founded in 1913. The JWB actively promoted the merger of YMHAs and YWHAs and sought to develop them into Jewish Community Centers (JCCs) by the mid-twentieth century, thus combining Jewish and American cultural interests. For example, the Baltimore YMHA-YWHA, built in 1930, brought together the city’s Jewish community in a new facility for religious, educational, athletic, and social activities. A national campaign to improve Jewish community life for Americans of all social classes and religious backgrounds prompted staff of the JWB to work with local communities desiring to renovate YMHAs-YWHAs or
build new JCCs. Throughout the United States today, JCCs offer an array of educational classes, lectures, concerts, Jewish holiday celebrations, sports, and recreational activities for Jewish youth.

Linda J. Borish

See also Basketball; Camping; Young Men’s Christian Association

References and further reading
Borish, Linda J. 1996. “National Jewish Welfare Board Archives, Young Men’s–Young Women’s Hebrew Association Records: A Research Guide.” Archives and Manuscript Collections, American Jewish Historical Society, Waltham, MA, and New York, NY, November, 1–16.
———. 1999. “‘An Interest in Physical Well-Being among the Feminine Membership’: Sporting Activities for Women at Young Men’s and Young Women’s Hebrew Associations.” American Jewish History 87, no. 1 (March): 61–93.
Kirsch, George B. 2000. “Young Men’s Hebrew Association.” Pp. 501–502 in Encyclopedia of Ethnicity and Sports in the United States. Edited by George B. Kirsch, Othello Harris, and Claire E. Nolte. Westport, CT: Greenwood Press.
Kraft, Louis. 1941. “Center, The Jewish.” In The Universal Jewish Encyclopedia. Edited by Isaac Landman.
Langfeld, William. 1928. The Young Men’s Hebrew Association of Philadelphia: A Fifty-Year Chronicle. Philadelphia: Young Men’s and Young Women’s Hebrew Association of Philadelphia.
Levine, Peter. 1992. Ellis Island to Ebbets Field: Sport and the American Jewish Experience. New York: Oxford University Press.
Rabinowitz, Benjamin. 1948. The Young Men’s Hebrew Association (1854–1913). New York: National Jewish Welfare Board.
Riess, Steven A., ed. 1998. Sports and the American Jew. Syracuse: Syracuse University Press.
Bibliography
Abbott, Douglas A., and Gene H. Brody. 1985. “The Relation of Child Age, Gender, and Number of Children to the Marital Adjustment of Wives.” Journal of Marriage and the Family 47: 77–84.
Acker, Joan. 1990. “Hierarchies, Jobs, Bodies: A Theory of Gendered Organizations.” Gender and Society 4, no. 2. Acker, Joan, and Donald R. Van Houten. 1974. “Differential Recruitment and Control: The Sex Structuring of Organizations.” Administrative Science Quarterly 19, no. 2.
Abel, Ernest. 1977. The Handwriting on the Wall: Toward a Sociology and Psychology of Graffiti. Westport, CT: Greenwood Press.
Adams, Judith. 1991. The American Amusement Park Industry: A History of Technology and Thrills. Boston, MA: Twayne Publishers.
Accessibility of Firearms and the Use of Firearms by or against Juveniles. 2000. Washington, DC: Office of Juvenile Justice and Delinquency Prevention, U.S. Department of Justice.
Adelman, Melvin L. 1986. A Sporting Time: New York City and the Rise of Modern Athletics, 1820–1870. Urbana: University of Illinois Press.
Achatz, Mary, and Crystal A. MacAllum. 1994. Young Unwed Fathers: Report from the Field. Philadelphia: Public/Private Ventures.
Adler, Naomi A., and Joseph Schutz. 1995. “Sibling Incest Offenders.” Child Abuse and Neglect 19: 811–819.
Acker, Joan. 1987. “Sex Bias in Job Evaluation: A Comparable Worth Issue.” In Ingredients for Women’s Employment Policy. Edited by C. Bose and G. Spitze. Albany: SUNY Press.
Ahmed, Yvette, and Peter K. Smith. 1994. “Bullying in Schools and the Issue of Sex Differences.” Pp. 70–83 in Male Violence. Edited by John Archer. New York: Routledge.
Acker, Joan. 1988. “Class, Gender and the Relations of Distribution.” Signs: Journal of Women in Culture and Society 13.
Ahrons, Constance, and Richard B. Miller. 1993. “The Effect of the Postdivorce Relationship on Parental Involvement: A Longitudinal Analysis.” American Journal of Orthopsychiatry 63, no. 3: 441–450.
Acker, Joan. 1989. Doing Comparable Worth: Gender, Class and Pay Equity. Philadelphia: Temple University Press.
Alan Guttmacher Institute. 1994. Sex and America’s Teenagers. New York: Alan Guttmacher Institute.
American Sunday School Union. 1825–1830. Annual Reports. Philadelphia: American Sunday School Union.
Alan Guttmacher Institute. 1999. Facts in Brief: Teen Sex and Pregnancy. New York: Alan Guttmacher Institute.
American Sunday School Union, Committee of Publications. 1827. Election Day. Philadelphia: American Sunday School Union.
Alexander, Lloyd. 1968. The High King. New York: Bantam Doubleday Dell.
Amsel, Eric, and J. David Smalley. 1999. “Beyond Really and Truly: Children’s Counterfactual Thinking about Pretend and Possible Worlds.” Pp. 99–134 in Children’s Reasoning and the Mind. Edited by K. Riggs and P. Mitchell. Brighton, UK: Psychology Press.
Alger, Horatio. 1872. Phil the Fiddler; or, the Story of a Young Street Musician. New York: Federal Book Company. ———. 1973. Silas Snobden’s Office Boy. 1889–1890. Reprint, Garden City, NY: Doubleday. ———. 1985. Ragged Dick; or, Street Life in New York. 1867. Reprinted in Ragged Dick and Struggling Upward. New York: Penguin.
Anderson, James D. 1988. The Education of Blacks in the South, 1860–1935. Chapel Hill: University of North Carolina Press. Anderson, Kristen. 1997. “Gender Bias and Special Education Referrals.” Annals of Dyslexia 47: 151–162.
Allen, E. John B. 1993. From Skisport to Skiing: One Hundred Years of American Sport, 1849–1940. Amherst: University of Massachusetts Press.
Anderson, Nels, and Raffaele Rauty. 1998. On Hobos and Homelessness. Chicago: University of Chicago Press.
Allen, Gay Wilson. 1981. Waldo Emerson: A Biography. New York: Viking.
Andrews, Dee E. 2000. The Methodists and Revolutionary America. Princeton, NJ: Princeton University Press.
Allen, Sarah M., and Alan J. Hawkins. 1999. “Maternal Gatekeeping: Mothers’ Beliefs and Behaviors That Inhibit Greater Father Involvement in Family Work.” Journal of Marriage and the Family 61: 199–212.
Ang, Ien, and Joke Hermes. 1991. “Gender and/in Media Consumption.” In Mass Media and Society. Edited by James Curran and Michael Gurevitch. New York: Routledge.
American Association of Orthodontists. 2001. “Orthodontics Online,” http://www.aaortho.org/ (accessed March 2001). American Camping Association. 1998. Accreditation Standards for Camp Programs and Services. Martinsville, IN: ACA. ———. 1999. Guide to ACA-Accredited Camps. Martinsville, IN: ACA. ———. 2000. “ACA Fact Sheet,” http://www.acacamps.org/media (accessed June 25, 2000).
Anonymous. 1724. Onania; or the Heinous Sin of Pollution, and All Its Frightful Consequences, in Both Sexes, Considered. 10th ed. Boston: John Phillips. Anson, J. L., and Robert F. Marchesani, Jr., eds. 1991. Baird’s Manual of American College Fraternities. 20th ed. Indianapolis: Baird’s Manual Foundation. Anthony, Michael J. 2000. Foundations of Ministry: An Introduction to Christian Education for a New Generation. Grand Rapids, MI: Baker Books.
Anti-Slavery Melodies: For the Friends of Freedom; Prepared by the Hingham Anti-Slavery Society. 1843. Hingham: Elijah B. Gill. Appleby, Joyce. 2000. Inheriting the Revolution. Cambridge: Harvard University Press. Appleton’s Elementary Geography. 1908. New York: American Book Company. Aptheker, Herbert, ed. 1997. The Correspondence of W. E. B. Du Bois. Vol. 1, Selections 1877–1934. Amherst: University of Massachusetts Press.
Ashe, Arthur, and Arnold Rampersad. 1993. Days of Grace. New York: Alfred A. Knopf. Astin, W. A. 1977. Four Critical Years: Effects of College on Beliefs, Attitudes, and Knowledge. San Francisco: Jossey-Bass. AtariWorld.com. “The Atari Timeline,” http://www.atariworld.com/AtariTimeline.html (accessed December 27, 2000). Athens, Lonnie. 1989. The Creation of Dangerous Violent Criminals. London: Routledge.
“Are Newspapers Taking Advantage of Child Labor?” 1988. Stark Metropolitan Magazine (April): 8–10.
Athens, Lonnie. 1992. The Creation of Dangerous Violent Criminals. Urbana: University of Illinois Press.
Aries, Philippe. 1962. Centuries of Childhood: A Social History of Family Life. Translated by Robert Baldick. New York: Vintage.
Athens, Lonnie. 1997. Violent Criminal Acts and Actors Revisited. Urbana: University of Illinois Press.
Arnett, Jeffrey J. 2000. “Emerging Adulthood: A Theory of Development from the Late Teens through the Twenties.” American Psychologist 55: 469–480.
Athens, Lonnie. 1998. “Dominance, Ghettoes, and Violent Crime.” Sociological Quarterly 39 (Fall): 673–691. Athletic Sports for Boys: A Repository of Graceful Recreations for Youth. 1866. New York: Dick and Fitzgerald.
Ascione, Frank, and Phil Arkow, eds. 1998. Child Abuse, Domestic Violence and Animal Abuse. West Lafayette, IN: Purdue University Press.
Atlas, Rona, and Debra Pepler. 1998. “Observations of Bullying in the Classroom.” Journal of Educational Research 92: 1–86.
Atwater, Montgomery M. 1943. Ski Patrol. New York: Random House.
Ashby, LeRoy. 1983. Saving the Waifs: Reformers and Dependent Children, 1890–1917. Philadelphia: Temple University Press. ———. 1997. Endangered Children: Dependency, Neglect, and Abuse in American History. New York: Twayne Publishers. Ashe, Arthur, with Alexander McNabb. 1995. Arthur Ashe on Tennis. New York: Alfred A. Knopf.
Austin, Joe. 2001. Taking the Train: Youth, Urban Crisis, Graffiti. New York: Columbia University Press. Avery, Gillian. 1975. Childhood’s Pattern: A Study of Heroes and Heroines of Children’s Fiction, 1750–1950. London: Hodder and Stoughton. ———. 1994. Behold the Child: American Children and Their Books 1621–1922. Baltimore: Johns Hopkins University Press.
Avrich, Paul. 1980. The Modern School Movement: Anarchism and Education in the United States. Princeton: Princeton University Press. Axtell, James, ed. 1981. The Indian Peoples of Eastern America: A Documentary History of the Sexes. New York: Oxford University Press. Ayers, William. 1997. A Kind and Just Parent: The Children of Juvenile Court. Boston: Beacon Press. Aylesworth, Thomas. 1987. Hollywood Kids: Child Stars of the Silver Screen from 1903 to the Present. New York: Dutton. Babbit, Nicki. 2000. Adolescent Drug and Alcohol Abuse: How to Spot It, Stop It, and Get Help for Your Family. Sebastopol, CA: O’Reilly. Badger, Anthony. 1989. The New Deal: The Depression Years, 1933–1940. New York: Noonday Press. Bagnall, William. 1893. The Textile Industries of the United States. Cambridge: Riverside Press. Bailey, Anthony. 1980. America, Lost and Found. New York: Random House. Bailey, Beth L. 1988. From Front Porch to Back Seat. Baltimore: Johns Hopkins University Press. Baird, Leonard L. 1977. The Schools: A Profile of Prestigious Independent Schools. Lexington, MA: D. C. Heath. Baker, Karen (external relations, Boy Scouts of America). 2000. Telephone conversation, May 30.
Bancroft, Hubert Howe. 1888. California Inter Pocula. San Francisco: History Company. Bardaglio, Peter W. 1992. “The Children of Jubilee: African American Childhood in Wartime.” Pp. 213–229 in Divided Houses: Gender and the Civil War. Edited by Catherine Clinton and Nina Silber. New York: Oxford University Press. Barish, Evelyn. 1989. Emerson: The Roots of Prophecy. Princeton: Princeton University Press. Barker, David. 2000. “Television Production Techniques as Communication.” In Television: The Critical View. 6th ed. Edited by Horace Newcomb. New York: Oxford University Press. Barnett, James. 1954. The American Christmas: A Study in National Culture. New York: Macmillan. Barnouw, Erik. 1978. The Sponsor: Notes on a Modern Potentate. New York: Oxford University Press. ———. 1990. Tube of Plenty: The Evolution of American Television. 2d rev. ed. Oxford: Oxford University Press. Barson, Michael. 1985. “The TV Western.” Pp. 57–72 in TV Genres: A Handbook and Reference Guide. Edited by Brian G. Rose. Westport, CT: Greenwood Press. Barth, Richard P., Mark Claycomb, and Amy Loomis. 1988. “Services to Adolescent Fathers.” Health and Social Work 13: 277–287.
Baldwin, Henry Ives. 1989. The Skiing Life. Concord, NH: Evans Printing.
Barton, Bruce. 1925. The Man Nobody Knows: A Discovery of the Real Jesus. Indianapolis: Bobbs-Merrill.
Ball, Charles. 1969. Slavery in the United States. 1837. Reprint, New York: Negro Universities Press.
Baruch, Dorothy W. 1942. You, Your Children and War. New York: D. Appleton-Century Company.
Bass, Ellen, and Kate Kaufman. 1996. Free Your Mind: The Book for Gay, Lesbian, and Bisexual Youth—and Their Allies. New York: HarperCollins. Batchelor, Dean. 1995. The American Hot Rod. Osceola, WI: Motorbooks International. Bateson, Gregory. 1972. “A Theory of Play and Fantasy.” Pp. 177–193 in Steps to an Ecology of Mind. New York: Ballantine. Baur, John E. 1978. Growing Up with California: A History of California’s Children. Los Angeles: Will Kramer. Beach, E. P. 1888. “A Day in the Life of a Newsboy.” Harper’s Young People 9 (January 17): 202. Beal, C. R. 1994. Boys and Girls: The Development of Gender Roles. New York: McGraw-Hill. Beckman, Frank J. 1962. “The Vanished Villains: An Exercise in Nostalgia.” Unpublished manuscript, Billy Rose Theater Collection, New York Public Library at Lincoln Center. Bedard, Roger L., ed. 1984. Dramatic Literature for Children: A Century in Review. New Orleans, LA: Anchorage Press. Bedard, Roger L., and C. John Tolch, eds. 1989. Spotlight on the Child: Studies in the History of American Children’s Theatre. Westport, CT: Greenwood Press. Bederman, Gail. 1989. “‘The Women Have Had Charge of the Church Work Long Enough’: The Men and Religion Forward Movement of 1911–1912 and the Masculinization of Middle-Class Protestantism.” American Quarterly 41, no. 3 (September): 432–465. ———. 1995. Manliness and Civilization: A Cultural History of Gender and Race in the United States, 1880–1917. Chicago: University of Chicago Press.
Behr, Edward. 1996. Prohibition. New York: Arcade. Beisel, Nicola. 1997. Imperiled Innocents: Anthony Comstock and Family Reproduction in Victorian America. Princeton: Princeton University Press. Beiswinger, George L. 1985. One to One: The Story of the Big Brothers/Big Sisters Movement in America. Philadelphia: Big Brothers/Big Sisters of America. Belk, Russell. 1987. “A Child’s Christmas in America: Santa Claus as Deity, Consumption as Religion.” Journal of American Culture 10, no. 1: 87–100. ———. 1990. “Halloween: An Evolving American Consumption Ritual.” In Advances in Consumer Research. Vol. 17. Edited by M. Goldberg, Gerald Gorn, and Richard Pollay. Chicago: University of Chicago Press. Bellesiles, Michael A. 2000. Arming America: The Origins of the National Gun Culture. New York: Alfred A. Knopf. Bellingham, Bruce. 1984. “‘Little Wanderers’: A Socio-Historical Study of the Nineteenth Century Origins of Child Fostering and Adoption Reform, Based on Early Records of the New York Children’s Aid Society.” Ph.D. diss., University of Pennsylvania. Bendroth, Margaret Lamberts. 1997. “Men, Masculinity, and Urban Revivalism: J. Wilbur Chapman’s Boston Crusade.” Journal of Presbyterian History 75, no. 4 (Winter): 235–246. Beneke, Timothy. 1997. Proving Manhood: Reflections on Men and Sexism. Berkeley: University of California Press. Bennahum, David S. 1998. Extra Life: Coming of Age in Cyberspace. New York: Basic Books.
Bennett, Paula, and Vernon A. Rosario II, eds. 1995. Solitary Pleasures: The Historical, Literary, and Artistic Discourses of Autoeroticism. New York: Routledge. Bennett, William, John DiIulio, and John Waters. 1996. Body Count: Moral Poverty—and How to Win America’s War against Crime and Drugs. New York: Simon and Schuster. Benshoff, Harry M. 1997. Monsters in the Closet: Homosexuality and the Horror Film. Manchester: Manchester University Press. Benston, M. L. 1985. “The Myth of Computer Literacy.” Canadian Women’s Studies 5: 20–22. Berch, B. 1984. “For Women the Chips Are Down.” Processed World 11, no. 2: 42–46. Bergman, Andrew. 1971. We’re in the Money: Depression America and Its Films. New York: New York University Press. Bernstein, Irving. 1985. A Caring Society: The New Deal, the Worker, and the Great Depression. Boston: Houghton Mifflin. Bernstein, Rhona J. 1996. Attack of the Leading Ladies: Gender, Sexuality and Spectatorship in Classic Horror Cinema. New York: Columbia University Press. Berrol, Selma C. 1995. Growing Up American: Immigrant Children in America Then and Now. New York: Twayne. Berryman, Jack W. 1975. “From the Cradle to the Playing Field: America’s Emphasis on Highly Organized Competitive Sports for Preadolescent Boys.” Journal of Sport History (Fall): 112–131. Best, Joel. 1985. “The Myth of the Halloween Sadist.” Psychology Today 19: 14–19.
Best, Joel, and Gerald Horiuchi. 1985. “The Razor Blades in the Apple: The Social Construction of Urban Legends.” Social Problems 32: 488–499. Bettelheim, Bruno. 1976. The Uses of Enchantment: The Meaning and Importance of Fairy Tales. New York: Alfred A. Knopf. Betts, John R. 1974. America’s Sporting Heritage: 1850–1950. Reading, MA: Addison-Wesley. Bezilla, Robert, ed. 1988. America’s Youth: 1977–1988. Princeton, NJ: Gallup. Bigelow, Jim. 1994. The Joy of Uncircumcising! Exploring Circumcision: History, Myths, Psychology, Restoration, Sexual Pleasure and Human Rights. 2d ed. Aptos, CA: Hourglass Books. Biller, H. B. 1981. “Father Absence, Divorce, and Personality Development.” In The Role of the Father in Child Development. 2d ed. Edited by M. E. Lamb. New York: John Wiley. Bissinger, H. G. 1990. Friday Night Lights: A Town, a Team, a Dream. Reading, MA: Addison-Wesley. Blacher, Jan. 1994. When There’s No Place Like Home: Options for Children Living Apart from Their Natural Families. Baltimore: P. H. Brookes Publishers. Blake, Peter. 1973. “The Lessons of the Parks.” Architectural Forum (June): 28ff. Blankenhorn, David. 1995. Fatherless America: Confronting Our Most Urgent Social Problem. New York: Basic Books. Bliven, Bruce. 1968. “A Prairie Boyhood.” The Palimpsest 49, no. 8: 308–352. Block, Carolyn Rebecca, Antigone Christakos, and R. Przybylski. 1996. “Street Gangs and Crime: Patterns and Trends in Chicago.” Research Bulletin.
Chicago: Criminal Justice Information Authority. Block, Jeanne H., Jack Block, and Per F. Gjerde. 1986. “The Personality of Children Prior to Divorce: A Prospective Study.” Child Development 57, no. 4: 827–840. ———. 1988. “Parental Functioning and the Home Environment in Families of Divorce: Prospective and Concurrent Analyses.” Journal of the American Academy of Child and Adolescent Psychiatry 27: 207–213. Bloom, John. 1997. A House of Cards: Baseball Card Collecting and Popular Culture. Minneapolis: University of Minnesota Press. Blumer, Herbert. 1997. “Foreword.” Pp. 3–6 in Violent Criminal Acts and Actors Revisited by Lonnie Athens. Urbana: University of Illinois Press. Bock, Richard, and Abigail English. 1973. Got Me on the Run: A Study of Runaways. Boston: Beacon Press. Boddy, William. 1990. Fifties Television: The Industry and Its Critics. Chicago: University of Illinois Press. Boles, John B. 1972. The Great Revival, 1787–1805: The Origins of the Southern Evangelical Mind. Lexington: University of Kentucky Press. Books, Sue, ed. 1998. Invisible Children in the Society and Its Schools. Mahwah, NJ: Erlbaum. Borish, Linda J. 1987. “The Robust Woman and the Muscular Christian: Catharine Beecher, Thomas Higginson and Their Vision of American Society, Health, and Physical Activities.” International Journal of the History of Sport: 139–154. ———. 1996. “National Jewish Welfare Board Archives, Young Men’s–Young
Women’s Hebrew Association Records: A Research Guide.” Archives and Manuscript Collections, American Jewish Historical Society, Waltham, MA, and New York, NY, November, 1–16. ———. 1999. “‘An Interest in Physical Well-Being among the Feminine Membership’: Sporting Activities for Women at Young Men’s and Young Women’s Hebrew Associations.” American Jewish History 87, no. 1 (March): 61–93. ———. Forthcoming. Landmarks of American Sports. American Landmarks Series. Edited by James O. Horton. New York: Oxford University Press. Bose, Michael. 1987. “Boys Town: New Ways but Respect for the Past.” U.S. News and World Report, March 20: 38–39. Boston Temperance Songster: A Collection of Songs and Hymns for Temperance Societies, Original and Selected. 1844. Boston: William White. Bosworth, Kris, Dorothy L. Espelage, and Thomas R. Simon. 1999. “Factors Associated with Bullying Behavior in Middle School Students.” Journal of Early Adolescence 19: 341–362. Boy Scouts of America. 1998. “Annual Report,” http://www.scouting.org/excomm/98annual/yir1998.html (accessed May 28, 2000). ———. 1999. “1999 Annual Report,” http://bsa.scouting.org/nav/pub/news.html (accessed May 14, 2001). Boyd, Billy Ray. 1998. Circumcision Exposed: Rethinking a Medical and Cultural Tradition. Freedom, CA: Crossing Press. Boyd, Brendan, and Frederick Harris. 1973. The Great American Baseball Card Flipping, Trading, and Bubble Gum Book. New York: Warner Paperbacks.
Boyer, Paul. 1978. Urban Masses and Moral Order in America, 1820–1920. Cambridge: Harvard University Press. Boylan, Anne M. 1988. Sunday School: The Foundation of an American Institution, 1790–1880. New Haven: Yale University Press. Boys and Girls Clubs of America. 2000. “Who We Are: The Facts,” http://www.bgca.org/whoweare/facts.asp (accessed May 14, 2001). Brace, Charles Loring. 1872. The Dangerous Classes of New York and Twenty Years’ Work among Them. New York: Wynkoop and Hallenbeck. Bragg, George W. 1999. The Big Book. Privately published. Cf. http://216.147.109.215/bragg.html (accessed March 11, 2001). Braine, Marty, and David O’Brien. 1998. Mental Logic. Mahwah, NJ: Lawrence Erlbaum. Braithwaite, Richard Bevan. 1953. Scientific Explanation: A Study of the Function of Theory, Probability and Law in Science. Cambridge, UK: Cambridge University Press. Brands, H. W. 1997. T. R.: The Last Romantic. New York: Basic Books. Brandt, Allan M. 1985. No Magic Bullet: A Social History of Venereal Disease in the United States since 1880. New York: Oxford University Press. Brauer, Ralph. 1975. The Horse, the Gun, and the Piece of Property: Changing Images of the TV Western. Bowling Green: Popular Press. Brave Boys: New England Traditions in Folk Music. 1995. New World Records. Breen, T. H., and Stephen Innes. 1980. “Myne Owne Ground”: Race and Freedom on Virginia’s Eastern Shore,
1640–1676. New York: Oxford University Press. Brevda, William. 1986. Harry Kemp, the Last Bohemian. Lewisburg, PA: Bucknell University Press. Brewer, John. 1997. The Pleasures of the Imagination: English Culture in the Eighteenth Century. New York: Farrar, Straus and Giroux. Bristow, Nancy K. 1996. Making Men Moral: Social Engineering during the Great War. New York: New York University Press. Brockett, Oscar G., and Frank Hildy. 1999. History of the Theatre. 8th ed. Boston: Allyn and Bacon. Brody, C. J., and L. C. Steelman. 1985. “Sibling Structure and Parental Sex-Typing of Children’s Household Tasks.” Journal of Marriage and the Family 47: 265–273. Brody, Gene H., Zolinda Stoneman, and Carol MacKinnon. 1986. “Contributions of Maternal Child-rearing Practices and Interactional Contexts to Sibling Interactions.” Journal of Applied Developmental Psychology 7: 225–236. Brody, Gene H., Zolinda Stoneman, and J. Kelly McCoy. 1992. “Parental Differential Treatment of Siblings and Sibling Differences in Negative Emotionality.” Journal of Marriage and the Family 54: 643–651. ———. 1994. “Contributions of Family Relationships and Child Temperaments to Longitudinal Variations in Sibling Relationship Quality and Sibling Relationship Styles.” Journal of Family Psychology 8: 274–286. Bronner, Simon J. 1988. American Children’s Folklore. Little Rock, AR: August House. Brooke, Michael. 1999. The Concrete Wave: The History of Skateboarding. Toronto, Ont.: Warwick. Brooks, Tim, and Earle Marsh. 1979. The Complete Directory to Prime Time Network TV Shows 1946–Present. New York: Ballantine.
Brown, Sally. 1990. If the Shoes Fit: Final Report and Program Implementation Guide of the Maine Young Fathers Project. Portland: Human Services Development Institute, University of Southern Maine. Browne, J., and V. Minichiello. 1995. “The Social Meanings behind Male Sex Work: Implications for Sexual Interactions.” British Journal of Sociology 46, no. 4: 598–622. ———. 1996. “The Social and Work Context of Commercial Sex between Men: A Research Note.” Australian and New Zealand Journal of Sociology 32, no. 1: 86–92. Browne, Porter Emerson. 1909. “The Mellowdrammer.” Everybody’s Magazine (September): 347–354. Browning, Don, ed. 1997. From Culture Wars to Common Ground: Religion and the American Family Debate. Louisville: Westminster/John Knox. Bruegman, Bill. 1992. Toys of the Sixties. Akron, OH: Cap’n Penny Productions. Brumberg, Joan J. 1997. The Body Project: An Intimate History of American Girls. New York: Random House. Bryant, Brenda. 1990. “The Richness of the Child-Pet Relationship.” Anthrozoös 3, no. 4: 253–261. Bryant, Jennings. 1985. Testimony to the Attorney General’s Commission on Pornography Hearings. Houston, Texas. Bryk, Anthony, and Valerie Lee. 1986. “Effects of Single Sex Secondary Schools on Student Achievement and Attitudes.” Journal of Educational Psychology 78. Buckingham, David. 1993. Children Talking Television: The Making of Television Literacy. London: Falmer Press. Buhle, Mari Jo, Paul Buhle, and Dan Georgakas, eds. 1998. Encyclopedia of the American Left. 2d ed. New York: Oxford University Press.
Buhrmester, Duane, and Wyndol Furman. 1990. “Perceptions of Sibling Relationships during Middle Childhood and Adolescence.” Child Development 61: 1387–1398. Bulkley, L. Duncan. 1894. Syphilis in the Innocent (Syphilis Insontium) Clinically and Historically Considered with a Plan for the Legal Control of the Disease. New York: Bailey and Fairchild. Bullough, Vern L. 1976. Sexual Variance in Society and History. Chicago: University of Chicago Press. Bureau of the Census. 1965. The Statistical History of the United States. Washington, DC: Government Printing Office. Burge, Penny L., and Steven M. Culver. 1994. “Gender Equity and Empowerment in Vocational Education.” Pp. 51–63 in Critical Education for Work: Multidisciplinary Approaches. Edited by Richard D. Lakes. Norwood, NJ: Ablex. Burger, Jim. 1976. In Service: A Documentary History of the Baltimore City Fire Department. Baltimore: Paradigm Books. Burnham, John. 1993. Bad Habits: Drinking, Smoking, Taking Drugs, Gambling, Sexual Misbehavior, and Swearing in American History. New York: New York University Press. Burstyn, Varda. 1999. The Rites of Men: Manhood, Politics, and the Culture of Sport. Toronto: University of Toronto Press.
Burton, Linda M., Peggy Dilworth-Anderson, and Cynthia Merriwether-de Vries. 1995. “Context and Surrogate Parenting among Contemporary Grandparents.” Marriage and Family Review 20: 349–366. Buscombe, Edward. 1988. The BFI Companion to the Western. New York: Da Capo Press. Buscombe, Edward, and Roberta E. Pearson. 1998. Back in the Saddle Again: New Essays on the Western. London: BFI Press. Bushman, Richard. 1992. The Refinement of America: Persons, Houses, Cities. New York: Alfred A. Knopf. Butterfield, Lyman H., ed. 1961. The Adams Papers: Diary and Autobiography of John Adams. Cambridge, MA: Belknap Press of Harvard University Press. ———. 1966. The Earliest Diary of John Adams: June 1753–April 1754, September 1758–January 1759. Cambridge, MA: Belknap Press of Harvard University Press. Bybee, Rodger W., and E. Gordon Gee. 1982. Violence, Values, and Justice in the Schools. Boston: Allyn and Bacon. Cabot, James Elliot. 1887. A Memoir of Ralph Waldo Emerson. 2 vols. Boston: Houghton Mifflin. Caillois, Roger. 1979. Man, Play, and Games. Translated by Meyer Barash. New York: Schocken Books. Calvert, Karin. 1992. Children in the House: The Material Culture of Early Childhood, 1600–1900. Boston: Northeastern University Press. Camara, K., and G. Resnick. 1989. “Styles of Conflict Resolution and Cooperation between Divorced Parents: Effects on Child Behavior and Adjustment.”
American Journal of Orthopsychiatry 59, no. 4: 560–575. Cameron, Ian, and Douglas Pye. 1996. The Book of Westerns. New York: Continuum. “Camping Then and Now.” 1999. Camping Magazine 72 (November–December): 18–31. Canada, G. 1998. Reaching Up for Manhood: Transforming the Lives of Boys in America. Boston: Beacon Press. Canetto, Silvia Sara. 1997a. “Meanings of Gender and Suicidal Behavior among Adolescents.” Suicide and Life-Threatening Behaviors 27: 339–351. ———. 1997b. “Gender and Suicidal Behavior: Theories and Evidence.” Pp. 138–167 in Review of Suicidology. Edited by R. W. Maris, M. M. Silverman, and Canetto. New York: Guilford. Canetto, Silvia Sara, and David Lester. 1995. “Gender and the Primary Prevention of Suicide Mortality.” Suicide and Life-Threatening Behavior 25: 58–69. Canetto, Silvia Sara, and Isaac Sakinofsky. 1998. “The Gender Paradox in Suicide.” Suicide and Life-Threatening Behavior 28: 1–23. Cannon, Donald J. 1977. Heritage of Flames. New York: Doubleday. Caplow, Theodore. 1984. “Rule Enforcement without Visible Means.” American Journal of Sociology 89, no. 6: 1306–1323. Caplow, Theodore, Howard Bahr, and Bruce Chadwick. 1983. All Faithful People: Change and Continuity in Middletown’s Religion. Minneapolis: University of Minnesota Press. Cappon, Lester J., ed. 1959. The Adams-Jefferson Letters. Chapel Hill: University of North Carolina Press.
Carey, Susan. 1985. Conceptual Change in Childhood. Cambridge, MA: MIT Press. Carlson, Eve B. 1984. “Children’s Observations of Interparental Violence.” In Battered Women and Their Families. Edited by A. R. Roberts. New York: Springer Publishing. Carnegie Council on Adolescent Development, Task Force on Youth Development and Community Programs. 1992. A Matter of Time: Risk and Opportunity in the Nonschool Hours. New York: Carnegie Corporation of New York. Carp, E. Wayne. 1998. Family Matters: Secrecy and Disclosure in the History of Adoption. Cambridge, MA: Harvard University Press. Carp, E. Wayne, ed. 2001. Historical Perspectives on American Adoption. Ann Arbor: University of Michigan Press. Carr, Lois Green, and Russell R. Menard. 1979. “Immigration and Opportunity: The Freedman in Early Colonial Maryland.” Pp. 206–242 in The Chesapeake in the Seventeenth Century. Edited by Thad W. Tate and David L. Ammerman. New York: W. W. Norton. Carr, Lois Green, and Lorena S. Walsh. 1979. “The Planter’s Wife: The Experience of White Women in Seventeenth-Century Maryland.” In A Heritage of Her Own: Toward a New Social History of American Women. Edited by Nancy F. Cott and Elizabeth H. Pleck. New York: Simon and Schuster. Carr, Lois Green, Russell R. Menard, and Lorena S. Walsh. 1991. Robert Cole’s World: Agriculture and Society in Early Maryland. Chapel Hill: University of North Carolina Press. Carrier, James. 1986. Learning Disability: Social Class and the Construction of
Inequality in American Education. Westport, CT: Greenwood Press. Carroll, James D., et al. 1987. We the People: A Review of U.S. Government and Civics Textbooks. Washington, DC: People for the American Way. Carson, Cary, Ronald Hoffman, and Peter J. Albert, eds. 1994. Of Consuming Interests: The Style of Life in the Eighteenth Century. Charlottesville: University Press of Virginia. Carter, Patricia A. 1994. “Women’s Workplace Equity: A Feminist View.” Pp. 67–81 in Critical Education for Work: Multidisciplinary Approaches. Edited by Richard D. Lakes. Norwood, NJ: Ablex. Cartmill, Matt. 1993. A View to a Death in the Morning: Hunting and Nature through History. Cambridge, MA: Harvard University Press. Cartwright, Peter. 1856. The Autobiography of Peter Cartwright, the Backwoods Preacher. Edited by W. P. Strickland. Cincinnati: L. Swormstedt and A. Poe. Cary, Diana Serra. 1979. Hollywood’s Children: An Inside Account of the Child Star Era. Boston: Houghton Mifflin. Case, Carl. 1906. The Masculine in Religion. Philadelphia: American Baptist Publishers Society. Cashin, Joan E. 1991. A Family Venture: Men and Women on the Southern Frontier. New York: Oxford University Press. Casper, Lynne, and Kenneth Bryson. 1998. Co-resident Grandparents and Their Grandchildren: Grandparent Maintained Families. Population Division Working Paper no. 26. Washington, DC: Population Division, U.S. Bureau of the Census. Cassorla, Albert. 1976. The Skateboarder’s Bible. Philadelphia: Running Press.
Cassuto, Leonard. 1997. The Inhuman Race: The Racial Grotesque in American Literature and Culture. New York: Columbia University Press. Catalogue of American Portraits, National Portrait Gallery, Smithsonian Institution. http://www.npg.si.edu/inf/ceros.htm (accessed March 24, 2001). Cavallo, Dominick. 1981. Muscles and Morals: Organized Playgrounds and Urban Reform, 1880–1920. Philadelphia: University of Pennsylvania Press. Cawelti, John. 1965. Apostles of the Self-Made Man: Changing Concepts of Success in America. Chicago: University of Chicago Press. ———. 1985. The Six-Gun Mystique. Rev. ed. Bowling Green: Bowling Green University Popular Press. Cayton, Andrew R. L. 1993. “The Early National Period.” Vol. 1, p. 100 in Encyclopedia of American Social History. Edited by Mary Kupiec Cayton, Elliott J. Gorn, and Peter W. Williams. New York: Scribner’s. Cayton, Mary Kupiec. 1989. Emerson’s Emergence. Chapel Hill: University of North Carolina Press. CDC (Centers for Disease Control and Prevention). 1998. “Suicide among Black Youths—United States, 1980–1995.” Journal of the American Medical Association 279, no. 18: 1431. ———. 1999. “Division of Adolescent and School Health’s Information Service Report.” Silver Springs, MD: Government Printing Office. Censer, Jane Turner. 1984. North Carolina Planters and Their Children, 1800–1860. Baton Rouge: Louisiana State University Press.
Chafetz, Janet. 1980. “Toward a Macro-Level Theory of Sexual Stratification.” Current Perspectives in Social Theory 1. Champlin, John D., Jr., and Arthur E. Bostwick. 1890. The Young Folks’ Cyclopedia of Games and Sports. New York: Henry Holt. Chancer, Lynn. 1998. Reconcilable Differences: Confronting Beauty, Pornography, and the Future of Feminism. Berkeley: University of California Press. Chapin, John. 2000. “Third-Person Perception and Optimistic Bias among Urban-Minority ‘At-Risk’ Youth.” Communication Research 27, no. 1: 51–81. Chapman, P. D. 1988. Schools as Sorters: Lewis M. Terman, Applied Psychology and the Intelligence Testing Movement, 1890–1930. New York: New York University Press. Charry, Ellen T. 2001. “Will There Be a Protestant Center?” Theology Today (January): 453–458. Check, James. 1995. “Teenage Training: The Effects of Pornography on Adolescent Males.” Pp. 89–91 in The Price We Pay: The Case against Racist Speech, Hate Propaganda, and Pornography. Edited by Laura J. Lederer and Richard Delgado. New York: Hill and Wang. Chenery, Mary Faeth. 1991. I Am Somebody: The Messages and Methods of Organized Camping for Youth Development. Martinsville, IN: ACA. Chesney-Lind, Meda, and John Hagedorn, eds. 1999. Female Gangs in America. Chicago: Lake View Press. Child Welfare League of America. 1994. Kinship Care: A Natural Bridge. Washington, DC: Child Welfare League of America.
Children’s Defense Fund. 1988. Adolescent and Young Adult Fathers: Problems and Solutions. Washington, DC: Children’s Defense Fund. Children’s Hospital of Philadelphia. Annual Report. 1895. Chodorow, Nancy. 1979. The Reproduction of Mothering. Berkeley: University of California Press. Chotner, Deborah. 1992. American Naive Paintings. Washington, DC: National Gallery of Art. Christensen, Clark. 1995. “Prescribed Masturbation in Sex Therapy: A Critique.” Journal of Sex and Marital Therapy 21 (Summer): 87–99. Christie, A. A. 1997. “Using Email within a Classroom Based on Feminist Pedagogy.” Journal of Research on Computing in Education 30, no. 2 (December). ———. 2000. “Gender Differences in Computer Use in Adolescent Boys and Girls.” Unpublished raw data. Church, Robert, and Michael W. Sedlack. 1976. Education in the United States: An Interpretive History. New York: Free Press. Circumcision Information and Resource Pages. 2001. “United States Circumcision Incidence,” http://www.cirp.org/library/statistics/USA (accessed March 9, 2001). Clapp, David. 1822–1823. “Diary.” Worcester, MA: American Antiquarian Society. Clark, Cindy Dell. 1995. Flights of Fancy, Leaps of Faith: Children’s Myths in Contemporary America. Chicago: University of Chicago Press. Clark, Ronald W. 1983. Benjamin Franklin: A Biography. New York: Random House.
Clark-Hine, Darlene, and Earnestine Jenkins, eds. 1999. A Question of Manhood: A Reader in U.S. Black Men’s History and Masculinity. Bloomington: Indiana University Press. Cleaveland, Agnes Morley. 1977. No Life for a Lady. Lincoln: University of Nebraska Press. Clement, Priscilla Ferguson. 1997. Growing Pains: Children in the Industrial Age, 1850–1890. New York: Twayne Publishers. Clemmer, E. J., and E. W. Hayes. 1979. “Patient Cooperation in Wearing Orthodontic Headgear.” American Journal of Orthodontics 75, no. 5: 517–524. Clinton, Catherine. 1998. Civil War Stories. Athens: University of Georgia Press. Cloninger, Susan C. 2000. Theories of Personality: Understanding Persons. 3d ed. Upper Saddle River, NJ: Prentice-Hall. Clover, Carol J. 1992. Men, Women, and Chainsaws: Gender in the Modern Horror Film. Princeton: Princeton University Press. Cockrell, Dale, ed. 1989. Excelsior: Journals of the Hutchinson Family Singers, 1842–1846. New York: Pendragon Press. Coggins, Jack. 1967. Boys in the Revolution: Young Americans Tell Their Part in the War for Independence. Harrisburg, PA: Stackpole Books. Cohen, M. 1987. Juvenile Prostitution. Washington, DC: National Association of Counties Research. Cohen, Patricia Cline. 1999. The Murder of Helen Jewett. New York: Vintage. Cole, Phyllis. 1998. Mary Moody Emerson and the Origins of Transcendentalism: A Family History. New York: Oxford University Press.
Coleman, Annie Gilbert. 1996. “The Unbearable Whiteness of Skiing.” Pacific Historical Review 65 (November): 583–614.
Coontz, Stephanie. 1988. The Social Origins of Private Life: A History of American Families, 1600–1900. New York: Verso.
Coleman, E. 1989. “The Development of Male Prostitution Activity among Gay and Bisexual Adolescents.” Journal of Homosexuality 17, no. 2: 131–149.
Coontz, Stephanie. 1992. The Way We Never Were: American Families and the Nostalgia Trap. New York: Basic Books.
Coleman, James S. 1961. The Adolescent Society: The Social Life of the Teenager and Its Impact on America. New York: Free Press. Collins, Bud. 1989. My Life with the Pros. New York: Dutton. Collis, B. 1987. “Psycho-social Implications of Sex Differences in Attitudes towards Computers: Results of a Survey.” International Journal of Women’s Studies 8, no. 3: 207–213. Comics Scene 2000. 2000. New York: Starlog Group. Committee on Injury and Poison Prevention, American Academy of Pediatrics. 1997. Injury Prevention and Control for Children and Youth. Edited by Mark D. Widome. Elk Grove Village, IL: American Academy of Pediatrics. “Concerning Black Bass.” 1884. American Angler 5, April 19. Conde, Yvonne M. 1999. Operation Pedro Pan: The Untold Exodus of 14,048 Cuban Children. New York: Routledge. Conger, Rand D., and Martha A. Reuter. 1996. “Siblings, Parents and Peers: A Longitudinal Study of Social Influences in Adolescent Risks for Alcohol Use and Abuse.” Pp. 1–30 in Sibling Relationships: Their Causes and Consequences. Edited by G. H. Brody. Norwood, NJ: Ablex. Connell, R. W. 1987. Gender and Power. Stanford: Stanford University Press. ———. 1995. Masculinities. Berkeley: University of California Press.
Cormier, Robert. 1974. The Chocolate War. New York: Laureleaf. Cowan, Ruth Schwartz. 1979. “From Virginia Dare to Virginia Slims: Women and Technology in American Life.” Technology and Culture 20: 51–63. Cowart, M. F., R. W. Wilhelm, and R. E. Cowart. 1998. “Voices from Little Asia: ‘Blue Dragon’ Teens Reflect on Their Experience as Asian Americans.” Social Education 62, no. 7: 401–404. Coyne, Michael. 1997. The Crowded Prairie: American National Identity in the Hollywood Western. New York: St. Martin’s Press. Crews, Gordon A., and M. Reid Counts. 1997. The Evolution of School Disturbance in America: Colonial Times to Modern Day. Westport, CT: Praeger. Crime in the United States 1999. 2000. Washington, DC: Federal Bureau of Investigation, U.S. Department of Justice. Crimmins, Eileen. 1981. “The Changing Pattern of American Mortality Decline, 1940–1977.” Population Development Review 7: 229–254. Cross, Gary. 1997. Kids’ Stuff: Toys and the Changing World of American Childhood. Cambridge, MA: Harvard University Press. Croswell, T. R. 1898. “Amusements of Worcester Schoolchildren.” The Pedagogical Seminary 6: 314–371. Croteau, Jan Helling. 2000. Perform It! A Complete Guide to Young People’s Theatre. Portsmouth, NH: Heinemann.
Cruise, David, and Alison Griffiths. 1992. Net Worth: Exploding the Myths of Pro Hockey. Toronto: Penguin Books. Csikszentmihalyi, Mihaly, Kevin Rathunde, and Samuel Whalen. 1993. Talented Teenagers: The Roots of Success and Failure. New York: Cambridge University Press. Culin, Stewart. 1891. “Street Games of Boys in Brooklyn, N.Y.” Journal of American Folklore 4, no. 14: 221–237. Cummings, E. Mark, and Patrick Davies. 1994. Children and Marital Conflict: The Impact of Family Dispute and Resolution. New York: Guilford Press. Cummings, Scott, and Daniel Monti. 1993. Gangs. Albany: State University of New York Press. Cunliffe, Marcus. 1968. Soldiers and Civilians: The Martial Spirit in America, 1775–1865. Boston: Little, Brown.
Daniel, Thomas M., and Frederick C. Robins, eds. 1997. Polio. Rochester: University of Rochester Press. Daniels, Elizabeth. 1989. “The Children of Gettysburg.” American Heritage 40 (May–June): 97–107. Dann, John, ed. 1980. The Revolution Remembered: Eyewitness Accounts of the War for Independence. Chicago: University of Chicago Press. Davidson, Ben. 1976. The Skateboard Book. New York: Grosset and Dunlap. Davies, Richard G. 1983. “Of Arms and the Boy: A History of Culver Military Academy, 1894–1945.” Ph.D. diss., School of Education, Indiana University. Davis, Jack E. 1993. “Changing Places: Slave Movement in the South.” The Historian 55 (Summer): 657–676.
Cunningham, Hugh. 1991. The Children of the Poor: Representations of Childhood since the Seventeenth Century. Oxford: Basil Blackwell.
Davis, Joshua. 1819. Joshua Davis’ Report. Collections of the New England Historical and Genealogical Society.
Curry, G. David, and Scott H. Decker. 1998. Confronting Gangs: Crime and Community. Los Angeles: Roxbury.
Davis, O. L., Jr., et al. 1986. Looking at History: A Review of Major U.S. History Textbooks. Washington, DC: People for the American Way.
Dale, Edward Everett. 1959. Frontier Ways: Sketches of Life in the Old West. Austin: University of Texas Press. Danbom, David B. 1974. “The Young America Movement.” Journal of the Illinois State Historical Society 67: 294–306. ———. 1995. Born in the Country: A History of Rural America. Baltimore: Johns Hopkins University Press. Daniel, Clifton, ed. 1987. Chronicle of the 20th Century. New York: Prentice Hall. Daniels, Les. 1993. Marvel: Five Fabulous Decades of the World’s Greatest Comics. Introduction by Stan Lee. New York: Abrams.
Davis, Owen. 1914. “Why I Quit Writing Melodrama.” American Magazine (September): 28–31. ———. 1931. I’d Like to Do It Again. New York: Farrar and Rinehart. Dawson, John M., and Patrick A. Langan. 1994. Murder in Families. Washington, DC: U.S. Department of Justice, Bureau of Justice Statistics. De Charms, Richard, and Gerald H. Moeller. 1962. “Values Expressed in American Children’s Readers: 1800–1950.” Journal of Abnormal and Social Psychology 64: 136–142.
de Graaf, R., et al. 1994. “Male Prostitutes and Safe Sex: Different Settings, Different Risks.” AIDS Care 6, no. 3: 277–288.
Destrehan, Nicholas A. 1850. “Memoirs” in “Letter Book.” Historic New Orleans Collection, New Orleans, LA.
Dean, John I. 1992. “Scouting in America, 1910–1990.” Ed.D. diss., University of South Carolina.
Dewey, John. 1916. Democracy and Education. New York: Macmillan.
Deisher, R., G. Robinson, and D. Boyer. 1982. “The Adolescent Female and Male Prostitute.” Pediatric Annals 11, no. 10: 819–825. D’Emilio, John D., and Estelle B. Freedman. 1988. Intimate Matters: A History of Sexuality in America. New York: Harper and Row. Denning, Michael. 1987. Mechanic Accents: Dime Novels and Working-Class Culture in America. New York: Verso Press. Denniston, George C. 1999. Male and Female Circumcision: Medical, Legal and Ethical Considerations in Pediatric Practice. Norwell, MA: Kluwer Academic.
Dewey, John. 1929. Experience and Nature. La Salle: Open Court. Dexter, Franklin Bowditch. 1919. Ancient Town Records. Vol. 2: New Haven Town Records, 1662–1684. New Haven: New Haven Colony Historical Society. Deyle, Steven. 1995. “The Domestic Slave Trade in America.” Ph.D. diss., Columbia University. Dicey, Edward. 1863. Six Months in the Federal States. London: Macmillan. Reprint, Herbert Mitgang, ed., 1971. Spectator of America. Chicago: Quadrangle Books. DiIulio, John J., Jr. 1995. “The Coming of the Super-Predators.” The Weekly Standard (November 27): 23–27.
Derevensky, Jeffrey L., Rina Gupta, and Giuseppe Della Cioppa. 1996. “A Developmental Perspective of Gambling Behavior in Children and Adolescents.” Journal of Gambling Studies 12, no. 1: 49–66.
Dixon, Pahl, and Peter Dixon. 1977. Hot Skateboarding. New York: Warner Books.
Desjardins, Mary. 1999. “Luci and Desi: Sexuality, Ethnicity, and TV’s First Family.” In Television, History, and American Culture: Feminist Critical Essays. Edited by Mary Best Haralovich and Lauren Rabinovitz. Durham, NC: Duke University Press.
Doherty, William J. 1998. The Intentional Family. Reading, MA: Addison-Wesley.
Desrochers, Robert E., Jr. 1999. “Not Fade Away: The Narrative of Venture Smith, an African American in the Early Republic.” In A Question of Manhood: A Reader in U.S. Black Men’s History and Masculinity. Vol. 1. Edited by Darlene Clark-Hine and Earnestine Jenkins. Bloomington: Indiana University Press.
Don’t Give the Name a Bad Place: Types and Stereotypes in American Musical Theater, 1870–1900. 1978. New World Records.
Dobrin, Michael, and Philip E. Linhares. 1996. Hot Rods and Customs: The Men and Machines of California’s Car Culture. Oakland: Oakland Museum of California.
Donelson, Kenneth L., and Alleen Pace Nilsen. 1996. Literature for Today’s Young Adults. 5th ed. Reading, MA: Addison-Wesley.
Douglas, Mary. 1975. “Jokes.” Pp. 90–114 in Implicit Meanings: Essays in Anthropology by Mary Douglas. London: Routledge and Kegan Paul.
Douglas, Susan. 1999. Listening In: Radio and the American Imagination. New York: Times Books.
Douglass, Frederick. 1855. My Bondage and My Freedom. Reprint, New York: Dover Publications, 1969.
Dunn, Judy, and Shirley McGuire. 1992. “Sibling and Peer Relationships in Childhood.” Journal of Child Psychology and Psychiatry 33: 67–105.
Drury, Clifford Merril. 1974. “Growing Up on an Iowa Farm, 1897–1915.” Annals of Iowa 42, no. 3: 161–197.
Dyk, Walter. 1938. Son of Old Man Hat: A Navaho Autobiography. Lincoln: University of Nebraska Press.
———. 1998. American Youth Violence. New York: Oxford University Press.
Dyreson, Mark. 1998. Making the American Team: Sport, Culture, and the Olympic Experience. Urbana: University of Illinois Press.
Dryfoos, Joy G. 1990. Adolescents at Risk: Prevalence and Prevention. New York: Oxford University Press. Du Bois, W. E. B. 1920. Darkwater: Voices from within the Veil. New York: Harcourt, Brace, and Howe. ———. 1940. Dusk of Dawn: An Essay toward an Autobiography of a Race Concept. In W. E. B. Du Bois: Writings. Edited by Nathan Huggins. New York: Harcourt, Brace, and Company. Reprint, New York: Library of America, 1986. ———. 1968. Autobiography of W. E. B. Dubois: A Soliloquy on Viewing My Life from the Last Decade of Its First Century. New York: International Publishers. Dundes, Alan. 1987. “The Dead Baby Joke Cycle.” Pp. 3–14 in Cracking Jokes. Berkeley, CA: Ten Speed Press. Dunn, Judy. 1983. “Sibling Relationships in Early Childhood.” Child Development 54: 787–811. ———. 1996. “Brothers and Sisters in Middle Childhood and Early Adolescence: Continuity and Change in Individual Differences.” Pp. 31–46 in Sibling Relationships: Their Causes and Consequences. Edited by Gene H. Brody. Norwood, NJ: Ablex. Dunn, Judy, and C. Kendrick. 1981. “Social Behavior of Young Siblings in the Family Context: Differences Between
“Early American Impressions.” 1904. The American Field: The Sportsman’s Journal 61, no. 17 (April 23). Early Minstrel Show, The. 1998. New World Records. East, Patricia L., and Karen S. Rook. 1992. “Compensatory Patterns of Support among Children’s Peer Relationships: A Test Using School Friends, Nonschool Friends, and Siblings.” Developmental Psychology 28: 163–172. Eastman, Charles A. 1902. Indian Boyhood. 1902. Reprint, New York: Dover Publications, 1971. Edelson, Jeffery. L. 1999. “Children’s Witnessing of Adult Domestic Violence.” Journal of Interpersonal Violence 14, no. 8: 839–870. Eder, Donna. 1997. “Sexual Aggression within the School Culture.” In Gender, Equity, and Schooling: Policy and Practice. Edited by Barbara J. Bank and Peter M. Hall. New York: Garland. Eder, Donna, with Catherine Colleen Evans and Stephen Parker. 1995. School Talk: Gender and Adolescent Culture. New Brunswick, NJ: Rutgers University Press. Education Commission of the States Task Force on Education for Economic Growth.
Eells, Eleanor. 1986. Eleanor Eells' History of Organized Camping: The First Hundred Years. Martinsville, IN: ACA.
Einstein, Albert. 1950. Out of My Later Years. New York: Philosophical Library.
Eisenstadt, S. N. 1956. From Generation to Generation: Age Groups and Social Structure. New York: Free Press.
Ekirch, Arthur A. 1956. The Civilian and the Military. New York: Oxford University Press.
El-Bassel, N., R. F. Schilling, L. Gilbert, S. Faruque, K. L. Irwin, and B. R. Edlin. 2000. "Sex Trading and Psychological Distress in a Street-based Sample of Low Income Urban Men." Journal of Psychoactive Drugs 32, no. 2: 259–267.
Elder, Glen H., Jr. 1974. Children of the Great Depression: Social Change in Life Experience. Chicago: University of Chicago Press.
Elfenbein, Jessica Ivy. 1996. "To 'Fit Them for Their Fight with the World': The Baltimore YMCA and the Making of a Modern City, 1852–1932." Ph.D. diss., University of Delaware.
Elifson, K. W., J. Boles, and M. Sweat. 1993. "Risk Factors Associated with HIV Infection among Male Prostitutes." American Journal of Public Health 83, no. 1: 79–83.
Elliott, David L., and Arthur Woodward, eds. 1990. Textbooks and Schooling in the United States: Eighty-Ninth Yearbook of the National Society for the Study of Education, Pt. 1. Chicago: National Society for the Study of Education.
Ellis, Havelock. 1900. The Evolution of Modesty; the Phenomena of Sexual Periodicity; Auto-Eroticism. Philadelphia: E. A. Davis.
Ellis, Joseph J. 1993. Passionate Sage: The Character and Legacy of John Adams. New York: W. W. Norton.
Elson, Ruth Miller. 1964. Guardians of Tradition: American Schoolbooks of the Nineteenth Century. Lincoln: University of Nebraska Press.
Elster, Arthur B., and Michael E. Lamb, eds. 1986. Adolescent Fatherhood. Hillsdale, NJ: Erlbaum.
Emerson, Mary Moody. 1993. The Selected Letters of Mary Moody Emerson. Edited by Nancy Craig Simmons. Athens: University of Georgia Press.
Emerson, Ralph Waldo. 1844. "The Young American." The Dial (April).
———. 1903–1904. The Complete Works. 12 vols. Edited by Edward W. Emerson. Boston: Houghton Mifflin.
———. 1939. Letters of Ralph Waldo Emerson. 6 vols. Edited by Ralph L. Rusk. New York: Columbia University Press.
———. 1960–1978. Journals and Miscellaneous Notebooks of Ralph Waldo Emerson. Edited by William H. Gilman et al. Cambridge: Harvard University Press.
Emery, R. E. 1988. Marriage, Divorce, and Children's Adjustment. Newbury Park, CA: Sage.
Emery, R. E., E. M. Hetherington, and L. F. Dilalla. 1984. "Divorce, Children, and Social Policy." Pp. 189–266 in Child Development Research and Social Policy. Edited by H. W. Stevenson and A. E. Siegel. Chicago: University of Chicago Press.
Empey, LaMar T., and M. C. Stafford. 1991. American Delinquency: Its Meaning and Construction. 3d ed. Belmont, CA: Wadsworth.
Engelhardt, Tom. 1987. "Children's Television: The Strawberry Shortcake Strategy." In Watching Television: A Pantheon Guide to Popular Culture. Edited by Todd Gitlin. New York: Pantheon.
English Country Dances: From Playford's Dancing Master, 1651–1703. 1991. Saydisc.
Erdman, Harley. 1997. Staging the Jew: The Performance of an American Ethnicity, 1860–1920. New Brunswick: Rutgers University Press.
Erickson, Judith B. 1983. Directory of American Youth Organizations. Omaha, NE: Boys Town.
Escobar, Edward J. 1999. Race, Police, and the Making of a Political Identity: Relations between Chicanos and the Los Angeles Police Department, 1900–1945. Berkeley: University of California Press.
Espelage, Dorothy, and Christine Asidao. In press. "Conversations with Middle School Students about Bullying and Victimization: Should We Be Concerned?" Journal of Emotional Abuse.
Espelage, Dorothy L., and Melissa K. Holt. In press. "Bullying and Victimization during Early Adolescence: Peer Influences and Psychosocial Correlates." Journal of Emotional Abuse.
Espelage, Dorothy L., Kris Bosworth, and Thomas R. Simon. 2000. "Examining the Social Environment of Middle School Students Who Bully." Journal of Counseling and Development 78: 326–333.
Evans, Walter. 1972. "The All-American Boys: A Study of Boys' Sports Fiction." Journal of Popular Culture 6: 104–121.
Ewbank, Douglas. 1987. "History of Black Mortality and Health before 1940." Milbank Quarterly 65, supp. 1: 100–128.
Ewing, Elizabeth. 1977. History of Children's Costume. New York: Charles Scribner's Sons.
Fagan, Jeffrey, and Franklin E. Zimring, eds. 2000. The Changing Borders of Juvenile Justice: Transfer of Adolescents to the Criminal Court. Chicago: University of Chicago Press.
Fagot, Beverly I., Katherine C. Pears, Deborah M. Capaldi, Lynn Crosby, and Craig S. Leve. 1998. "Becoming an Adolescent Father: Precursors and Parenting." Developmental Psychology 34: 1209–1219.
Faludi, Susan. 1999. "The Betrayal of the American Man." Newsweek (September 13): 49–58.
Fantuzzo, John W., and Carrol U. Lindquist. 1989. "The Effects of Observing Conjugal Violence on Children: A Review and Analysis of Research Methodology." Journal of Family Violence 4, no. 1: 77–93.
Faragher, John Mack. 1979. Women and Men on the Overland Trail. New Haven: Yale University Press.
Farish, Hunter Dickinson, ed. 1957. Journal and Letters of Philip Vickers Fithian, 1773–1774: A Plantation Tutor of the Old Dominion. Williamsburg, VA: Colonial Williamsburg.
Farmer, Silas. 1889. The History of Detroit and Michigan. Detroit: Silas Farmer.
Fass, Paula S. 1977. The Damned and the Beautiful: American Youth in the 1920s. New York: Oxford University Press.
Fass, Paula S., and Mary Ann Mason, eds. 2000. Childhood in America. New York: New York University Press.
Federal Writers Project, Interviews with Former Slaves. 1930s. Chapel Hill: Southern Historical Collection, University of North Carolina.
Feinstein, John. 1991. Hard Courts. New York: Villard Books.
Feld, Barry C. 1999. Bad Kids: Race and the Transformation of the Juvenile Court. New York: Oxford University Press.
Feldman, Shirley, and Glen Elliot, eds. 1990. At the Threshold: The Developing Adolescent. Cambridge, MA: Harvard University Press.
Ferling, John E. 1992. John Adams: A Life. Knoxville: University of Tennessee Press.
———. 1994. John Adams: A Bibliography. Westport, CT: Greenwood Press.
Ferretti, Fred. 1975. The Great American Book of Sidewalk, Stoop, Dirt, Curb, and Alley Games. New York: Workman.
Fetto, John. 1999. "Happy Campers." American Demographics 21, no. 7 (July): 46–47.
Findlay, John M. 1992. Magic Lands. Seattle: University of Washington Press.
Fine, Gary Alan. 1987. With the Boys: Little League Baseball and Preadolescent Culture. Chicago: University of Chicago Press.
Fine, M. 1991. Framing Dropouts: Notes on the Politics of an Urban Public High School. Albany: State University of New York Press.
Fingerhut, Lois, and Joel Kleinman. 1989. Trends and Current Status in Childhood Mortality. Washington, DC: National Center for Health Statistics.
Finkelstein, Barbara. 1989. Governing the Young: Teacher Behavior in Popular Primary Schools in Nineteenth-Century United States. London: Falmer Press.
———. 2000. "A Crucible of Contradictions: Historical Roots of Violence against Children in the United States." History of Education Quarterly 40, no. 1: 1–22.
Finn, William J. 1939. The Art of the Choral Conductor. Evanston, IL: Summy-Birchard Publishing.
Firearm Injuries and Fatalities. 2000. Atlanta: National Center for Injury Prevention and Control, Centers for Disease Control and Prevention.
Fischer, David Hackett. 1989. Albion's Seed: Four British Folkways in America. New York: Oxford University Press.
Fisher, B., D. K. Weisberg, and T. Marotta. 1982. Report on Adolescent Male Prostitution. San Francisco: Urban and Rural Systems Associates.
Fiske, George W. 1912. Boy Life and Self-Government. New York: Association Press.
FitzGerald, Frances. 1979. America Revised: History Schoolbooks in the Twentieth Century. Boston: Little, Brown.
Flanagan, D. P., and J. L. Genshaft, eds. 1997. "Issues in the Use and Interpretation of Intelligence Testing in Schools." School Psychology Review 26: 2.
Flanagan, D. P., J. Genshaft, and P. L. Harrison, eds. 1997. Contemporary Intellectual Assessment: Theories, Tests and Issues. New York: Guilford.
Florey, Francesca A., and Avery M. Guest. 1988. "Coming of Age among U.S. Farm Boys in the Late 1800s: Occupational and Residential Choices." Journal of Family History 13, no. 2: 233–249.
Flynt, Josiah. 1972. Tramping with Tramps. Montclair, NJ: Patterson Smith.
For Youth by Youth. 2001. "About 4-H," http://www.4-H.org (accessed May 14, 2001).
Forbush, William B. 1907. The Boy Problem. 3d ed. Boston: Pilgrim Press.
Ford, Clellan S., and Frank A. Beach. 1951. Patterns of Sexual Behavior. New York: Harper and Brothers.
Ford, Larry. 2001. "Boychoir—Past, Present and Future," http://www.boychoirs.org (accessed March 11, 2001).
———. 2001. "Donald Collup Singing Alleluja by Wolfgang Amadeus Mozart," http://www.boychoirs.org/collup.html (accessed March 11, 2001).
Ford, Larry, Gene Bitner, and Lindsay Emery. 2001. "The World of Treble Voices," http://216.147.109.215/contents.html (accessed March 11, 2001).
Ford, Paul Leicester, ed. 1897. The "New England Primer": A History of Its Origin and Development. New York: Dodd, Mead.
Ford, Paul L., ed. 1899. The New England Primer. New York: Dodd, Mead.
Formanek-Brunell, Miriam. 1993. Made to Play House: Dolls and the Commercialization of American Girlhood 1830–1930. New Haven: Yale University Press.
Forrest, Suzanne. 1998. The Preservation of the Village: New Mexico's Hispanics and the New Deal. Albuquerque: University of New Mexico Press.
Foucault, Michel. 1980. The History of Sexuality. Vol. 1: An Introduction. New York: Vintage.
Fox, Ebenezer. 1838. The Revolutionary Adventures of Ebenezer Fox. Boston: Monroe and Francis.
Frank, Michael L., and Crystal Smith. 1989. "Illusion of Control and Gambling in Children." Journal of Gambling Behavior 5, no. 2: 127–136.
Franklin, Barry. 1987. Learning Disabilities: Dissenting Essays. New York: Falmer Press.
Franklin, Benjamin. 1959. Autobiography and Selected Writings. New York: Holt, Rinehart and Winston.
———. 1959. Autobiography. New York: Holt, Rinehart, and Winston.
Franklin Fire Company. 1856. "Minutes." Missouri Historical Society, St. Louis Volunteer Fireman Collection.
Frazer, Sir James. 1915. The Golden Bough: A Study in Magic and Religion. London: Macmillan.
Freeman, Evelyn B. 1985. "When Children Face Divorce: Issues and Implications of Research." Childhood Education 62, no. 2: 130–136.
Freeman, Norman. 1980. Strategies of Representation in Young Children. London: Academic Press.
Freeman, Norman, and Maureen V. Cox. 1985. Visual Order. Cambridge: Cambridge University Press.
Frye, Alexis E. 1902. Grammar School Geography. Boston: Ginn.
Fuller, Wayne E. 1982. The Old Country School: The Story of Rural Education in the Midwest. Chicago: University of Chicago Press.
Furman, Wyndol, and Duane Buhrmester. 1985. "Children's Perceptions of the Qualities of Sibling Relationships." Child Development 56: 448–461.
———. 1992. "Age and Sex Differences in Perceptions of Networks of Personal Relationships." Child Development 63: 103–115.
Gagne, Luc. 1995. Moving Beauty. Montreal, Quebec: Montreal Museum of Fine Art.
Gagner, Constance T., Teresa M. Cooney, and Kathleen Thiede Call. 1998. "The Effects of Family Characteristics and Time Use on Teenage Girls' and Boys' Household Labor." Princeton University Center for Research on Child Well-being. Working Paper Series no. 98-1.
Gairdner, Douglas. 1949. "The Fate of the Foreskin." British Medical Journal 2: 1433–1437.
Galarza, Ernesto. 1971. Barrio Boy. Notre Dame: University of Notre Dame Press.
Galenson, David. 1981. White Servitude in Colonial America: An Economic Analysis. Cambridge: Cambridge University Press.
Galenson, David W. 1993. "The Impact of Economic and Technological Change on the Careers of American Men Tennis Players, 1960–1991." Journal of Sport History 20, no. 2 (Summer): 127–150.
Gall, Timothy, and Daniel Lucas, eds. 1996. Statistics on Alcohol, Drug and Tobacco Use. Detroit: Thompson.
Gallo, Agatha M., and Kathleen A. Knafl. 1993. "Siblings of Children with Chronic Illnesses: A Categorical and Noncategorical Look at Selected Literature." Pp. 215–234 in The Effects of Mental Retardation, Disability, and Illness on Sibling Relationships: Research Issues and Challenges. Edited by Zolinda Stoneman and Phyllis Waldman Burman. Baltimore: Paul H. Brookes Publishing.
Garbarino, James. 1999. Lost Boys: Why Our Sons Turn Violent and How We Can Save Them. New York: Free Press.
Gardella, Peter. 1985. Innocent Ecstasy: How Christianity Gave America an Ethic of Sexual Pleasure. New York: Oxford University Press.
Gardner, Howard. 1980. Artful Scribbles: The Significance of Children's Drawings. New York: Basic Books.
Garland, Hamlin. 1899. Boy Life on the Prairie. New York: Macmillan.
———. 1926. Boy Life on the Prairie. Boston: Allyn and Bacon.
Garlits, Don. 1990. The Autobiography of "Big Daddy" Don Garlits. Ocala, FL: Museum of Drag Racing.
Gault, Frank, and Claire Gault. 1977. The Harlem Globetrotters. New York: Walker.
GeekComix.com. 2000. "A Brief History of Home Video Games," http://www.geekcomix.com/vgh/main.shtml (accessed December 27, 2000).
Gems, Gerald R. 1996. "The Prep Bowl: Football and Religious Acculturation in Chicago, 1927–1963." Journal of Sport History 23, no. 3: 284–302.
———. 1997. Windy City Wars: Labor, Leisure, and Sport in the Making of Chicago. Lanham, MD: Scarecrow Press.
———. 2000. For Pride, Patriarchy, and Profit: Football and the Incorporation of American Cultural Values. Metuchen, NJ: Scarecrow Press.
Gems, Gerald, ed. 1995. Sports in North America: A Documentary History. Vol. 5, Sports Organized, 1880–1900. Gulf Breeze, FL: Academic International Press.
"General Social Survey." 1999. http://www.icpsr.umich.edu/GSS99/index.html.
Gerould, Daniel. 1983. American Melodrama. New York: Performing Arts Journal.
Giamatti, A. Bartlett. 1981. "Power, Politics, and a Sense of History." Pp. 166–179 in The University and the Public Interest. New York: Atheneum.
Gignilliat, Leigh R. 1916. Arms and the Boy: Military Training in Schools. Indianapolis: Bobbs-Merrill.
Gilbert, Albert C., with Marshall McClintock. 1953. The Man Who Lives in Paradise. New York: Rinehart.
Gilbert, Douglas. 1940. American Vaudeville: Its Life and Times. New York: McGraw-Hill.
Gilbert, James B. 1986. A Cycle of Outrage: America's Reaction to the Juvenile Delinquent in the 1950s. New York: Oxford University Press.
Gillham, Bill, and James A. Thomson, eds. 1996. Child Safety: Problem and Prevention from Preschool to Adolescence: A Handbook for Professionals. New York: Routledge.
Gillis, John R. 1974. Youth and History: Tradition and Change in European Age Relations, 1770–Present. New York: Academic Press.
Gilmore, D. 1990. Manhood in the Making: Cultural Concepts of Masculinity. New Haven: Yale University Press.
Gilmore, William J. 1989. Reading Becomes a Necessity of Life: Material and Cultural Life in Rural New England, 1780–1835. Knoxville: University of Tennessee Press.
Girl Scouts. 2000. "About Us," http://www.girlscouts.org (accessed May 14, 2001).
Girls and Boys Town. 2000. "About Boys Town, History," http://www.boystown.org/home.htm (accessed September 5, 2000).
Giroux, Henry A. 1996. Fugitive Cultures: Race, Violence and Youth. New York: Routledge.
Gittens, Joan. 1994. Poor Relations: The Children of the State in Illinois, 1818–1990. Urbana: University of Illinois Press.
Glassner, Barry. 1995. "Men and Muscles." In Men's Lives. Edited by Michael Kimmel and Michael Messner. Boston: Allyn and Bacon.
Glenn, Myra C. 1984. Campaigns against Corporal Punishment: Prisoners, Sailors, Women, and Children in Antebellum America. Albany: State University of New York Press.
Goffman, Erving. 1961. Asylums: Essays on the Social Situation of Mental Patients and Other Inmates. Garden City, NY: Anchor/Doubleday.
———. 1961. Encounters: Two Studies in the Sociology of Interaction. Indianapolis: Bobbs-Merrill.
———. 1963. Stigma. Englewood Cliffs, NJ: Prentice-Hall.
———. 1967. Interaction Ritual: Essays on Face-to-Face Behavior. Garden City, NY: Anchor Books.
Goldstein, Arnold P. 1990. Delinquents on Delinquency. Champaign, IL: Research Press.
———. 1991. Delinquent Gangs: A Psychological Perspective. Champaign, IL: Research Press.
Goldstein, Ruth M., and Charlotte Zornow. 1980. The Screen Image of Youth: Movies about Children and Adolescents. Metuchen, NJ: Scarecrow Press.
Gollaher, David. 1994. "From Ritual to Science: The Medical Transformation of Circumcision in America." Journal of Social History 28, no. 1: 5–36.
Golomb, Claire. 1992. The Creation of a Pictorial World. Berkeley: University of California Press.
Gonzalez, Gilbert G. 1990. Chicano Education in the Era of Segregation. Philadelphia: Balch Institute Press.
Goodman, Cary. 1979. Choosing Sides: Playground and Street Life on the Lower East Side. New York: Schocken Books.
Goodman, Jules Eckert. 1908. "The Lure of Melodrama." Bohemian Magazine (February): 180–191.
Goodman, Nan. 1998. Shifting the Blame: Literature, Law and the Theory of Accidents in Nineteenth-Century America. Princeton: Princeton University Press.
Gordon, Ian. 1998. Comic Strips and Consumer Culture, 1890–1945. Washington, DC: Smithsonian Institution Press.
Gordon, Linda. 1988. Heroes of Their Own Lives: The Politics and History of Family Violence, Boston, 1880–1960. New York: Viking.
Gorn, Elliott J. 1986. The Manly Art: Bare-Knuckle Prize Fighting in America. Ithaca: Cornell University Press.
Gorn, Elliott J., ed. 1998. The McGuffey Readers: Selections from the 1879 Edition. Bedford Series in History and Culture. Boston: Bedford/St. Martin's Press.
Goulart, Ron. 2000. Comic Book Culture: An Illustrated History. Portland, OR: Collectors Press.
Goulart, Ron, ed. 1990. Encyclopedia of American Comics. New York: Facts on File.
Gould, Stephen J. 1981. The Mismeasure of Man. New York: W. W. Norton.
Graber, Julia A., and Jeanne Brooks-Gunn. 1996. "Transitions and Turning Points: Navigating the Passage from Childhood through Adolescence." Developmental Psychology 32: 768–776.
Graber, Julia A., Anne C. Petersen, and Jeanne Brooks-Gunn. 1996. "Pubertal Processes: Methods, Measures, and Models." Pp. 23–53 in Transitions through Adolescence: Interpersonal Domains and Context. Edited by Julia A. Graber, Jeanne Brooks-Gunn, and Anne C. Petersen. Mahwah, NJ: Erlbaum.
Grace, Catherine O'Neil. 1998. "Kids and Money: Valuable Lessons." The Washington Post, June 23, Z22.
Graebner, William. 1988. "Outlawing Teenage Populism: The Campaign against Secret Societies in the American High School, 1900–1960." Journal of American History 74: 411–435.
Graetz, J. M. 1981. "The Origin of Spacewar." Creative Computing (August).
Grant, Barry Keith, ed. 1996. The Dread of Difference. Austin: University of Texas Press.
Grant, Julia. 1998. Raising Baby by the Book: The Education of American Mothers. New Haven: Yale University Press.
Gray, Asa. 1875. Botany for Young People, Part II: How Plants Behave. New York: Ivison, Blakeman, and Taylor.
Gray, Herman. 1995. Watching Race: Television and the Struggle for Blackness. Minneapolis: University of Minnesota Press.
Gray, Kenneth C., and Edwin L. Herr. 1995. Other Ways to Win: Creating Alternatives for High School Graduates. Thousand Oaks, CA: Corwin Press.
Greeley, Horace. 1868. Recollections of a Busy Life. New York and Boston: H. A. Brown and J. B. Ford.
Green, Abel, and Joe Laurie, Jr. 1951. Show Biz: From Vaude to Video. New York: Henry Holt.
Green, Harvey. 1988. Fit for America: Health, Fitness, Sport, and American Society. Baltimore: Johns Hopkins University Press.
Greenberg, Amy S. 1998. Cause for Alarm: The Volunteer Fire Department in the Nineteenth-Century City. Princeton: Princeton University Press.
Greenberg, Blu. 1985. How to Run a Traditional Jewish Household. New York: Simon and Schuster.
Greenberger, Ellen, and Lawrence Steinberg. 1986. When Teenagers Work: The Psychological and Social Costs of Adolescent Employment. New York: Basic Books.
Greenfeld, Lawrence A. 1996. Child Victimizers: Violent Offenders and Their Victims. Washington, DC: Office of Juvenile Justice and Delinquency Prevention.
Greenfield, Laurence. 1991. "Toys, Children, and the Toy Industry in a Culture of Consumption, 1890–1991." Ph.D. diss., Ohio State University.
Greenfield, Patricia Marks. 1984. Mind and Media: The Effects of Television, Video Games, and Computers. Cambridge: Harvard University Press.
Greenman, Jeremiah. 1978. Diary of a Common Soldier in the American Revolution, 1775–1783: An Annotated Edition of the Military Journal of Jeremiah Greenman. Edited by Robert C. Bray and Paul E. Bushnell. Dekalb: Northern Illinois University Press.
Greif, Richard S. 1997. Big Impact: Big Brothers Making a Difference. Boston: New Hat.
Greven, Philip J., Jr. 1977. The Protestant Temperament: Patterns of Child-Rearing, Religious Experience, and Self in Early America. New York: Alfred A. Knopf.
———. 1990. Spare the Child: The Religious Roots of Punishment and the Psychological Impact of Physical Abuse. New York: Vintage.
Grider, Sylvia Ann. 1996. "Conservation and Dynamism in the Contemporary Celebration of Halloween: Institutionalization, Commercialization, Gentrification." Western Folklore 53, no. 1: 3–15.
Grier, Katherine C. 1999. "Childhood Socialization and Companion Animals: United States, 1820–1870." Society and Animals 7, no. 2: 95–120.
Grimsley, Will. 1971. Tennis: Its History, People and Events. Englewood Cliffs, NJ: Prentice-Hall.
Griswold, Robert L. 1997. "Generative Fathering: A Historical Perspective." Pp. 71–86 in Generative Fathering: Beyond Deficit Perspectives. Edited by Alan J. Hawkins and David C. Dollahite. Thousand Oaks, CA: Sage.
Grossberg, Michael. 1985. Governing the Hearth: Law and the Family in Nineteenth-Century America. Chapel Hill: University of North Carolina Press.
Grove, Robert D., and Alice M. Hetzel. 1968. Vital Statistics Rates in the United States, 1940–1960. Washington, DC: National Center for Health Statistics.
Grubb, W. Norton. 1996. "The New Vocationalism: What It Is, What It Could Be." Phi Delta Kappan 77, no. 8: 533–546.
Grunbaum, Jo Anne, Laura Kann, Steven A. Kinchen, James G. Ross, Vani R. Gowda, Janet L. Collins, and Lloyd J. Kolbe. 1999. "Youth Risk Behavior Surveillance—National Alternative High School Youth Risk Behavior Survey, United States, 1998." Centers for Disease Control and Prevention: MMWR Surveillance Summaries 48, no. SS-7 (October 29).
Gruneau, Richard, and David Whitson. 1993. Hockey Night in Canada: Sport, Identities, and Cultural Politics. Toronto: Garamond Press.
Gudmundsen, Jinny. 2000. "Strategy for Parents: Use Ratings, Be Involved. Choosing Titles by the Letters." Los Angeles Times, October 26, T8.
Guimond, James. 1991. American Photography and the American Dream. Chapel Hill: University of North Carolina Press.
Gullotta, Thomas P., Gerald R. Adams, and Raymond Montemayor, eds. 1998. Delinquent Violent Youth: Theory and Interventions. Vol. 9, Advances in Adolescent Development. Thousand Oaks, CA: Sage.
Gustav-Wrathall, John Donald. 1998. Take the Young Stranger by the Hand: Same-Sex Relations and the YMCA. Chicago: University of Chicago Press.
Guthrie, J. 2000. "Not Geeks, Gangsters at Schools." San Francisco Examiner, May 14, C1, C5.
Gutman, Judith Mara. 1967. Lewis W. Hine and the American Social Conscience. New York: Walker.
———. 1974. Lewis Hine 1874–1940: Two Perspectives. New York: Grossman.
Haas, Lisbeth. 1995. Conquests and Historical Identities in California, 1769–1936. Berkeley: University of California Press.
Hacsi, Timothy A. 1997. Second Home: Orphan Asylums and Poor Families in America. Cambridge, MA: Harvard University Press.
Hahamovitch, Cindy. 1997. The Fruits of Their Labor: Atlantic Coast Farmworkers and the Making of Migrant Poverty, 1870–1945. Chapel Hill: University of North Carolina Press.
Hall, Donald E., ed. 1994. Muscular Christianity: Embodying the Victorian Age. Cambridge, UK: Cambridge University Press.
Hall, G. Stanley. 1904. Adolescence: Its Psychology, and Its Relations to Physiology, Anthropology, Sociology, Sex, Crime, Religion, and Education. 2 vols. New York: D. Appleton.
Hallock, Charles. 1873. The Fishing Tourist: Angler's Guide and Reference Book. New York: Harper and Bros.
Halsey, Rosalie V. 1911. Forgotten Books of the American Nursery: A History of the Development of the American Story-Book. Boston: Charles Goodspeed. Reprint, Detroit: Singing Tree Press, 1969.
Hamburg, David. 1992. Today's Children: Creating a Future for a Generation in Crisis. New York: Times Books, Random House.
Hamm, Charles. 1979. Yesterdays: Popular Song in America. New York: W. W. Norton.
Hammonds, Evelynn Maxine. 1999. Childhood's Deadly Scourge: The Campaign to Control Diphtheria in New York City, 1880–1930. Baltimore: Johns Hopkins University Press.
Hampsten, Elizabeth. 1991. Settlers' Children: Growing Up on the Great Plains. Norman: University of Oklahoma Press.
Handbook of Private Schools, The. 1926. 17th ed. Boston: Porter Sargent.
Handel, Gerald. 1985. "Central Issues in the Construction of Sibling Relationships." Pp. 493–523 in The Psychosocial Interior of the Family. Edited by Gerald Handel. New York: Aldine de Gruyter.
Hanson, Glen, and Peter Venturelli. 1995. Drugs and Society. 4th ed. Boston: Jones and Bartlett.
Haralovich, Mary Beth, and Lauren Rabinovitz, eds. 1999. Television, History, and American Culture: Feminist Critical Essays. Durham, NC: Duke University Press.
Hareven, Tamara K. 1982. Family Time and Industrial Time: The Relationship between the Family and Work in a New England Industrial Community. Cambridge, UK: Cambridge University Press.
Hardy, Stephen. 1983. How Boston Played: Sport, Recreation and Community, 1865–1915. Boston: Northeastern University Press.
Hare, E. H. 1962. "Masturbatory Insanity: The History of an Idea." The Journal of Mental Science 108 (January): 2–25.
Harlan, Louis R. 1972. Booker T. Washington: The Making of a Black Leader, 1856–1901. New York: Oxford University Press.
Hatch, Nathan. 1989. The Democratization of American Christianity. New Haven, CT: Yale University Press.
Haven, Alice Bradley [Cousin Alice]. 1853. "All's Not Gold That Glitters"; or, the Young Californian. New York: Appleton and Company.
Hawes, Joseph M. 1971. Children in Urban Society: Juvenile Delinquency in Nineteenth-Century America. New York: Oxford University Press.
———. 1997. Children between the Wars: American Childhood, 1920–1940. New York: Twayne Publishers.
Hawes, Joseph, and N. Ray Hiner, eds. 1985. American Childhood: A Research Guide and Historical Handbook. Westport, CT: Greenwood Press.
Hawley, Frank, with Mark Smith. 1989. Drag Racing: Drive to Win. Osceola, WI: Motorbooks International.
Hawley, Richard. 1991. "About Boys' Schools: A Progressive Case for an Ancient Form." Teachers College Record 92, no. 3.
Haywood, C. Robert, and Sandra Jarvis. 1992. A Funnie Place, No Fences: Teenagers' Views of Kansas, 1867–1900. Lawrence: Division of Continuing Education, University of Kansas.
Hazen, Margaret Hindle, and Robert M. Hazen. 1992. Keepers of the Flame: The Role of Fire in American Culture, 1775–1925. Princeton: Princeton University Press.
Heimert, Alan, and Perry Miller. 1967. The Great Awakening. Indianapolis: Bobbs-Merrill.
Henderson, Robert W. 1947. Ball, Bat and Bishop: The Origins of Ball Games. New York: Rockport Press.
Hendler, Glenn. 1996. "Pandering in the Public Sphere: Masculinity and the Market in Horatio Alger." American Quarterly 48, no. 3 (September): 414–438.
Herdt, Gilbert. 1987. The Sambia: Ritual and Gender in New Guinea. New York: Holt, Rinehart, and Winston.
Herdt, Gilbert, and Andrew Boxer. 1993. Children of Horizons: How Gay and Lesbian Teens Are Leading a New Way out of the Closet. Boston: Beacon.
Herman, Daniel Justin. 2001. Hunting and the American Imagination. Washington, DC: Smithsonian Institution Press.
Herz, J. C. 1997. Joystick Nation: How Computer Games Ate Our Quarters, Won Our Hearts and Rewired Our Minds. New York: Little, Brown.
Herzog, E., and C. Sudia. 1973. "Children in Fatherless Families." In Review of Child Development Research. Vol. 3, Child Development and Child Policy. Edited by B. M. Caldwell and H. N. Riccuiti. Chicago: University of Chicago Press.
Hess, Albert G., and Priscilla F. Clement, eds. 1993. History of Juvenile Delinquency: A Collection of Essays on Crime Committed by Young Offenders, in History and in Selected Countries. Vol. 2. Aalen, Germany: Scientia Verlag.
Hess, R. D., and I. T. Miura. 1985. "Gender Differences in Enrollment in Computer Camps and Classes." Sex Roles 13: 193–203.
Hetherington, E. M. 1979. "Divorce: A Child's Perspective." American Psychologist 34: 851–858.
———. 1991. "Presidential Address: Families, Lies, and Videotapes." Journal of Research on Adolescence 1, no. 4: 323–348.
Hewes, Minna, and Gordon Hewes. 1952. "Indian Life and Customs at Mission San Luis Rey: A Record of California Indian Life Written by Pablo Tac, an Indian Neophyte." The Americas 9: 87–106.
Hewitt, Barnard. 1959. Theatre U.S.A.: 1665–1957. New York: McGraw-Hill.
Hewitt, Karen, and Louis Roomet. 1979. Educational Toys in America: 1800 to the Present. Burlington, VT: Robert Hull Fleming Museum.
Heyrman, Christine. 1997. Southern Cross: The Beginnings of the Bible Belt. Chapel Hill: University of North Carolina Press.
Hicks, David. 1996. "The Strange Fate of the American Boarding School." The American Scholar 65, no. 4 (Autumn).
Hilger, M. Inez. 1992. Chippewa Child Life and Its Cultural Background. 1951. Reprint, St. Paul: Minnesota Historical Society Press.
Hill, David S. 1920. Introduction to Vocational Education: A Statement of Facts and Principles Related to the Vocational Aspects of Education below College Grade. New York: Macmillan.
Hill, John. 1983. "Early Adolescence: A Research Agenda." Journal of Early Adolescence 3: 1–21.
Hine, Lewis. 1915. "The High Cost of Child Labor." Brochure. Washington, DC: Library of Congress.
Hiner, N. Ray, and Joseph M. Hawes, eds. 1985. Growing Up in America: Children in Historical Perspective. Urbana: University of Illinois Press.
Hinton, S. E. 1967. The Outsiders. Boston: G. K. Hall.
His Majestie's Clerks. 1996. Goostly Psalmes: Anglo American Psalmody, 1550–1800. Harmonia Mundi.
Hoben, Allan. 1913. The Minister and the Boy: A Handbook for Churchmen Engaged in Boys' Work. Chicago: University of Chicago Press.
Hoch-Deutsches Lutherisches ABC und Namen Büchlein für Kinder. 1819. Germantown, PA: W. Billmeyer.
Hochschild, Arlie R. 1997. The Time Bind. New York: Metropolitan Books.
Hodges, W. F., and B. L. Bloom. 1984. "Parent's Reports of Children's Adjustment to Marital Separation: A Longitudinal Study." Journal of Divorce 8, no. 1: 33–50.
Hodgson, Lynne. 1992. "Adult Grandchildren and Their Grandparents: The Enduring Bond." International Journal of Aging and Human Development 34: 209–225.
Hogan, Dennis, David Eggebeen, and Sean Snaith. 1996. "The Well-Being of Aging Americans with Very Old Parents." Pp. 327–346 in Aging and Generational Relations over the Life Course. Edited by Tamara Hareven. New York: Aldine de Gruyter.
Hohman, Leslie B., and Bertram Schaffner. 1947. "The Sex Lives of Unmarried Men." American Journal of Sociology 52 (May): 501–507.
Holland, Kenneth, and Frank Ernest Hill. 1942. Youth in the CCC. Washington, DC: American Council on Education.
Hollander, Zander, ed. 1979. The Modern Encyclopedia of Basketball. New York: Doubleday.
Holliday, J. S. 1999. Rush for Riches: Gold Fever and the Making of California. Berkeley: University of California Press.
Hollinger, Joan H., et al., eds. 1989. Adoption in Law and Practice. New York: Mathew Bender.
Holloran, Peter. 1989. Boston's Wayward Children: Social Services for Homeless Children, 1830–1930. Rutherford, NJ: Fairleigh Dickinson University Press.
Holt, Marilyn. 1992. The Orphan Trains: Placing Out in America. Lincoln: University of Nebraska Press.
Holzman, Robert S. 1956. The Romance of Firefighting. New York: Bonanza Books.
Homicide Trends in the United States. 2001. Washington, DC: Bureau of Justice Statistics.
Hoobler, Dorothy, and Thomas Hoobler. 1994. The Chinese American Family Album. New York: Oxford University Press.
Hopkins, C. Howard. 1951. History of the Y.M.C.A. in North America. New York: Association Press.
Horatio Alger Association of Distinguished Americans, http://www.horatioalger.com.
Horn, James. 1979. "Servant Emigration to the Chesapeake in the Seventeenth Century." Pp. 51–95 in The Chesapeake in the Seventeenth Century. Edited by Thad W. Tate and David L. Ammerman. New York: W. W. Norton.
Horn, Maurice, ed. 1977. The World Encyclopedia of Comics. New York: Avon.
Horton, James, and Lois E. Horton, consulting eds. 1995. A History of the African American People. New York: Smithmark Publishers.
Howard-Pitney, Beth, Teresa D. LaFromboise, Mike Basil, Benedette September, and Mike Johnson. 1992. "Psychological and Social Indicators of Suicide Ideation and Suicide Attempts in Zuni Adolescents." Journal of Consulting and Clinical Psychology 60: 473–476.
Howell, James C. 1998. "Youth Gangs: An Overview." Juvenile Justice Bulletin. Washington, DC: U.S. Department of Justice, Office of Juvenile Justice and Delinquency Prevention.
Howell, Susan H., Pedro R. Portes, and Joseph H. Brown. 1997. "Gender and Age Differences in Child Adjustment to Parental Separation." Journal of Divorce and Remarriage 27, nos. 3–4: 141–158.
Huck, Charlotte. 1997. Children's Literature in the Elementary School. 6th ed. Boston: McGraw-Hill.
Huey, Wayne C. 1987. "Counseling Teenage Fathers: The 'Maximizing a Life Experience' (MALE) Group." School Counselor 35: 40–47.
Huff, C. Ronald, ed. 1990. Gangs in America. 1st ed. Newbury Park, CA: Sage.
Huffstutter, P. J., and Claudia Eller. 2000. "GameWorks to Restrict Youngsters at Arcades." Los Angeles Times, October 6, C1.
Hughes, Fergus P. 1999. Children, Play, and Development. Boston: Allyn and Bacon.
Huizinga, Johan. 1955. Homo Ludens: A Study of the Play Element in Culture. Boston: Beacon Press.
Hulse, Diane. 1997. Brad and Cory: A Study of Middle School Boys. Cleveland: Cleveland's University School Press.
Hunt, Alan. 1998. "The Great Masturbation Panic and the Discourses of Moral Regulation in Nineteenth- and Early Twentieth-Century Britain." Journal of the History of Sexuality 8 (April): 575–615.
Hunt, Lynn, ed. 1993. The Invention of Pornography: Obscenity and the Origins of Modernity, 1500–1800. New York: Zone Books.
Hunter, William. 2000. "The Dot Eaters: Videogame History 101," http://www.emuunlim.com/doteaters/index.htm (accessed December 27, 2000).
Hupp, Father Robert P. 1985. The New Boys Town. New York: Newcomen Society of the United States.
Hurtado, Albert. 1988. Indian Survival on the California Frontier. New Haven: Yale University Press.
Hutchinson, Edward P. 1956. Immigrants and Their Children, 1850–1950. New York: Wiley.
Hutchinson Family's Book of Words. 1851. New York: Baker, Godwin and Co., Steam Printers.
Hyman, Irwin A. 1990. Reading, Writing and the Hickory Stick: The Appalling Story of Physical and Psychological Abuse in American Schools. Lexington, MA: Lexington Books.
Hyman, Irwin A., and James H. Wise. 1979. Corporal Punishment in American Education: Readings in History, Practice, and Alternatives. Philadelphia, PA: Temple University Press.
Hyman, Paula. 1990. "The Introduction of Bat Mitzvah in Conservative Judaism in Postwar America." YIVO Annual 19: 133–146.
Ide-Smith, Susan G., and Stephen E. Lea. 1988. "Gambling in Young Adolescents." Journal of Gambling Behavior 4, no. 2: 110–118.
IGTimes, in association with Stampa Alternativa. 1996. Style: Writing from the Underground. Terni, Italy: Umbriagraf.
Inge, M. Thomas. 1990. Comics as Culture. Jackson: University of Mississippi Press.
Inhelder, Barbel, and Jean Piaget. 1958. The Growth of Logical Thinking from Childhood to Adolescence. New York: Basic Books.
International Technology Education Association. 2000. Standards for Technological Literacy: Content for the Study of Technology. Reston, VA: ITEA.
Isenberg, Michael T. 1988. John L. Sullivan and His America. Urbana: University of Illinois Press.
Jackson, Donald, ed. 1978. Letters of the Lewis and Clark Expedition. Vol. 2. 1962. Reprint, Urbana: University of Illinois Press.
Jackson, Kathy Merlock. 1986. Images of Children in American Film: A Sociocultural Analysis. Metuchen, NJ: Scarecrow Press.
Jackson, Robert, and Edward Castillo. 1995. Indians, Franciscans, and Spanish Colonization: The Impact of the Mission System on California Indians. Albuquerque: University of New Mexico Press.
Jackson, Ronald. 1994. Classic TV Westerns: A Pictorial History. New Jersey: Carol Publishing Group.
Jacques, Brian. 1986. Redwall. New York: Putnam.
Jarvis, F. W. 1995. Schola Illustris: The Roxbury Latin School. Boston: David Godine.
Jeal, Tim. 1990. The Boy-Man: The Life of Lord Baden-Powell. New York: William Morrow.
Jefferson, Thomas. 1944. Notes on Virginia. First published in 1784. In The Life and Selected Writings of Thomas Jefferson. Edited by Adrienne Koch and William Peden. New York: Modern Library.
Jeffords, Susan. 1989. The Remasculinization of America: Gender and the Vietnam War. Bloomington: Indiana University Press.
Jeffrey, Linda, Demond Miller, and Margaret Linn. In press. "Middle School Bullying as a Context for the Development of Passive Observers to the Victimization of Others." Journal of Emotional Abuse.
Jendryka, Brian. 1994. "Flanagan's Island: Boys Town 1994." Current (November): 4–10.
Jenkins, Henry. 1998. "'Complete Freedom of Movement': Video Games as Gendered Play Spaces." Pp. 262–297 in From Barbie to Mortal Kombat: Gender and Computer Games. Edited by Henry Jenkins and Justine Cassell. Cambridge, MA: MIT Press.
Jenkins, Philip. 1998. Moral Panic: Changing Concepts of the Child Molester in Modern America. New Haven, CT: Yale University Press.
Jessor, Richard, and Shirley Jessor. 1977. Problem Behavior and Psychosocial Development: A Longitudinal Study of Youth. New York: Cambridge University Press.
Jhally, Sut, and Justin Lewis. 1992. Enlightened Racism: The Cosby Show, Audiences and the Myth of the American Dream. Boulder, CO: Westview Press.
Joffe, H., and J. E. Dockrell. 1995. "Safer Sex: Lessons from the Male Sex Industry." Journal of Community and Applied Social Psychology 5, no. 5: 333–346.
Johnson, Charles, and John McCluskey, Jr., eds. 1997. Black Men Speaking. Bloomington: Indiana University Press.
Johnson, Deidre. 1993. Edward Stratemeyer and the Stratemeyer Syndicate. Twayne United States Authors Series. New York: Twayne Publishers.
Johnson, Elmer L. 1979. The History of YMCA Physical Education. Chicago: Association Press.
Johnson, George E. 1916. Education through Recreation. Cleveland, OH: Survey Committee of the Cleveland Foundation.
Johnson, Gregory R., Etienne G. Krug, and Lloyd B. Potter. 2000. "Suicide among Adolescents and Young Adults: A Cross-National Comparison of 34 Countries." Suicide and Life-Threatening Behavior 30: 74–82.
Johnson, Susan Lee. 2000. Roaring Camp: The Social World of the California Gold Rush. New York: W. W. Norton.
Johnson, Thomas H., ed. 1970. The Complete Poems of Emily Dickinson. London: Faber and Faber.
Johnson, Walter. 1999. Soul by Soul: Life inside the Antebellum Slave Market. Cambridge, MA: Harvard University Press.
Jones, Arthur F., Jr., and Daniel H. Weinberg. 2000. Current Population Reports: The Changing Shape of the Nation's Income Distribution, 1947–1998. Washington, DC: U.S. Census Bureau.
Jones, James. 1993. Bad Blood: The Tuskegee Syphilis Experiment. Rev. ed. New York: Free Press.
Jones, Norrece T., Jr. 1990. Born a Child of Freedom, Yet a Slave: Mechanisms of Control and Strategies of Resistance in Antebellum South Carolina. Hanover, NH: University Press of New England.
Jordan, Terry. 1993. North American Cattle-Ranching Frontiers: Origins, Diffusion and Differentiation. Albuquerque: University of New Mexico Press.
Joselit, Jenna Weissman. 1994. The Wonders of America: Reinventing Jewish Culture, 1880–1950. New York: Hill and Wang.
Joyner, Charles. 1984. Down by the Riverside: A South Carolina Slave Community. Urbana: University of Illinois Press.
Juvenile Offenders and Victims: 1999 National Report. 1999. Washington, DC: Office of Juvenile Justice and Delinquency Prevention, U.S. Department of Justice.
Juvenile Protective Department. 1935. "Street Traders of Buffalo, New York." Buffalo: Juvenile Protective Department, 13–14.
Kadushin, Alfred, and Judith A. Martin. 1998. Child Welfare Services. 4th ed. New York: Macmillan.
Kaestle, Carl F. 1973a. The Evolution of an Urban School System: New York City, 1750–1850. Cambridge, MA: Harvard University Press.
———. 1973b. Joseph Lancaster and the Monitorial School Movement. New York: Teachers College Press.
———. 1983. Pillars of the Republic: Common Schools and American Society, 1780–1860. New York: Hill and Wang.
Kafai, Yasmin B. 1998. "Video Game Designs by Girls and Boys: Variability and Consistency of Gender Differences." Pp. 90–117 in From Barbie to Mortal Kombat: Gender and Computer Games. Edited by Henry Jenkins and Justine Cassell. Cambridge, MA: MIT Press.
Kalb, Claudia. 2000. "What Boys Really Want." Newsweek (July 10): 52.
Kalter, Neil, Amy Kloner, Shelly Schreier, and Katherine Okla. 1989. "Predictors of Children's Postdivorce Adjustment." American Journal of Orthopsychiatry 59, no. 4: 605–618.
Kann, Laura, Steven A. Kinchen, Barbara I. Williams, James G. Ross, Richard Lowry, Jo Anne Grunbaum, Lloyd J. Kolbe, and State and Local YRBSS Coordinators. 2000. "Youth Risk Behavior Surveillance—United States, 1999." Centers for Disease Control and Prevention: MMWR Surveillance Summaries 49, no. SS-5 (June 9).
Kanter, Rosabeth Moss. 1975. "Women and the Structure of Organizations: Explorations in Theory and Behavior." In Another Voice: Feminist Perspectives on Social Life and Social Science. Edited by M. Millman and R. M. Kanter. New York: Anchor Books.
———. 1977. Men and Women of the Corporation. New York: Basic Books.
Kaplan, Judy, and Linn Shapiro, eds. 1998. Red Diapers: Growing Up in the Communist Left. Urbana: University of Illinois Press.
Karmiloff-Smith, Annette, and Barbel Inhelder. 1974. "If You Want to Get Ahead, Get a Theory." Cognition 3: 195–212.
Keats, Ezra Jack. 1962. The Snowy Day. New York: Penguin.
Keise, Celestine. 1992. Sugar and Spice? Bullying in Single-Sex Schools. Stoke-on-Trent, Staffordshire, UK: Trentham Books.
Kelley, Florence, and Alzina P. Stevens. 1895. Hull-House Maps and Papers. New York: Crowell.
Kemp, John R., ed. 1986. Lewis Hine Photographs of Child Labor in the New South. Jackson: University Press of Mississippi.
Kempton, Tracey, Lisa Armistead, Michelle Wierson, and Rex Forehand. 1991. "Presence of a Sibling as a Potential Buffer Following Parental Divorce: An Examination of Young Adolescents." Journal of Clinical Child Psychology 20: 434–438.
Kendrick, Walter. 1996. The Secret Museum: Pornography in Modern Culture. 2d ed. Los Angeles: University of California Press.
Kerber, Linda K. 1980. Women of the Republic: Intellect and Ideology in Revolutionary America. Chapel Hill: University of North Carolina Press.
Kerr, Leah M. 2000. Driving Me Wild: Nitro-Powered Outlaw Culture. New York: Juno Books.
Kerrigan, William Thomas. 1997. "'Young America!': Romantic Nationalism in Literature and Politics, 1843–1861." Ph.D. diss., University of Michigan.
Kerwin, Denise. 1994. "Ambivalent Pleasure from Married . . . with Children." In Television: The Critical View. 5th ed. Edited by Horace Newcomb. New York: Oxford University Press.
Kessler, Christina. 2000. No Condition Is Permanent. New York: Philomel Books.
Kessler, Suzanne J. 1990. "The Medical Construction of Gender: Case Management of Intersexed Infants." Signs 16, no. 1.
Kett, Joseph. 1977. Rites of Passage: Adolescence in America, 1790 to the Present. New York: Basic Books.
Kibler, M. Alison. 1999. Rank Ladies: Gender and Cultural Hierarchy in American Vaudeville. Chapel Hill: University of North Carolina Press.
Kidd, A., and R. Kidd. 1990. "Social and Environmental Influences on Children's Attitudes toward Pets." Psychological Reports 67: 807–818.
Kidd, Bruce. 1996. The Struggle for Canadian Sport. Toronto: University of Toronto Press.
Kidd, Bruce, and John Macfarlane. 1972. The Death of Hockey. Toronto: New Press.
Kids and Guns. 2000. Washington, DC: Office of Juvenile Justice and Delinquency Prevention, U.S. Department of Justice.
Kiefer, Monica. 1948. American Children through Their Books, 1700–1835. Philadelphia: University of Pennsylvania Press.
Kimball, Marie. 1943. Jefferson: The Road to Glory, 1743 to 1776. New York: Coward-McCann.
Kimmel, Michael. 1996. Manhood in America: A Cultural History. New York: Free Press.
———. 2000. The Gendered Society Reader. New York: Oxford University Press.
Kincheloe, J. L., S. R. Steinberg, and A. D. Gresson III. 1997. Measured Lies: The Bell Curve Examined. New York: St. Martin's Press.
Kincheloe, Joe L. 1997. "Home Alone and 'Bad to the Bone': The Advent of a Postmodern Childhood." Pp. 31–52 in Kinderculture: The Corporate Construction of Childhood. Edited by Shirley R. Steinberg and Joe L. Kincheloe. Boulder, CO: Westview Press.
———. 1999. How Do We Tell the Worker? The Socioeconomic Foundations of Work and Vocational Education. Boulder, CO: Westview Press.
Kinder, Marsha. 1991. Playing with Power in Movies, Television, and Video Games. Berkeley: University of California Press.
Kinder, Marsha, ed. 1999. Kids' Media Culture. Durham, NC: Duke University Press.
Kindlon, Dan, and Michael Thompson. 1999. Raising Cain: Protecting the Emotional Life of Boys. New York: Ballantine.
King, Margaret J. 1981. "Disneyland and Walt Disney World: Traditional Values in Futuristic Form." Journal of Popular Culture (Summer): 114–140.
King, Stephen. 1981. Danse Macabre. New York: Everest House Publishers.
King, Wilma. 1995. Stolen Childhood: Slave Youth in Nineteenth Century America. Bloomington: Indiana University Press.
Kinsey, Alfred C., Wardell B. Pomeroy, and Clyde E. Martin. 1948. Sexual Behavior in the Human Male. Philadelphia: W. B. Saunders.
Kipnis, Aaron. 1999. Angry Young Men. San Francisco: Jossey-Bass.
Kirk, Robert William. 1994. Earning Their Stripes: The Mobilization of American Children in the Second World War. New York: Peter Lang.
Kirsch, George B. 2000. "Young Men's Hebrew Association." Pp. 501–502 in Encyclopedia of Ethnicity and Sports in the United States. Edited by George B. Kirsch, Othello Harris, and Claire E. Nolte. Westport, CT: Greenwood Press.
Kirsch, George B., ed. 1992. Sports in North America: A Documentary History. Vol. 3, The Rise of Modern Sports, 1840–1860. Gulf Breeze, FL: Academic International Press.
Kirsch, George, Othello Harris, and Claire E. Nolte, eds. 2000. Encyclopedia of Ethnicity and Sports in the United States. Westport, CT: Greenwood Press.
Kiselica, Mark S. 1995. Multicultural Counseling with Teenage Fathers: A Practical Guide. Thousand Oaks, CA: Sage.
———. 1999. "Counseling Teen Fathers." Pp. 179–198 in Handbook of Counseling Boys and Adolescent Males. Edited by A. M. Horne and M. S. Kiselica. Thousand Oaks, CA: Sage.
Klahr, David. 2000. Exploring Science: The Cognition and Development of Discovery Processes. Cambridge, MA: MIT Press.
Klein, Alan. 1993. Little Big Man: Bodybuilding Subculture and Gender Construction. Albany: State University of New York Press.
———. 1994. "The Cultural Anatomy of Competitive Women's Bodybuilding." In Many Mirrors: Body Image and Social Relations. Edited by Nicole Sault. New Brunswick, NJ: Rutgers University Press.
Klein, Malcolm W. 1995. The American Street Gang. New York: Oxford University Press.
———. 1996. "Gangs in the United States and Europe." European Journal on Criminal Policy and Research (special issue): 63–80.
Klein, Malcolm W., and Cheryl Lee Maxson. 1989. "Street Gang Violence." Pp. 198–234 in Violent Crime, Violent Criminals. Edited by M. E. Wolfgang and M. A. Weiner. Newbury Park, CA: Sage.
Klein, Malcolm W., Cheryl Maxson, and Jody Miller, eds. 1995. The Modern Gang Reader. Los Angeles: Roxbury.
Klepp, Susan, and Billy Smith, eds. 1992. The Infortunate: The Voyage and Adventures of William Moraley, an Indentured Servant. University Park: Pennsylvania State University Press.
Klier, Barbara, Jacquelyn Quiram, and Mark Siegel, eds. 1999. Alcohol and Tobacco: America's Drugs of Choice. Wylie, TX: Information Plus.
Klier, Barbara, Mark Siegel, and Jacquelyn Quiram, eds. 1999. Illegal Drugs: America's Anguish. Wylie, TX: Information Plus.
Kline, Stephen. 1993. Out of the Garden: Toys and Children's Culture in the Age of TV Marketing. London: Verso.
Klinman, Debra G., Joelle H. Sander, Jacqueline L. Rosen, Karen R. Longo, and Lorenzo P. Martinez. 1985. The Teen Parent Collaboration: Reaching and Serving the Teenage Father. New York: Bank Street College of Education.
Knox, Thomas W. 1881. The Young Nimrods in North America: A Book for Boys. New York: Harper and Brothers.
Kohler, Anna. 1897. "Children's Sense of Money." Studies in Education 1, no. 9: 323–331.
Kopka, Deborah L. 1997. School Violence: A Reference Handbook. Santa Barbara, CA: ABC-CLIO.
Kornhauser, Elizabeth Mankin. 1991. Ralph Earl: The Face of the Young Republic. Hartford, CT: Wadsworth Atheneum.
Koven, Edward. 1996. Smoking: The Story behind the Haze. Commack, NY: Nova Science.
Kowaleski, Michael, ed. 1997. Gold Rush: A Literary Exploration. Berkeley: Heyday Books and California Council for the Humanities.
Kraft, Louis. 1941. "Center, The Jewish." In The Universal Jewish Encyclopedia. Edited by Isaac Landman.
Kraushaar, Otto. 1972. American Nonpublic Schools: Patterns of Diversity. Baltimore: Johns Hopkins University Press.
Kuhn, Deanna, Eric Amsel, and Michael O'Loughlin. 1988. The Development of Scientific Thinking Skills. Orlando, FL: Academic Press.
Kuhn, Thomas S. 1962. The Structure of Scientific Revolutions. Chicago: University of Chicago Press.
Kulikoff, Allan. 1986. Tobacco and Slaves: The Development of Southern Cultures in the Chesapeake, 1680–1800. Chapel Hill: University of North Carolina Press.
Kurdek, L., and A. E. Siesky. 1980. "Children's Perceptions of Their Parents' Divorce." Journal of Divorce 3, no. 4: 339–378.
Kushner, Howard I. 1993. "Suicide, Gender, and the Fear of Modernity in Nineteenth-Century Medical and Social Thought." Journal of Social History 26, no. 3: 461–490.
———. 1998. The Age of the Child: Children in America, 1890–1920. New York: Twayne.
La Flesche, Francis. 1900. The Middle Five: Indian Schoolboys of the Omaha Tribe. Reprint, Madison: University of Wisconsin Press, 1963.
Ladd, Wayne M., and Angela Lumpkin, eds. 1979. Sport in American Education: History and Perspective. Reston, VA: NASPE-AAHPERD.
Ladd-Taylor, Molly, and Lauri Umansky, eds. 1998. "Bad" Mothers: The Politics of Blame in Twentieth-Century America. New York: New York University Press.
Lamb, Michael E. 1997. The Role of the Father in Child Development. 3d ed. New York: John Wiley and Sons.
Lambert, Barbara, ed. 1980. Music in Colonial Massachusetts 1630–1820: Music in Public Places. Boston: Colonial Society of Massachusetts.
Landale, Nancy S. 1989. "Opportunity, Movement, and Marriage: U.S. Farm Sons at the Turn of the Century." Journal of Family History 14, no. 4: 365–386.
Lane, Frederick S. III. 2000. Obscene Profits: The Entrepreneurs of Pornography in the Cyber Age. New York: Routledge.
Langfeld, William. 1928. The Young Men's Hebrew Association of Philadelphia: A Fifty-Year Chronicle. Philadelphia: Young Men's and Young Women's Hebrew Association of Philadelphia.
Lanzinger, I. 1990. "Toward Feminist Science Teaching." Canadian Woman Studies 13, no. 2.
LaRossa, Ralph. 1997. The Modernization of Fatherhood: A Social and Political History. Chicago: University of Chicago Press.
Latta, Alexander Bonner, and E. Latta. 1860. The Origin and Introduction of the Steam Fire Engine Together with the Results of the Use of Them in Cincinnati, St. Louis and Louisville, for One Year, also, Showing the Effect on Insurance Companies, etc. Cincinnati: Moore, Wilstach, Keys.
Laudan, Larry. 1977. Progress and Its Problems. Berkeley: University of California Press.
Lawes, Carolyn J. "Capitalizing on Mother: John S. C. Abbott and Self-Interested Motherhood." Proceedings of the American Antiquarian Society 108, pt. 2: 343–395.
Lawrence, Richard. 1998. School Crime and Juvenile Justice. New York: Oxford University Press.
Lears, T. J. Jackson. 1981. No Place of Grace: Anti-Modernism and the Transformation of American Culture, 1880–1920. New York: Pantheon Books.
Lee, Alfred McClung. 1937. The Daily Newspaper in America: Evolution of a Social Instrument. New York: Macmillan.
Lee, L., and G. Zhan. 1998. "Psychosocial Status of Children and Youth." Pp. 211–233 in Handbook of Asian American Psychology. Edited by L. Lee and N. Zane. Thousand Oaks, CA: Sage.
Lee, Sharon, and Marilyn Fernandez. 1998. "Trends in Asian American Racial/Ethnic Intermarriage: A Comparison of 1980 and 1990 Census Data." Sociological Perspectives 41, no. 2: 323–343.
Lee, Stacy. 1996. Unraveling the "Model Minority" Stereotype: Listening to Asian American Youth. New York: Columbia University Teachers College Press.
Leitenberg, Harold, Mark J. Detzer, and Debra Srebnik. 1993. "Gender Differences in Masturbation and the Relation of Masturbation Experience in Preadolescence and/or Early Adolescence to Sexual Behavior and Sexual Adjustment in Young Adulthood." Archives of Sexual Behavior 22 (April): 87–98.
Leland, John. 2000. "Why America Is Hooked on Professional Wrestling." Newsweek 135, no. 6 (February 7): 46.
Lemann, N. 2000. The Big Test: The Secret History of the American Meritocracy. New York: Farrar, Straus and Giroux.
Lemke, Bob. 1997. Standard Catalog of Baseball Cards. Iola, WI: Krause Publications.
Lender, Mark Edward. 1980. "The Social Structure of the New Jersey Brigade." In The Military in America from the Colonial Era to the Present. Edited by Peter Karsten. New York: Free Press.
L'Engle, Madeleine. 1962. A Wrinkle in Time. New York: Farrar, Straus and Giroux.
Leppek, Chris. 1995. “The Life and Times of Denver’s Joe ‘Awful’ Coffee.” Western States Jewish History 27, no. 1 (October): 43–61. Lesko, Nancy, ed. 2000. Masculinities at School. Thousand Oaks, CA: Sage. Leverenz, David. 1989. Manhood and the American Renaissance. Ithaca: Cornell University Press. Levi, Antonia. 1996. Samurai from Outer Space: Understanding Japanese Animation. Chicago: Carus Publishing. Levine, Lawrence W. 1997. Highbrow/Lowbrow: The Emergence of Cultural Hierarchy in America. Cambridge, MA: Harvard University Press. Levine, Peter. 1992. Ellis Island to Ebbets Field: Sport and the American Jewish Experience. New York: Oxford University Press. Levinson, Stacey, Stacey Mack, Daniel Reinhardt, Helen Suarez, and Grace Yeh. 1991. “Halloween as a Consumption Experience.” Undergraduate research thesis, Rutgers University School of Business. Levy, Barry. 1988. Quakers and the American Family: British Settlement in the Delaware Valley. New York: Oxford University Press. Levy, Jo Ann. 1992. They Saw the Elephant: Women in the California Gold Rush. Norman: University of Oklahoma Press. Lewinsohn, Peter M., Paul Rohde, and John R. Seeley. 1996. “Adolescent Suicidal Ideation and Attempts: Prevalence, Risk Factors, and Clinical Implications.” Clinical Psychology: Science and Practice 3, no. 1: 25–46. Lewis, David Levering. 1993. W. E. B. Du Bois: Biography of a Race. New York: Henry Holt.
Lewis, Dorothy. 1992. “From Abuse to Violence: Psychophysiological Consequences of Maltreatment.” Journal of the American Academy of Child and Adolescent Psychiatry 31 (May): 383–391. ———. 1998. Guilty by Reason of Insanity. New York: Fawcett-Columbine. Lewis, Theodore. 1997. “Toward a Liberal Vocational Education.” Journal of Philosophy of Education 31, no. 3: 477–489. Lewontin, Richard. 2000. The Triple Helix: Gene, Organism, and Environment. Cambridge: Harvard University Press. Lewontin, Richard, Steven Rose, and Leon Kamin. 1984. Not in Our Genes: Biology, Ideology, and Human Nature. New York: Pantheon. Ley, David, and Roman Cybriwsky. 1974. “Urban Graffiti as Territorial Markers.” Annals of the Association of American Geographers 64: 491–505. Lhamon, W. T., Jr. 1998. Raising Cain: Blackface Performance from Jim Crow to Hip Hop. Cambridge: Harvard University Press. Licht, Walter. 1992. Getting Work: Philadelphia, 1840–1950. Cambridge, MA: Harvard University Press. Limber, Susan P., P. Cunningham, V. Flerx, J. Ivey, M. Nation, S. Chai, and G. Melton. 1997. “Bullying among School Children: Preliminary Findings from a School-Based Intervention Program.” Paper presented at the Fifth International Family Violence Research Conference, Durham, NH, June–July. Linder, Marc. 1990. “From Street Urchins to Little Merchants: The Juridical Transvaluation of Child Newspaper Carriers.” Temple Law Review (Winter): 829–864.
———. 1997. “What’s Black and White and Red All Over? The Blood Tax on Newspapers.” Loyola Poverty Law Review 3: 57–111. Lingeman, Richard R. 1976. Don’t You Know There’s a War On? The American Home Front 1941–1945. New York: Capricorn Books. Lingenfelter, Mary R., and Harry D. Kitson. 1939. Vocations for Girls. New York: Harcourt Brace. Link, William A. 1986. A Hard Country and a Lonely Place: Schooling, Society and Reform in Rural Virginia, 1870–1920. Chapel Hill: University of North Carolina Press. Lipsitz, George. 1990. Time Passages: Collective Memory and American Popular Culture. Minneapolis: University of Minnesota Press. Lock, Stephen, and Lois Reynolds, eds. 1998. Ashes to Ashes: The History of Smoking and Health. Atlanta, GA: Rodopi. Lomawaima, K. Tsianina. 1994. They Called It Prairie Light: The Story of Chilocco Indian School. Lincoln: University of Nebraska Press. Lott, Eric. 1993. Love and Theft: Blackface Minstrelsy and the American Working Class. New York: Oxford University Press. Lovejoy, Owen. 1910. “Newsboy Life: What Superintendents of Reformatories and Others Think about Its Effects.” National Child Labor Committee, pamphlet no. 32 (June). Lovell, Margaretta. 1988. “Reading Eighteenth-Century American Family Portraits: Social Images and Self Images.” Winterthur Portfolio 22, no. 4 (Winter): 243–264.
Lowery, Carol R., and Shirley A. Settle. 1985. “Effects of Divorce on Children: Differential Impact of Custody and Visitation Patterns.” Family Relations: Journal of Applied Family and Child Studies 34, no. 4: 455–463. Lucas, Christopher J. 1994. American Higher Education: A History. New York: St. Martin’s Press. Lugaila, Terry A. 1998. “Marital Status and Living Arrangements: March 1998 (Update).” Current Population Reports. U.S. Bureau of the Census Publication no. P20-514. Washington, DC: U.S. Department of Commerce. Lumpkin, Angela. 1985. A Guide to the Literature of Tennis. Westport, CT: Greenwood Press. Lundstrom, Linden J. 1957. The Choir School. Minneapolis, MN: Augsburg Publishing House. Lynch, Tom. 1878. “St. Louis: The Volunteer Fire Department, 1832–1858.” National Fireman’s Journal (August 3). Lynd, Robert S., and Helen Merrell Lynd. 1929. Middletown: A Study in Contemporary American Culture. New York: Harcourt Brace. MacCann, Donnarae, and Gloria Woodard, eds. 1989. The Black American in Books for Children: Readings in Racism. 2d ed. Metuchen, NJ: Scarecrow Press. MacDonald, J. Fred. 1983. Blacks and White TV: African Americans in Television since 1948. Chicago: Nelson Hall. MacDonald, Robert H. 1967. “The Frightful Consequences of Onanism: Notes on the History of a Delusion.” Journal of the History of Ideas 28: 423–431. MacLeod, Anne Scott. 1975. A Moral Tale: Children’s Fiction and American
Culture, 1820–1860. Hamden, CT: Archon Books. Macleod, David I. 1983. Building Character in the American Boy: The Boy Scouts, YMCA, and Their Forerunners, 1870–1920. Madison: University of Wisconsin Press.
———. 1998. The Age of the Child: Children in America, 1890–1920. New York: Twayne Publishers.
———. Forthcoming. Landmarks of American Sports. American Landmarks Series. Edited by James O. Horton. New York: Oxford University Press.
Malone, Ann Patton. 1992. Sweet Chariot: Slave Family and Household Structure in Nineteenth-Century Louisiana. Chapel Hill: University of North Carolina Press.
Malone, Dumas. 1948. Jefferson the Virginian. Boston: Little, Brown. Maloney, P. 1980. “Street Hustling: Growing Up Gay.” Unpublished manuscript. Mandleco, Barbara L., Susanne F. Olsen, Clyde C. Robinson, Elaine S. Marshall, and Mary K. McNeilly-Choque. 1998. “Social Skills and Peer Relationships of Siblings of Children with Disabilities.” Pp. 106–120 in Children’s Peer Relations. Edited by P. T. Slee and K. Rigby. New York: Routledge. Manfredi, Christopher P. 1998. The Supreme Court and Juvenile Justice. Lawrence: University of Kansas Press. Mangan, J. A., and James Walvin, eds. 1987. Manliness and Morality: Middle Class Masculinity in Britain and America, 1800–1940. New York: St. Martin’s Press. Mangold, George B. 1936. Problems of Child Welfare. 3d ed. New York: Macmillan.
Marable, Manning. 1986. W. E. B. Du Bois: Black Radical Democrat. Boston: Twayne Publishers. Marble, Scott. 189-. “Daughters of the Poor.” Unpublished manuscript, Billy Rose Theater Collection, New York Public Library at Lincoln Center.
Mark, Diane Mei Lin, and Ginger Chih. 1993. A Place Called Chinese America. Dubuque, IA: Kendall/Hunt Publishing.
Marks, Stuart A. 1991. Southern Hunting in Black and White: Nature, History, and Ritual in a Carolina Community. Princeton, NJ: Princeton University Press.
Marling, Karal Ann, ed. 1997. Designing Disney’s Theme Parks: The Architecture of Reassurance. New York: Flammarion.
Marsden, George M. 1990. Religion and American Culture. New York: Harcourt Brace Jovanovich. Marsh, Dave. 1996. The Bruce Springsteen Story. Vol. 1, Born to Run. New York: Thunder’s Mouth Press. Marsh, Herbert W. 1991. “Employment during High School: Character Building or a Subversion of Academic Goals?” Sociology of Education 64: 172–189. Marshall, William A., and James M. Tanner. 1970. “Variations in the Pattern of Pubertal Changes in Boys.” Archives of Disease in Childhood 45: 13–23. Marten, James. 1998. The Children’s Civil War. Chapel Hill: University of North Carolina Press. ———. 1999. Lessons of War: The Civil War in Children’s Magazines. Wilmington, DE: SR Books. Martin, Chris. 1996. The Top Fuel Handbook. Wichita, KS: Beacon Publishing. Martin, Joseph Plumb. 1993. Ordinary Courage: The Revolutionary War
Adventures of Joseph Plumb Martin. New York: Brandywine Press. Marty, Martin E. 1984. Pilgrims in Their Own Land: 500 Years of Religion in America. New York: Penguin. Mason, Daniel, and Barbara Schrodt. 1996. “Hockey’s First Professional Team: The Portage Lakes Hockey Club of Houghton, Michigan.” Sport History Review 27: 49–71. Mather, Cotton. 1723. The Pure Nazarite: Advice to a Young Man. Boston: T. Fleet for John Phillips. Mather, Fred. 1897. Men I Have Fished With. New York: Forest and Stream Publishing. Mattingly, Paul H., and Edward W. Stevens, Jr. 1987. “Schools and the Means of Education Shall Forever Be Encouraged”: A History of Education in the Old Northwest, 1787–1880. Athens: University of Georgia Press. Maupin, Melissa. 1996. The Ultimate Kids’ Club Book: How to Organize, Find Members, Run Meetings, Raise Money, Handle Problems, and Much More! Minneapolis: Free Spirit. Maury, Ann. 1853. Memoirs of a Huguenot Family. New York: G. P. Putnam. May, Elaine Tyler. 1980. Great Expectations: Marriage and Divorce in Post-Victorian America. Chicago: University of Chicago Press. Mayes, Herbert R. 1928. Alger: A Biography without a Hero. New York: Macy-Masius. Maynard, W. Barksdale. 1999. “‘An Ideal Life in the Woods for Boys’: Architecture and Culture in the Earliest Summer Camps.” Winterthur Portfolio 34, no. 1 (Spring): 3–29.
Mays, Arthur B. 1952. Essentials of Industrial Education. New York: McGraw-Hill. McAleer, John. 1984. Ralph Waldo Emerson: Days of Encounter. Boston: Little, Brown. McCaslin, Nellie. 1971. Theatre for Children in the United States: A History. Norman: University of Oklahoma Press. ———. 1987. Historical Guide to Children’s Theatre in America. Westport, CT: Greenwood Press. McClellan, Keith. 1998. The Sunday Game: At the Dawn of Professional Football. Akron, OH: University of Akron Press. McCloud, Scott. 1993. Understanding Comics. Princeton, WI: Kitchen Sink Press. ———. 2000. Reinventing Comics. New York: HarperPerennial. McCoy, J. Kelly, Gene H. Brody, and Zolinda Stoneman. In press. “Temperament and the Quality of Youths’ Best Friendships: Do Sibling and Parent-Child Relationships Make a Difference?” McCullough, David. 1981. Mornings on Horseback. New York: Simon and Schuster. McDaniel, Henry Bonner. 1941. The American Newspaperboy: A Comparative Study of His Work and School Activities. Los Angeles: Wetzel. McFeely, William S. 1991. Frederick Douglass. New York: Simon and Schuster. McGaw, Judith A. 1982. “Women and the History of American Technology.” Signs: Journal of Women in Culture and Society 7: 798–828. ———. 1987. Most Wonderful Machine: Mechanization and Social Change in
Berkshire Paper Making, 1801–1885. Princeton, NJ: Princeton University Press.
McKelvey, Carole A., and JoEllen Stevens. 1994. Adoption Crisis: The Truth Behind Adoption and Foster Care. Golden, CO: Fulcrum Publishing.
McKenzie, Richard B., ed. 1998. Rethinking Orphanages for the 21st Century. Thousand Oaks, CA: Sage. McKinney, C. F. N.d. “A Discussion of Leadership.” Culver Military Academy, 7. McLaughlin, Milbrey W., Merita A. Irby, and Juliet Langman. 1994. Urban Sanctuaries: Neighborhood Organizations in the Lives and Futures of Inner-City Youth. San Francisco: Jossey-Bass. McNeal, James U. 1987. Children as Consumers: Insights and Implications. Lexington, MA: D.C. Heath. ———. 1992. Kids as Customers: A Handbook of Marketing to Children. New York: Lexington Books. ———. 1999. The Kids Market: Myths and Realities. Ithaca, NY: Paramount Market Publishing. McNeil, Alex. 1996. Total Television. New York: Penguin. Mead, George. 1934. Mind, Self and Society. Chicago: University of Chicago Press. Mechling, Jay. 1986. “Children’s Folklore.” Pp. 91–120 in Folk Groups and Folklore Genres. Edited by Elliott Oring. Logan: Utah State University Press. ———. 2001. On My Honor: The Boy Scouts and American Culture. Chicago: University of Chicago Press. Meckel, Richard A. 1990. Save the Babies: American Public Health Reform and the Prevention of Infant Mortality 1850–1929. Baltimore: Johns Hopkins University Press. 1998. Reprint, Ann Arbor: University of Michigan Press.
———. 1996. “Health and Disease.” Pp. 757–786 in Encyclopedia of the United States in the Twentieth Century, vol. 2. Edited by Stanley I. Kutler et al. New York: Scribner’s.
Mednick, Sarnoff. 1977. “A Biosocial Theory of Learning Law-abiding Behavior.” Pp. 1–8 in Biosocial Bases of Criminal Behavior. Edited by S. Mednick and K. Christiansen. New York: Gardner. Mednick, Sarnoff, Vicki Pollock, Jan Volavka, and William Gabrielli. 1982. “Biology and Violence.” Pp. 21–80 in Criminal Violence. Edited by Marvin Wolfgang and Neil Weiner. Beverly Hills: Sage. Meeks, Carol. 1998. “Factors Influencing Adolescents’ Income and Expenditures.” Journal of Family and Economic Issues 19, no. 2: 131–150. Melson, G. 2001. Why the Wild Things Are. Cambridge, MA: Harvard University Press. Meltzoff, Andrew, and Alison Gopnik. 1996. Words, Thoughts, and Theories. Cambridge, MA: MIT Press. Mencken, H. L. 1982. The American Language. New York: Alfred A. Knopf. Mennel, Robert M. 1973. Thorns and Thistles: Juvenile Delinquency in the United States, 1825–1940. Hanover, NH: University Press of New England. Mergen, Bernard. 1982. Play and Playthings: A Reference Guide. Westport, CT: Greenwood Press. ———. 1997. Snow in America. Washington, DC: Smithsonian Institution Press. Merrill, Liliburn. 1908. Winning the Boy. New York: Fleming H. Revell.
Metcalfe, Alan. 1987. Canada Learns to Play: The Emergence of Organized Sport 1807–1904. Toronto: McClelland and Stewart. Milbrath, Constance. 1995. “Germinal Motifs in the Work of a Gifted Child Artist.” Pp. 101–134 in The Development of Artistically Gifted Children: Selected Case Studies. Edited by Claire Golomb. Hillsdale, NJ: Erlbaum. ———. 1998. Patterns of Artistic Development in Children: Comparative Studies of Talent. New York: Cambridge University Press. Milburn, William Henry. 1857. The Rifle, Axe, and Saddle-Bags, and Other Lectures. New York: Derby and Jackson. Millard, Elaine. 1997. Differently Literate: Boys, Girls and the Schooling of Literacy. London: Falmer Press. Miller, Jerome G. 1991. Last One over the Wall: The Massachusetts Experiment in Closing Reform Schools. Columbus: Ohio State University Press.
Miller, Joanne, and Susan Yung. 1990. “The Role of Allowances in Adolescent Socialization.” Youth and Society 22, no. 2: 137–159.
Milliken, Randall. 1995. A Time of Little Choice: The Disintegration of Tribal Culture in the San Francisco Bay Area, 1769–1810. Menlo Park: Ballena Press.
Minehan, Thomas. 1934. Boy and Girl Tramps of America. New York: Farrar and Rinehart. “Miseries of News-Girls.” 1881. New York Tribune, February 20, 12.
Mishkind, Michael. 1987. “The Embodiment of Masculinity: Cultural, Psychological, and Behavioral Dimensions.” In Changing Men: New Directions in Research on Men and Masculinity. Edited by Michael Kimmel. Newbury Park, CA: Sage. Mishler, Paul C. 1999. Raising Reds: The Young Pioneers, Radical Summer Camps, and Communist Political Culture in the United States. New York: Columbia University Press. Mizell, C. Andre, and Lala C. Steelman. 2000. “All My Children: The Consequences of Sibling Group Characteristics on the Marital Happiness of Young Mothers.” Journal of Family Issues 21: 858–887. Mjagkij, Nina. 1994. Light in the Darkness: African Americans and the YMCA, 1852–1946. Lexington: University Press of Kentucky. Modell, John. 1989. Into One’s Own: From Youth to Adulthood in the United States, 1920–1985. Berkeley: University of California Press. Monaghan, E. Jennifer. 1983. A Common Heritage: Noah Webster’s Blue-back Speller. Hamden, CT: Archon Books.
Monroy, Douglas. 1999. Rebirth: Mexican Los Angeles from the Great Migration to the Great Depression. Berkeley: University of California Press.
Montagu, Ashley. 1985. “The Sociobiology Debate: An Introduction.” Pp. 24–33 in Biology, Crime and Ethics: A Study of Biological Explanations for Criminal Behavior. Edited by Frank Marsh and Janet Katz. Cincinnati: Anderson.
Moody, Richard. 1980. Ned Harrigan: From Corlear’s Hook to Herald Square. Chicago: Nelson-Hall.
Moon, Michael. 1987. “‘The Gentle Boy from the Dangerous Classes’: Pederasty, Domesticity, and Capitalism in Horatio Alger.” Representations 19 (Summer): 95–97.
Mooney, Cynthia, ed. 1999. Drugs, Alcohol and Tobacco: Macmillan Health Encyclopedia. New York: Macmillan. Moore, Joan. 1991. Going Down to the Barrio: Homeboys and Homegirls in Change. Philadelphia: Temple University Press. Moore, John Hammond. 1976. Albemarle: Jefferson’s County, 1727–1976. Charlottesville: University Press of Virginia. Moore, John P., and Craig P. Terrett. 1998. Highlights of the 1996 National Youth Gang Survey. Washington, DC: U.S. Department of Justice, Office of Juvenile Justice and Delinquency Prevention. Moorhead, James. 1978. American Apocalypse: Yankee Protestants and the Civil War: 1860–1869. Louisville: Westminster/John Knox Press. Moorhouse, H. F. 1991. Driving Ambitions: A Social Analysis of the American Hot Rod Enthusiasm. Manchester: Manchester University Press. Morales, Armando. 1992. “A Clinical Model for the Prevention of Gang Violence and Homicide.” Pp. 105–118 in Substance Abuse and Gang Violence. Edited by R. C. Cervantes. Newbury Park, CA: Sage. Moran, Jeffrey P. 2000. Teaching Sex: The Shaping of Adolescence in the 20th Century. Cambridge, MA: Harvard University Press. Morgan, Carol M., and Doran J. Levy. 1993. “Gifts to Grandchildren.” American Demographics 9: 3–4. Morgan, Edmund S. 1975. American Slavery, American Freedom. New York: W. W. Norton. Morgan, Phillip D. 1998. Slave Counterpoint: Black Culture in the Eighteenth-Century Chesapeake and
Lowcountry. Chapel Hill: University of North Carolina Press. Morgan, Winona L. 1939. The Family Meets the Depression. Minneapolis: University of Minnesota Press. Mormino, Gary Ross. 1982. “The Playing Fields of St. Louis: Italian Immigrants and Sport, 1925–1941.” Journal of Sport History 9 (Summer): 5–16. Morris, Brian. 1999. In Favour of Circumcision. Sydney, Australia: University of New South Wales Press. Morris, Edmund. 1979. The Rise of Theodore Roosevelt. New York: Coward, McCann, and Geoghegan. Morrison, Donna R., and Andrew J. Cherlin. 1995. “The Divorce Process and Young Children’s Well-Being: A Prospective Analysis.” Journal of Marriage and the Family 57, no. 3: 800–812. Morrow, Johnny. 1860. A Voice from the Newsboys. New York: A. S. Barnes and Burr. Mortimer, Jeylan T., and Michael D. Finch. 1996. “Work, Family, and Adolescent Development.” Pp. 1–24 in Adolescents, Work, and Family: An Intergenerational Developmental Analysis. Edited by Mortimer and Finch. Thousand Oaks, CA: Sage. Mortimer, Jeylan T., Michael D. Finch, Ryu Seongryeol, Michael J. Shanahan, and Kathleen Thiede Call. 1996. “The Effects of Work Intensity on Adolescent Mental Health, Achievement, and Behavioral Adjustment: New Evidence from a Prospective Study.” Child Development 67: 1243–1261. Moshman, David. 1999. Adolescent Psychological Development: Rationality, Morality, and Identity. Mahwah, NJ: Erlbaum.
Mountjoy, John J. 2000. “Shooting for Better Gun Control.” Spectrum 73: 1–3.
Moynihan, Ruth Barnes. 1975. “Children and Young People on the Overland Trail.” Western Historical Quarterly 6 (July): 279–294.
Munroe, Kirk. 1897. The Ready Rangers: A Story of Boys, Boats, and Bicycles, Fire-Buckets and Fun. Boston: Lothrop Publishing.
Murphy, Jim. 1990. The Boys’ War: Confederate and Union Soldiers Talk about the Civil War. New York: Clarion Press. Murray, Gail Schmunk. 1998. American Children’s Literature and the Construction of Childhood. New York: Twayne Publishers. Murray, Michael D. 1990. “A Real Life Family in Prime Time.” In Television and the American Family. Edited by Jennings Bryant. Hillsdale, NJ: Erlbaum. Music of the American Revolution: The Birth of Liberty. 1976. New World Records. Musick, David. 1995. An Introduction to the Sociology of Juvenile Delinquency. Albany: State University of New York Press. Myers, Gene. 1998. Children and Animals. Boulder, CO: Westview Press. Myers, Robert. 1972. Celebrations: The Complete Book of American Holidays. New York: Doubleday. Myers, Robert J., and Joyce Brodowski. 2000. “Rewriting the Hallams: Research in 18th Century British and American Theatre.” Theatre Survey 41, no. 1: 1–22. Myers, Walter Dean. 1988. Scorpions. New York: HarperCollins. Nackenoff, Carol. 1994. The Fictional Republic: Horatio Alger and American Political Discourse. New York: Oxford University Press.
Nagel, Paul C. 1999. Descent from Glory: Four Generations of the John Adams Family. Cambridge, MA: Harvard University Press.
Napier, John Hawkins III, ed. 1989. “Military Schools.” In Encyclopedia of Southern Culture. Vol. 1, Agriculture—Environment. New York: Anchor Press/Doubleday.
Nardinelli, Clark, and Curtis Simon. 1990. “Consumer Racial Discrimination in the Market for Memorabilia: The Case of Baseball.” Quarterly Journal of Economics (August): 575–596. Nasaw, David. 1985. Children of the City at Work and at Play. Garden City, NY: Anchor Books/Doubleday. Nass, Robert. 1993. “Sex Differences in Learning Abilities and Disabilities.” Annals of Dyslexia 43: 61–77. “National Allowance Survey: How Do You Compare?” 1999. Zillions (January–February): 8–11. National Assessment of Vocational Education, Independent Advisory Panel. 1994. Interim Report to Congress. Washington, DC: U.S. Department of Education. “National Chores Survey.” 1999. Zillions (March–April): 20–23. National Commission on Adolescent Sexual Health. 1995. Facing Facts: Sexual Health for America’s Adolescents. New York: Sexuality Information and Education Council of the United States. National Commission on Excellence in Education. 1983. A Nation at Risk: The Imperative for Educational Reform. ED 226 006. Washington, DC: Government Printing Office.
National Science Board Commission on Pre-College Education in Mathematics, Science and Technology. 1983. Educating Americans for the 21st Century. A Report to the American People and the National Science Board. ED 223 913. Washington, DC: U.S. Government Printing Office. National Youth Development Information Center. 2001. www.nydic.org (accessed May 14). Includes a directory of more than 500 national contemporary youth organizations with links to individual organization websites. NCHS (National Center for Health Statistics). 2000. Health, United States, 2000. Hyattsville, MD: NCHS. Nearing, Scott. 1907. “The Newsboys at Night in Philadelphia.” The Survey 17 (February 2): 778–784. Nee, Victor G., and Brett de Bary Nee. 1986. Longtime Californ’: A Documentary Study of an American Chinatown. Stanford: Stanford University Press. Neft, David, Richard Johnson, Richard Cohen, and Jordan Deutsch. 1976. The Sports Encyclopedia: Basketball. New York: Grosset and Dunlap. Nelson, Murry. 1999. The Originals: The New York Celtics Invent Modern Basketball. Bowling Green, OH: Bowling Green University Popular Press. Nelson, Rebecca, and Marie J. MacNee, eds. 1996. The Olympics Factbook. Detroit: Visible Ink Press. Neslund, Douglas. 2001. “Voices of Angels” bookmarks, http://groups.yahoo.com/group/Voices_of_Angels/links (accessed March 11). Neugarten, Bernice L., and Karol K. Weinstein. 1964. “The Changing American Grandparent.” Journal of Marriage and the Family 26: 199–204.
New England Primer Improved for the More Easy Attaining the True Reading of English, The. 1843. I. Webster, publisher. “New York Newsboys, The.” 1869. The Leisure Hours (November 1): 717. New York SPCC (New York Society for the Prevention of Cruelty to Children). Scrapbook collections in the archives contain the following clippings: On Wallie Eddinger, Jr., see New York Herald, November 1, 1892; Peoria, Illinois, Transcript, February 10, 1892; and Everybody’s Magazine, September 1, 1903. On Tommie Russell, see Tyrone, Pennsylvania, Daily Herald, January 25, 1892; New York Recorder, May 1, 1892; and New York Herald, December 29, 1897. On Elsie Leslie, see Everybody’s Magazine, September 1, 1903; and New York World, April 10, 1910. Newberger, Eli H. 1999. The Men They Will Become. New York: Perseus Books. Newcomb, Horace, ed. 2000. Television: The Critical View. 6th ed. New York: Oxford University Press. Newell, William Wells. 1883. Games and Songs of American Children. New York: Harper and Brothers. “Newsboys and Newsgirls Constitute an Endangered Species.” 2000. Editor and Publisher (January 31): 5. “Newsboys of Old: How They Flourished in California Thirty Years Ago.” 1882. San Francisco Call, January 29. “Newsboys’ Riot, A.” 1877. Detroit Evening News, July 21, 4. Nickerson, Craig. 1995. “Red Dawn in Lake Placid: The Semi-Final Hockey Game at the 1980 Winter Olympics as Cold War Battleground.” Canadian Journal of History of Sport 26: 73–85. Nightingale, Carl Husemoller. 1993. On the Edge: A History of Poor Black
Children and Their American Dreams. New York: Basic Books. Noll, Mark. 1992. A History of Christianity in the United States and Canada. Grand Rapids, MI: Eerdmans. Norris, Thaddeus. 1864. The American Angler’s Book. Philadelphia: E. H. Butler. Norton Family. Norton Diaries, 1876–1895. Copied and annotated by Helen Norton Starr. Manuscripts Division, Kansas State Historical Society, Topeka. Nusbaum, Paul. 1994. “Crowded House: Fun and Gaming.” Philadelphia Inquirer, May 29, 1994, 11ff. Nycum, Benjie. 2000. XY Survival Guide: Everything You Need to Know about Being Young and Gay. San Francisco: XY Publishing. Nye, F. Ivan, and Felix M. Berardo. 1973. The Family: Its Structure and Interaction. New York: Macmillan.
Nye, Russel. 1970. The Unembarrassed Muse. New York: Dial Press.
Oakes, Jeannie. 1985. Keeping Track: How Schools Structure Inequality. New Haven, CT: Yale University Press.
O’Brien, Richard. 1990. The Story of American Toys. London: New Cavendish Books.
O’Dell, Scott. 1967. The Black Pearl. New York: Bantam Doubleday Dell.
Ogletree, Shirley Matile, Larry Denton, and Sue Winkle Williams. 1993. “Age and Gender Differences in Children’s Halloween Costumes.” Journal of Psychology 127: 633–637. Okami, Paul, and Laura Pendleton. 1994. “Theorizing Sexuality: Seeds of a Transdisciplinary Paradigm Shift.” Current Anthropology 35 (February): 85–91.
Oliver, Ronald, Richard Hazler, and John Hoover. 1994. “The Perceived Role of Bullying in Small-Town Midwestern Schools.” Journal of Counseling and Development 72, no. 4: 416–419. Olweus, Dan. 1993. Bullying at School: What We Know and What We Can Do. Oxford, UK: Blackwell. Oriard, Michael. 1982. Dreaming of Heroes: American Sports Fiction, 1860–1980. Chicago: Nelson-Hall. Oring, Elliott. 1992. Jokes and Their Relations. University Press of Kentucky. Osgood, Ernest Staples. 1929. The Day of the Cattleman. Minneapolis: University of Minnesota Press. Otnes, Cele, Kyungseung Kim, and Young Cham Kim. 1994. “Yes Virginia, There Is a Gender Difference: Analyzing Children’s Requests to Santa Claus.” Journal of Popular Culture 28, no. 1: 17–29.
Pabilonia, Sabrina Wulff. 1999. “Evidence on Youth Employment, Earnings, and Parental Transfers in the National Longitudinal Survey of Youths 1997.” Presented at the NLSY97 Early Results Conference at the Bureau of Labor Statistics, Washington, DC, November 18–19.
———. 2000. “Youth Earnings and Parental Allowances.” University of Washington working paper.
Packard, Cynthia, and Ray B. Browne. 1978. “Pinball Machine: Marble Icon.” Pp. 177–189 in Icons of America. Edited by Ray B. Browne and Marshall Fishwick. Bowling Green, OH: Bowling Green University Popular Press.
Pacula, Rosalie. 1998. Adolescent Alcohol and Marijuana Consumption: Is There Really a Gateway Effect? Cambridge, MA: National Bureau of Economic Research. Pagani, Linda, Richard E. Tremblay, Frank Vitaro, Margaret Kerr, and Pierre McDuff.
1998. “The Impact of Family Transition on the Development of Delinquency in Adolescent Boys: A 9-Year Longitudinal Study.” Journal of Child Psychology and Psychiatry and Allied Disciplines 39, no. 4: 489–499. Palmer, Patricia. 1986. The Lively Audience: A Study of Children around the TV Set. Sydney: Allen and Unwin. Papenfuse, Edward C., and Gregory A. Stiverson. 1973. “General Smallwood’s Recruits: The Peacetime Career of the Revolutionary War Private.” William and Mary Quarterly 30: 117–132. Park, K. 1999. “‘I Really Do Feel I’m 1.5!’: The Construction of Self and Community by Young Korean Americans.” Amerasia Journal 25, no. 1: 139–164. Parke, Ross D. 1996. Fatherhood. Cambridge, MA: Harvard University Press. Parks, Rita. 1982. The Western Hero in Film and Television. Ann Arbor: UMI Research Press. Parks, Wally. 1966. Drag Racing, Yesterday and Today. New York: Trident Press. Parsons, Michael J. 1987. How We Understand Art: A Cognitive Developmental Account of Aesthetic Experience. New York: Cambridge University Press. Paulsen, Gary. 1993. Harris and Me. New York: Bantam Doubleday Dell.
Payton, Crystal. 1982. Space Toys. Sedalia, MO: Collectors Compass.
Peabody, James B., ed. 1973. John Adams: A Biography in His Own Words. New York: Newsweek, distributed by Harper and Row.
Peavy, Linda, and Ursula Smith. 1999. Frontier Children. Norman: University of Oklahoma Press.
Peiss, Kathy. 1986. Cheap Amusements: Working Women and Leisure in Turn-of-the-Century New York. Philadelphia: Temple University Press. Pellegrini, Anthony D. In press. “The Roles of Dominance and Bullying in the Development of Early Heterosexual Relationships.” Journal of Emotional Abuse. Penney, David. 1993. “Indians and Children: A Critique of Educational Objectives.” Akwe:kon [Native Americas] 10 (Winter): 12–18. Penny Merriment: English Songs from the Time of the Pilgrims. 1986. Plimoth Plantation. Pepler, Deborah J., Rona Abramovitch, and Carl Corter. 1981. “Sibling Interaction in the Home: A Longitudinal Study.” Child Development 52: 1344–1347. “Perch Fishing.” 1873. American Sportsman 3 (December 13). Perlman, Joel. 1988. Ethnic Differences: Schooling and Social Structure among the Irish, Italians, Jews, and Blacks in an American City, 1880–1935. New York: Cambridge University Press. Pernick, Martin S. 1996. The Black Stork: Eugenics and the Death of “Defective” Babies in American Medicine and Motion Pictures since 1915. New York: Oxford University Press.
Petersen, David, ed. 1996. A Hunter’s Heart: Honest Essays on Blood Sport. New York: Henry Holt.
Petersen, Paul. 2001. “A Minor Consideration.” Gardena, CA: www.minorcon.org//history.html (accessed March 1, 2001).
Peterson, Merrill D. 1970. Thomas Jefferson and the New Nation. New York: Oxford University Press.
Peterson, Merrill, ed. 1984. Thomas Jefferson: Writings. New York: Library of America.
Peterson, Robert W. 1985. The Boy Scouts: An American Adventure. New York: American Heritage.
Peterson, Robert. 1990. Cages to Jump Shots: Pro Basketball’s Early Years. New York: Oxford University Press. Phillips, Dennis J. 1989. Teaching, Coaching and Learning Tennis: An Annotated Bibliography. Metuchen, NJ: Scarecrow Press. Phinney, Jean S. 1989. “Stages of Ethnic Identity Development in Minority Group Adolescents.” Journal of Early Adolescence 9, nos. 1–2: 34–49. Phipps, William E. 1977. “Masturbation: Vice or Virtue?” Journal of Religion and Health 16: 183–195. Pickering, Samuel F., Jr. 1993. Moral Instruction and Fiction for Children, 1747–1820. Athens: University of Georgia Press. Pirog-Good, Maureen A. 1996. “The Education and Labor Market Outcomes of Adolescent Fathers.” Youth and Society 28: 236–262. Pisciotta, Alexander W. 1982. “Saving the Children: The Promise and Practice of Parens Patriae, 1838–1898.” Crime and Delinquency 28, no. 3 (July): 410–425. Platt, Anthony. 1977. The Child Savers: The Invention of Delinquency. 2d ed. Chicago: University of Chicago Press. Pleak, R. R., and H. F. Meyer-Bahlburg. 1990. “Sexual Behavior and AIDS Knowledge of Young Male Prostitutes in Manhattan.” Journal of Sex Research 27, no. 4: 557–587. Pleck, Elizabeth H., and Joseph H. Pleck. 1997. “Fatherhood Ideals in the United States: Historical Dimensions.” Pp. 33–48 in The Role of the Father in Child Development. 3d ed. Edited by Michael E. Lamb. New York: John Wiley and Sons.
Podbersek, A., Elizabeth Paul, and James Serpell, eds. 2000. Companion Animals and Us. Cambridge, UK: Cambridge University Press.
Polakow, Valerie, ed. 2000. The Public Assault on America’s Children. New York: Teachers College Press. Pollack, William. 1998. Real Boys: Rescuing Our Sons from the Myths of Boyhood. New York: Henry Holt. Pond, Fred E. (Will Wildwood). 1919. Life and Adventures of “Ned Buntline” with Ned Buntline’s Anecdote of “Frank Forester” and Chapter of Angling Sketches. New York: Cadmus Book Shop. Popper, Karl Raimund. 1959. The Logic of Scientific Discovery. London, UK: Hutchinson. Porter, Roy. 1995. “Forbidden Pleasures: Enlightenment Literature of Sexual Advice.” Pp. 75–98 in Solitary Pleasures: The Historical, Literary, and Artistic Discourses of Autoeroticism. Edited by Paula Bennett and Vernon A. Rosario II. New York: Routledge. Post, Robert C. 1998. “Hot Rods and Customs: The Men and Machines of California’s Car Culture, at the Oakland Museum of California.” Technology and Culture 39: 116–121. ———. 2001. High Performance: The Culture and Technology of Drag Racing, 1950–2000. Baltimore, MD: Johns Hopkins University Press. Postol, Todd Alexander. 1997. “Creating the American Newspaper Boy: Middle-Class Route Service and Juvenile Salesmanship in the Great Depression.” Journal of Social History (Winter): 327–345.
Powers, Jane B. 1992. The “Girl Question” in Education: Vocational Education for Young Women in the Progressive Era. Washington, DC: Falmer Press. Powers, Stephen. 1999. The Art of Getting Over: Graffiti at the Millennium. New York: St. Martin’s Press. Prescott, Heather Munro. 1998. “A Doctor of Their Own”: The History of Adolescent Medicine. Cambridge, MA: Harvard University Press. Preston, Samuel H., and Michael R. Haines. 1991. Fatal Years: Child Mortality in Late Nineteenth-Century America. Princeton, NJ: Princeton University Press. Pridmore, Jay. 1999. Classic American Bicycles. Osceola, WI: Motorbikes International. Proctor, Nicholas Wolfe. 1988. “Bathed in Blood: Hunting in the Antebellum South.” Ph.D. diss., Emory University. Proffit, William R. 1993. Contemporary Orthodontics. 2d ed. St. Louis: Mosby Year Book. Pruett, Kyle D. 2000. Fatherneed. New York: Free Press. Public/Private Ventures. 2000. Youth Development: Issues, Challenges, and Directions. Philadelphia: Public/Private Ventures. Pustz, Matthew. 1999. Comic Book Culture: Fanboys and True Believers. Jackson: University Press of Mississippi. Putney, Clifford. 1997. “From Character to Body Building: The YMCA and the Suburban Metropolis, 1950–1980.” Pp. 231–249 in Men and Women Adrift: The YMCA and YWCA in the City. Edited by Nina Mjagkij and Margaret Spratt. New York: New York University Press. Putney, Clifford W. 1995. “Muscular Christianity: The Strenuous Mood in
American Protestantism, 1880–1920.” Ph.D. diss., Brandeis University. Quay, H. C., ed. 1987. Handbook of Juvenile Delinquency. New York: Wiley. Rabinowitz, Benjamin. 1948. The Young Men’s Hebrew Association (1854–1913). New York: National Jewish Welfare Board. Rader, Benjamin G. 1983. American Sports: From the Age of Folk Games to the Age of Spectators. Englewood Cliffs, NJ: Prentice-Hall. ———. 1999. American Sports: From the Age of Folk Games to the Age of Televised Sports. Upper Saddle River, NJ: Prentice-Hall. Rand Youth Poll. 2000. Teen-age Personal Spending Continues to Climb While Youths’ Overall Impact on Economy Intensifies. New York: Rand Youth Poll. Randall, Henry S. 1858. The Life of Thomas Jefferson. New York: Derby and Jackson. Randall, Willard Sterne. 1993. Thomas Jefferson: A Life. New York: Henry Holt. Randolph, Sarah N. 1978. The Domestic Life of Thomas Jefferson, Compiled from Family Letters and Reminiscences, by His Great-Granddaughter. 1871. Reprint, Charlottesville: University Press of Virginia. Raphael, Maryanne, and Jenifer Wolf. 1974. Runaway: America’s Lost Youth. New York: Drake Publishers. Rasmussen, Wayne D. 1989. Taking the University to the People: Seventy-Five Years of Cooperative Extension. Ames: Iowa State University Press. Rayburn, Jim III. 1984. Dance Children Dance: The Story of Jim Rayburn, Founder of Young Life. Wheaton, IL: Tyndale.
Reagan, Daniel Ware. 1984. “The Making of an American Author: Melville and the Idea of a National Literature.” Ph.D. diss., University of New Hampshire.
Rebora, Carrie, Paul Staiti, Erica E. Hirshler, Theodore E. Stebbins, Jr., and Carol Troyen. 1995. John Singleton Copley in America. New York: Metropolitan Museum of Art.
Reck, Franklin M. 1951. The 4-H Story. Ames: Iowa State College Press.
Reed, Anna. 1829. Life of George Washington. Philadelphia: American Sunday School Union. Reinen, I. J., and T. Plomp. 1993. “Some Gender Issues in Educational Computer Use: Results of an International Comparative Survey.” Computers and Education: An International Journal 20, no. 4: 353–365.
Reinier, Jacqueline S. 1996. From Virtue to Character: American Childhood, 1775–1850. New York: Twayne Publishers.
Reisner, Robert. 1971. Graffiti: Two Thousand Years of Wall Writing. New York: Cowles Book Company.
Remafedi, Gary. 1999. “Suicide and Sexual Orientation.” Archives of General Psychiatry 56: 885–886. Remondino, Peter C. 1891. History of Circumcision. Philadelphia: F. A. Davis. Restad, Penne. 1995. Christmas in America: A History. New York: Oxford University Press. Retherford, Robert D. 1975. The Changing Sex Differential in Mortality. Westport, CT: Greenwood. Rhode, Deborah. 1997. Speaking of Sex. Cambridge: Harvard University Press. Rhodes, Richard. 1999. Why They Kill: The Discoveries of a Maverick Criminologist. New York: Alfred A. Knopf.
Richards, Jeffrey H. 1995. Mercy Otis Warren. New York: Twayne Publishers.
Richardson, John, and Carl Simpson. 1982. “Children, Gender and Social Structure: An Analysis of the Contents of Letters to Santa Claus.” Child Development 53: 429–436.
Riess, Steven A. 1989. City Games: The Evolution of American Urban Society and the Rise of Sports. Urbana: University of Illinois Press.
———. 1995. Sport in Industrial America, 1850–1920. Wheeling, IL: Harlan Davidson. Riis, Jacob. 1890. How the Other Half Lives. 1997 Reprint, New York: Penguin. Riley, Patricia, ed. 1993. Growing Up Native American. New York: Avon Books.
Riordan, Cornelius. 1990. Girls and Boys in School: Together or Separate. New York: Teachers College Press.
———. 1999. “The Silent Gender Gap: Reading, Writing and Other Problems for Boys.” Education Week 19 (November): 46–49.
Risman, Barbara. 1999. Gender Vertigo. New Haven: Yale University Press. Ritter, Thomas J., and George C. Denniston. 1996. Say No to Circumcision: 40 Compelling Reasons. Aptos, CA: Hourglass Books. Ritvo, Harriet. 1987. The Animal Estate. Cambridge, MA: Harvard University Press. Road and Track. New York: Hachette Filipacchi Magazines. Roberts, Brian. 2000. American Alchemy: The California Gold Rush and Middle Class Culture. Chapel Hill: University of North Carolina Press.
Roberts, Randy. 1999. But They Can’t Beat Us: Oscar Robertson’s Crispus Attucks Tigers. Indianapolis: Indiana Historical Society. Robinson, Bryan E. 1988. Teenage Fathers. Lexington, MA: Lexington Books. Rod and Custom. Los Angeles: emap usa. Rodder’s Journal, The. Huntington Beach, CA: Rodder’s Journal. Rodgers, Joseph L., H. Harrington Cleveland, Edwin van den Oord, and David C. Rowe. 2000. “Resolving the Debate over Birth Order, Family Size, and Intelligence.” American Psychologist 55: 599–612. Rodkin, Philip C., Thomas W. Farmer, Ruth Pearl, and Richard Van Acker. 2000. “Heterogeneity of Popular Boys: Antisocial and Prosocial Configurations.” Developmental Psychology 36, no. 1 (January): 14–24. Roediger, David R. 1991. The Wages of Whiteness: Race and the Making of the American Working Class. New York: Verso. Rogers, Joseph L., and David C. Rowe. 1988. “Influence of Siblings on Adolescent Sexual Behavior.” Developmental Psychology 24: 722–728. Rogers, Naomi. 1992. Dirt and Disease: Polio before FDR. New Brunswick, NJ: Rutgers University Press. Rogers, Richard G. 1992. “Living and Dying in the U.S.A.: Sociodemographic Determinants among Blacks and Whites.” Demography 29: 287–303. Rogin, Michael. 1992. “Blackface, White Noise: The Jewish Jazz Singer Finds His Voice.” Critical Inquiry 18 (Spring): 417–453. Rohrbough, Malcolm. 1997. Days of Gold: The California Gold Rush and the
American Nation. Berkeley: University of California Press. Roosevelt, Theodore. 1913. Theodore Roosevelt: An Autobiography. 1985. Reprint, New York: Da Capo Press. Rorabaugh, William J. 1986. The Craft Apprentice: From Franklin to the Machine Age in America. New York: Oxford University Press. Roscoe, Will. 1991. The Zuni Man-Woman. Albuquerque: University of New Mexico Press. Rosen, Ruth. 1982. The Lost Sisterhood: Prostitution in America, 1900–1918. Baltimore: Johns Hopkins University Press. Rosenblum, Walter, Naomi Rosenblum, and Alan Trachtenberg. 1977. America and Lewis Hine: Photographs 1904–1940. Millerton, NY: Aperture. Rosengarten, Theodore, ed. 1986. Tombee: Portrait of a Cotton Planter. New York: Quill Press. Rosenheim, Margaret K., Franklin E. Zimring, David S. Tanenhaus, and Bernardine Dohrn, eds. 2001. A Century of Juvenile Justice. Chicago: University of Chicago Press. Rosenthal, Michael. 1984. The Character Factory: Baden-Powell and the Origins of the Boy Scout Movement. New York: Pantheon Books. Ross, Dorothy. 1972. G. Stanley Hall: The Psychologist as Prophet. Chicago: University of Chicago Press. Rossi, Alice S., and Peter H. Rossi. 1990. Of Human Bonding: Parent-Child Relations across the Life Course. New York: Aldine de Gruyter. Rothman, David J. 1980. Conscience and Convenience: The Asylum and Its Alternatives in Progressive America. Boston: Little, Brown.
Rotundo, Anthony. 1993. American Manhood: Transformations in Masculinity from the Revolution to the Modern Era. New York: Basic Books.
Rourke, Constance. 1928. Troupers of the Gold Coast; or, The Rise of Lotta Crabtree. New York: Harcourt Brace.
Rovetta, Catherine Humbargar, and Leon Rovetta. 1968. Teacher Spanks Johnny: A Handbook for Teachers. Stockton, CA: Willow House Publishers.
Rowe, David C., and Bill L. Gulley. 1992. “Sibling Effects on Substance Use and Delinquency.” Criminology 30: 217–233.
Rowling, J. K. 1997. Harry Potter and the Sorcerer’s Stone. New York: Scholastic.
Royce, Josiah. 1886. California: From the Conquest in 1846 to the Second Vigilance Committee in San Francisco. Boston: Houghton Mifflin.
Rudgley, Richard. 1994. Essential Substances: A Cultural History of Intoxicants in Society. New York: Kodansha International. Rudwick, Elliott M. 1969. W. E. B. Du Bois: Propagandist of the Negro Protest. New York: Atheneum. Ruger, A. 1869. Bird’s Eye View of Young America: Warren County, Illinois. Map, Warren County, IL. Library of Congress Map Division. Rushkoff, Douglas. 1996. Playing the Future: How Kids’ Culture Can Teach Us to Thrive in an Age of Chaos. New York: HarperCollins. Rutland, Robert Allen. 1995. A Boyhood in the Dust Bowl. Boulder: University Press of Colorado. Ryan, Caitlin, and Donna Futterman. 1998. Lesbian and Gay Youth: Care and Counseling. New York: Columbia University Press.
Ryan, Mary P. 1981. Cradle of the Middle Class: The Family in Oneida County, New York, 1790–1865. New York: Cambridge University Press.
———. 1982. The Empire of the Mother: Americans Writing about Domesticity, 1830 to 1860. New York: Institute for Research in History and Naworth Press.
Ryerson, Ellen. 1978. The Best-Laid Plans: America’s Juvenile Court Experiment. New York: Hill and Wang.
Sabin, Roger. 1996. Comics, Comix and Graphic Novels: A History of Comic Art. London: Phaidon.
Sachar, Louis. 1998. Holes. New York: Farrar, Straus and Giroux.
Sadie, Stanley, ed. 1980. The New Grove Dictionary of Music and Musicians. London: Macmillan.
Sadker, Myra, and David Sadker. 1994. Failing at Fairness: How Our Schools Cheat Girls. New York: Touchstone. Salinger, Sharon. 1987. “To Serve Well and Faithfully”: Labor and Indentured Servants in Pennsylvania 1682–1800. Cambridge, UK: Cambridge University Press. Saloutos, Theodore. 1964. The Greeks in the United States. Cambridge, MA: Harvard University Press. Sammons, Jeffrey T. 1990. Beyond the Ring: The Role of Boxing in American Society. Urbana: University of Illinois Press. Sanchez, Ellen, Trina Reed Robertson, Carol Marie Lewis, and Barri Rosenbluth. In press. “Preventing Bullying and Sexual Harassment in Elementary Schools: The Expect Respect Model.” Journal of Emotional Abuse. Sanchez, George I. 1940. Forgotten People: A Study of New Mexicans. Albuquerque: University of New Mexico Press.
Sanders, Jo. 1990. “Computer Equity for Girls: What Keeps It from Happening.” Pp. 181–185 in Fifth World Conference on Computers in Education in Sydney, Australia. Amsterdam: Elsevier Science Publishing. Sante, Luc. 1991. Low Life: Lures and Snares of Old New York. New York: Farrar, Straus and Giroux. Santino, Jack. 1983. “Halloween in America: Contemporary Customs and Performances.” Western Folklore 42, no. 1: 1–20. ———. 1994. Halloween and Other Festivals of Life and Death. Knoxville: University of Tennessee Press. ———. 1995. All around the Year: Holidays and Celebrations in American Life. Urbana: University of Illinois Press. Saroyan, William. 1952. The Bicycle Rider in Beverly Hills. New York: Scribner’s. Sartain, William. 1864. “Young America Crushing Rebellion and Sedition.” Engraving in Library of Congress Prints and Photographs Division. Savin-Williams, Ritch C. 1990. Gay, Lesbian, and Bisexual Youth: Expressions of Identity. Washington, DC: Hemisphere. ———. 1998. “. . . And Then I Became Gay”: Young Men’s Stories. New York: Routledge. Savin-Williams, Ritch C., and Kenneth M. Cohen. 1996. The Lives of Lesbians, Gays, and Bisexuals: Children to Adults. Fort Worth, TX: Harcourt Brace College Publishing. Saxton, Alexander. 1990. The Rise and Fall of the White Republic: Class Politics and Mass Culture in Nineteenth-Century America. New York: Verso. Schaffner, Laurie. 1999. Teenage Runaways: Broken Hearts and Bad Attitudes. New York: Haworth Press.
Scharff, Virginia. 1991. Taking the Wheel: Women and the Coming of the Motor Age. New York: Free Press. Scharnhorst, Gary, and Jack Bales. 1981. Horatio Alger, Jr.: An Annotated Bibliography of Comment and Criticism. Metuchen, NJ: Scarecrow Press. ———. 1985. The Lost Life of Horatio Alger, Jr. Bloomington: Indiana University Press. Schatz, Thomas. 1981. Hollywood Genres: Formulas, Filmmaking, and the Studio System. New York: Random House. Schechter, Harold. 1996. “A Short Corrective History of Violence in Popular Culture.” New York Times Magazine (July 7): 32–33. Schlegel, Alice, and Herbert Barry III. 1991. Adolescence: An Anthropological Inquiry. New York: Free Press. Schlossman, Steven L. 1977. Love and the American Delinquent: The Theory and Practice of “Progressive” Juvenile Justice, 1825–1920. Chicago: University of Chicago Press. ———. 1995. “Delinquent Children: The Juvenile Reform School.” In The Oxford History of the Prison. Edited by Norval Morris and David J. Rothman. New York: Oxford University Press. Schneider, Eric C. 1992. In the Web of Class: Delinquents and Reformers in Boston, 1810s–1930s. New York: New York University Press. ———. 1999. Vampires, Dragons, and Egyptian Kings: Youth Gangs in Postwar New York. Princeton: Princeton University Press. Schob, David E. 1975. Hired Hands and Plowboys: Farm Labor in the Midwest, 1815–1860. Urbana: University of Illinois Press.
Schoenfeld, Stuart. 1988. “Folk Judaism, Elite Judaism and the Role of the Bar Mitzvah in the Development of the Synagogue and Jewish School in America.” Contemporary Jewry 9, no. 1: 85.
“School Goals: Draft.” 2000. Culver Academies, October 27.
Schramm, Wilbur, Jack Lyle, and Edwin Parker. 1961. Television in the Lives of Our Children. Palo Alto, CA: Stanford University Press. Schrank, Robert. 1998. Wasn’t That a Time? Growing Up Radical and Red in America. Cambridge: MIT Press. Schultz, Stanley K. 1973. The Culture Factory: Boston Public Schools, 1789–1860. New York: Oxford University Press. Schulz, John A., and Douglas Adair, eds. 1966. The Spur of Fame: Dialogues of John Adams and Benjamin Rush, 1805–1813. San Marino, CA: Huntington Library. Schwartz, Marie Jenkins. 2000. Born in Bondage: Growing Up Enslaved in the Antebellum South. Cambridge, MA: Harvard University Press. Schwarz, Ira M. 1989. (In)Justice for Juveniles: Rethinking the Best Interest of the Child. Lexington, MA: Lexington Books. Schwarz, Ira M., ed. 1992. Juvenile Justice and Public Policy. Lexington, MA: Lexington Books. Schwieder, Dorothy. 1993. 75 Years of Service: Cooperative Extension in Iowa. Ames: Iowa State University Press. Scieszka, Jon. 1992. The Stinky Cheese Man and Other Fairly Stupid Tales. New York: Penguin. ———. 1996. The Time Warp Trio Series. New York: Penguin.
Secretary’s Commission on Achieving Necessary Skills (SCANS). 1991. What Work Requires of Schools: A SCANS Report for America 2000. Washington, DC: U.S. Department of Labor.
Sedlak, Andrea J., and Debra D. Broadhurst. 1996. Third National Incidence Study of Child Abuse and Neglect: Final Report. Washington, DC: U.S. Department of Health and Human Services.
Segerstrom, Suzanne, William McCarthy, and Nicholas Caskey. 1993. “Optimistic Bias among Cigarette Smokers.” Journal of Applied Social Psychology 23: 1606–1618. Seiter, Ellen. 1995. Sold Separately: Parents and Children in Consumer Culture. New Brunswick, NJ: Rutgers University Press. Sellers, Charles. 1991. The Market Revolution: Jacksonian America, 1815–1846. New York: Oxford University Press. Sellers, John R. 1974. “The Common Soldier in the American Revolution.” In Military History of the American Revolution. Edited by Betsy C. Kysley. Washington, DC: USAF Academy. Sendak, Maurice. 1963. Where the Wild Things Are. New York: HarperCollins. Serpell, James. 1986. In the Company of Animals. Oxford: Basil Blackwell. Sexuality Information and Education Council of the United States (SIECUS). 1995. SIECUS Position Statements on Sexuality Issues 1995. New York: SIECUS. Seymour, Harold. 1960. Baseball: The Early Years. New York: Oxford University Press. ———. 1990. Baseball: The People’s Game. New York: Oxford University Press.
Shakeshaft, C. 1986. “A Gender at Risk.” Phi Delta Kappan 67, no. 7: 499–503.
Shakur, Sanyika. 1993. Monster: The Autobiography of an LA Gang Member. New York: Penguin.
Shapiro, Jeremy, Rebekah L. Dorman, William H. Burkey, Carolyn J. Welker, and Joseph B. Clough. 1997. “Development and Factor Analysis of a Measure of Youth Attitudes toward Guns and Violence.” Journal of Clinical Child Psychology 26: 311–320. Shaw, Daniel S., Robert E. Emery, and Michele D. Tuer. 1993. “Parental Functioning and Children’s Adjustment in Families of Divorce: A Prospective Study.” Journal of Abnormal Clinical Psychology 21, no. 1 (February): 119–134. Sheff, David. 1993. Game Over: How Nintendo Zapped an American Industry, Captured Your Dollars, and Enslaved Your Children. New York: Random House. Shelden, Randall, Sharon Tracy, and William Brown. 1997. Youth Gangs in American Society. New York: Wadsworth. Sheon, Aaron. 1976. “The Discovery of Graffiti.” Art Journal 36, no. 1: 16–22. Sherman, Arloc. 1994. Wasting America’s Future: The Children’s Defense Fund Report on the Costs of Child Poverty. Boston: Beacon Press. Sherman, Miriam. 1986. “Children’s Allowances.” Medical Aspects of Human Sexuality 20, no. 4: 121–128. Shilling, Chris. 1993. The Body and Social Theory. London: Sage.
Shoup, Laurence, and Randall Milliken. 1999. Inigo of Rancho Posolmi: The Life and Times of a Mission Indian. Menlo Park: Ballena Press.
Shulman, Harry M. 1932. “Newsboys of New York: A Study of the Legal and Illegal Work Activities during 1931.” New York: Child Labor Committee, 13.
Siegel, Mark, Alison Landes, and Nancy Jacobs. 1995. Illegal Drugs and Alcohol: America’s Anguish. Wylie, TX: Information Plus.
Siks, Geraldine Brain, and Hazel Brain Dunnington, eds. 1967. Children’s Theatre and Creative Dramatics. Seattle: University of Washington Press. Silverman, Kenneth. 1976. A Cultural History of the American Revolution: Painting, Music, Literature, and the Theatre. New York: Thomas Y. Crowell. Simmons, Leo W. 1942. Sun Chief: The Autobiography of a Hopi Indian. New Haven, CT: Yale University Press. Simmons, William S. 1986. Spirit of the New England Tribes: Indian History and Folklore, 1620–1984. Hanover: University Press of New England. Simon, David, and Edward Burns. 1997. The Corner: A Year in the Life of an Inner City Neighborhood. New York: Broadway. Simpson, Marc. 1994. The Rockefeller Collection of American Art at the Fine Arts Museums of San Francisco. San Francisco: Fine Arts Museums of San Francisco. Simpson, Wayne. 1987. “Hockey.” Pp. 169–229 in A Concise History of Sport in Canada. Edited by Don Morrow, Mary Keyes, Wayne Simpson, Frank Cosentino, and Ron Lappage. Toronto: Oxford University Press.
Singer, Ben. 1992. “A New and Urgent Need for Stimuli: Sensational Melodrama and Urban Modernity.” Paper presented at the Melodrama Conference, British Film Institute, London.
Sinyard, Neil. 1992. Children in the Movies. New York: St. Martin’s Press.
Sjostrom, Lisa, and Nan D. Stein. 1995. Bullyproof: A Teacher’s Guide on Teaching and Bullying for Use with Fourth and Fifth Grade Students. Wellesley, MA: Wellesley College Center for Research on Women. Sjovold, Carl-Petter. 1999. “An Angling People: Nature, Sport and Conservation in Nineteenth-Century America.” Ph.D. diss., University of California at Davis. Skal, David J. 1993. The Monster Show: A Cultural History of Horror. New York: Penguin. ———. 1998. Screams of Reason: Mad Science and Modern Culture. New York: W. W. Norton. Skiing Heritage: Journal of the International Skiing History Association. 1989– . Quarterly. 499 Town Hill Road, New Hartford, CT. Skolnick, Jerome H., Theodore Correl, Elizabeth Navarro, and Roger Rabb. 1988. The Social Structure of Street Drug Dealing. Unpublished report to the Office of the Attorney General of the State of California. Berkeley: University of California at Berkeley. Skoloff, Gary, et al. 1995. To Win the War: Home Front Memorabilia of World War II. Missoula, MT: Pictorial Publishing. Sleeter, Christine. 1986. “Learning Disabilities: The Social Construction of a Special Education Category.” Exceptional Children 53: 46–54. Slide, Anthony. 1994. The Encyclopedia of Vaudeville. Westport, CT: Greenwood Press. Slotkin, Richard. 1992. Gunfighter Nation: The Myth of the Frontier in Twentieth-Century America. New York: Atheneum. Smith, Abbott Emerson. 1947. Colonists in Bondage: White Servitude and Convict
Labor in America, 1607–1776. New York: W. W. Norton. Smith, Adam. 1776. Wealth of Nations. 1937. Reprint, New York: Modern Library. Smith, I. Evelyn. 1947. “Adoption.” Pp. 22–27 in Social Work Year Book 9. New York: Russell Sage Foundation. Smith, John. 1907. The Generall Historie of Virginia, New England and the Summer Isles Together with the True Travels, Adventures and Observations, and a Sea Grammar. Vol. 1. Glasgow: J. Maclehose and Sons. Smith, Kristin. 2000. Who’s Minding the Kids? Child Care Arrangements. Washington, DC: U.S. Department of Commerce, Economic and Statistics Administration, U.S. Census Bureau. Smith, Lewis W., and Gideon L. Blough. 1929. Planning a Career: A Vocational Civics. New York: American Book Company. Smith, Page. 1962. John Adams. Vol. 1. Garden City, NY: Doubleday. Smith, Peter K. 1991. “The Silent Nightmare: Bullying and Victimization in School Peer Groups.” The Psychologist: Bulletin of the British Psychological Society 4: 243–248. Smith, Robert A. 1972. A Social History of the Bicycle: Its Early Life and Times in America. New York: American Heritage Press. Smith, Ronald A. 1990. Sports and Freedom: The Rise of Big-Time College Athletics. New York: Oxford University Press. Snarey, John. 1993. How Fathers Care for the Next Generation: A Four-Decade Study. Cambridge, MA: Harvard University Press.
Snow, Richard. 1989. Coney Island: A Postcard Journey to the City of Fire. New York: Brightwater Press.
Snyder, H. N., et al. 1993. Juvenile Court Statistics 1990. Washington, DC: U.S. Department of Justice.
Snyder, Robert W. 1989. The Voice of the City: Vaudeville and Popular Culture in New York. New York: Oxford University Press.
Solomon, Robert C. 1974. "Sexual Paradigms." The Journal of Philosophy 71 (June): 336–345.
Sommers, Christina H. 2000. "The War against Boys." The Atlantic Monthly (May): 59–74.
Sonenstein, Freya L., Kellie Stewart, Laura Duberstein Lindberg, Marta Pernas, and Sean Williams. 1997. Involving Males in Preventing Teen Pregnancy: A Guide for Program Planners. Washington, DC: Urban Institute.
Southern, Eileen. 1971. The Music of Black Americans. New York: W. W. Norton.
Sowerby, Millicent E., comp. 1952–1959. Catalogue of the Library of Thomas Jefferson. Washington, DC: Library of Congress.
Spencer, Lyle M., and Robert K. Burns. 1943. Youth Goes to War. Chicago: Science Research Associates.
Spertus, Ellen. 1991. "Why Are There So Few Female Computer Scientists?" AI Lab Technical Report 1315. Cambridge, MA: MIT Artificial Intelligence Laboratory (August).
Spigel, Lynn. 1991. "From Domestic Space to Outer Space: The 1960s Fantastic Family Sitcom." In Close Encounters: Film, Feminism, and Science Fiction. Edited by Constance Penley, Elisabeth Lyon, Lynn Spigel, and Janet Bergstrom. Minneapolis: University of Minnesota Press.
———. 1992. Make Room for TV: Television and the Family Ideal in Postwar America. Chicago: University of Chicago Press.
Spiller, Robert E. 1971. "Emerson's 'The Young American.'" Clio 1: 37–41.
Spinelli, Jerry. 1982. Space Station Seventh Grade. Toronto: Little, Brown.
Spitz, Rene A. 1952. "Authority and Masturbation: Some Remarks on a Bibliographical Investigation." The Psychoanalytic Quarterly 21 (October): 490–527.
Spitzer, Robert J. 1999. "The Gun Dispute." American Educator 23: 10–15.
Sponsler, C. 1993. "Juvenile Prostitution Prevention Project." WHISPER 13, no. 2: 3–4.
Sports Illustrated for Kids Omnibus Study. 1989. Cited on p. 29 in Kids as Customers: A Handbook of Marketing to Children. Edited by James U. McNeal. New York: Lexington Books, 1992.
Spring, Joel. 1974. "Mass Culture and School Sports." History of Education Quarterly 14 (Winter): 483–499.
Stack, Herbert J. 1946. "Greater Safety for Our Youth: An American Opportunity." Journal of Educational Sociology 20, no. 2: 114–123.
Standing Bear, Luther. 1933. Land of the Spotted Eagle. 1978. Reprint, Lincoln: University of Nebraska Press.
Statistics Research Group. 1997. U.S. Pet Ownership and Demographics Sourcebook. Schaumburg, IL: American Veterinary Medical Association.
Stefanko, Michael. 1984. "Trends in Adolescent Research: A Review of Articles Published in Adolescence." Adolescence 19, no. 73: 1–13.
Steig, William. 1971. Amos and Boris. New York: Farrar, Straus and Giroux.
———. 1976. Abel's Island. Toronto: Collins Publishing.
Stein, Charles W., ed. 1984. American Vaudeville as Seen by Its Contemporaries. New York: Alfred A. Knopf.
Stein, Mark A. 1987. "Carriers—The Young Are Fading." Los Angeles Times, April 10, 1, 30–31.
Stein, Nan D. 1999. Classrooms and Courtrooms: Facing Sexual Harassment in K–12 Schools. New York: Teachers College Press.
———. In press. "What a Difference a Discipline Makes." Journal of Emotional Abuse.
Steinberg, Laurence. 1982. "Jumping off the Work Experience Bandwagon." Journal of Youth and Adolescence 11, no. 3: 183–205.
———. 1990. "Autonomy, Conflict, and Harmony in the Family Relationship." Pp. 255–276 in At the Threshold: The Developing Adolescent. Edited by Shirley Feldman and Glen R. Elliott. Cambridge, MA: Harvard University Press.
Steinberg, Laurence, Suzanne Fegley, and Sanford M. Dornbusch. 1993. "Negative Impact of Part-time Work on Adolescent Adjustment: Evidence from a Longitudinal Study." Developmental Psychology 29, no. 2: 171–180.
Steinberg, Laurence, and Shelli Avenevoli. 1998. "Disengagement from School and Problem Behaviors in Adolescence: A Developmental-Contextual Analysis of the Influence of Family and Part-time Work." Pp. 392–424 in New Perspectives on Adolescent Risk Behavior. Edited by Richard Jessor. New York: Cambridge University Press.
Stephan, Walter G., and Cookie W. Stephan. 1989. "Antecedents of Intergroup Anxiety in Asian-Americans and Hispanic-Americans." International Journal of Intercultural Relations 13: 203–219.
Stets, Joan E., and Murray A. Straus. 1990. "Gender Differences in Reporting Marital Violence and Its Medical and Psychological Consequences." Pp. 151–165 in Physical Violence in American Families: Risk Factors and Adaptations to Violence in 8,145 Families. Edited by M. A. Straus and R. J. Gelles. New Brunswick, NJ: Transaction.
Stevenson, Brenda E. 1996. Life in Black and White: Family and Community in the Slave South. New York: Oxford University Press.
Stewart, Jack. 1989. "Subway Graffiti: An Aesthetic Study of Graffiti on the Subway System of New York City, 1970–1978." Ph.D. diss., New York University.
Stine, R. L. 1995. Goosebumps. New York: Apple/Scholastic.
Stocker, Clare, and Judy Dunn. 1991. "Sibling Relationships in Childhood: Links with Friendships and Peer Relationships." British Journal of Developmental Psychology 8: 227–244.
Stocker, Clare, Judy Dunn, and Robert Plomin. 1989. "Sibling Relationships: Links with Child Temperament, Maternal Behavior, and Family Structure." Child Development 60: 715–727.
Stoddard, John F. 1866. The American Intellectual Arithmetic. New York: Sheldon.
Stoneman, Zolinda, Gene H. Brody, and Carol MacKinnon. 1986. "Same-sex and Cross-sex Siblings: Activity Choices, Roles, Behavior, and Gender Stereotypes." Sex Roles 15: 495–511.
Stormshak, Elizabeth A., Christina J. Bellanti, and Karen L. Bierman. 1996. "The Quality of Sibling Relationships and the Development of Social Competence and Behavioral Control in Aggressive Children." Developmental Psychology 32: 79–89.
Stouthamer-Loeber, Magda, and Evelyn H. Wei. 1998. "The Precursors of Young Fatherhood and Its Effect on Delinquency of Teenage Males." Journal of Adolescent Health 22: 56–65.
Stowe, Catherine M. 1978. "The National Junior and Boys Tennis Championships (June)." Unpublished history project, Kalamazoo, MI.
Strasburger, Victor, and Don Greydanus, eds. 1990. Adolescent Medicine: The At-Risk Adolescent. Philadelphia: Hanley and Belfus.
Straus, Murray A. 1992. Children as Witnesses to Marital Violence: A Risk Factor for Lifelong Problems among a Nationally Representative Sample of American Men and Women. Report of the 23d Ross Roundtable. Columbus, OH: Ross Laboratories.
Straus, Murray A., and Richard J. Gelles, eds. 1990. Physical Violence in American Families: Risk Factors and Adaptations to Violence in 8,145 Families. New Brunswick, NJ: Transaction.
Street Rodder. Anaheim, CA: McMullen Argus/PRIMEDIA Publishers.
Sung, Betty Lee. 1967. Mountain of Gold: The Story of the Chinese in America. New York: Macmillan.
Sutherland, Edwin. 1973. "Susceptibility and Differential Association." Pp. 42–43 in Edwin H. Sutherland on Analyzing Crime. Edited by K. Schuessler. Chicago: University of Chicago Press.
Sutton-Smith, Brian, Jay Mechling, Thomas W. Johnson, and Felicia R.
McMahon, eds. 1999. Children's Folklore: A Source Book. Logan: Utah State University Press.
Swearer, Susan, and Beth Doll. In press. "Bullying in Schools: An Ecological Framework." Journal of Emotional Abuse.
Szasz, Margaret Connell. 1985. "Native American Children." Pp. 311–332 in American Childhood: A Research Guide and Historical Handbook. Edited by Joseph M. Hawes and N. Ray Hiner. Westport, CT: Greenwood Press.
Tadman, Michael. 1989. Speculators and Slaves: Masters, Traders, and Slaves in the Old South. Madison: University of Wisconsin Press.
Talbot, Margaret. 2000. "The Maximum Security Adolescent." New York Times Magazine, September 10.
Tanenhaus, David S. 1998–1999. "Justice for the Child: The Beginning of the Juvenile Court in Chicago." Chicago History 27: 4–19.
Tanner, J. M. 1971. "Sequence, Tempo, and Individual Variations in Growth and Development of Boys and Girls Aged Twelve to Sixteen." In Twelve to Sixteen. Edited by Jerome Kagan. New York: Norton.
Tarratt, Margaret. 1970. "Monsters from the Id." Pp. 330–349 in Film Genre Reader II. Edited by Barry Keith Grant. Austin: University of Texas Press.
Tate, Cassandra. 1999. Cigarette Wars. New York: Oxford University Press.
Tattum, Delwyn P., ed. 1993. Understanding and Managing Bullying. Oxford: Heinemann.
Tattum, Delwyn P., and David A. Lane, eds. 1988. Bullying in Schools. Stoke-on-Trent, Staffordshire, UK: Trentham Books.
Tawa, Nicholas E. 2000. High Minded and Low Down: Music in the Lives of Americans, 1800–1861. Northeastern University Press.
Taylor, Dwight. 1962. Blood-and-Thunder. New York: Atheneum.
Taylor, Henry C. 1970. Tarpleywick: A Century of Iowa Farming. Ames: Iowa State University Press.
Teenage Research Unlimited. 2000. Teens Spend $153 Billion in 1999. Northbrook, IL: Teenage Research Unlimited.
Teitelbaum, Kenneth. 1993. Schooling for "Good Rebels": Socialist Education for Children in the United States, 1900–1920. Philadelphia: Temple University Press.
Telander, Rick. 1976. Heaven Is a Playground. New York: St. Martin's Press.
Terkel, Studs. 1970. Hard Times: An Oral History of the Great Depression. New York: Pantheon.
Teti, Douglas M. In press. "Sibling Relationships." In Interiors: Retrospect and Prospect in the Psychological Study of Families. Edited by J. McHale and W. Grolnick. Mahwah, NJ: Erlbaum.
Thai, H. C. 1999. "'Splitting Things in Half Is So White!': Conceptions of Family Life and Friendship and the Formation of Ethnic Identity among Second Generation Vietnamese Americans." Amerasia Journal 25, no. 1: 53–88.
Theis, Sophie van Senden. 1937. "Adoption." Pp. 23–25 in Social Work Year Book 4. New York: Russell Sage Foundation.
"Then and Now: Newspaper Distributing in Detroit in the '50s." 1896. Friend Palmer Scrapbook (Detroit Public Library) 13 (May 26): 70.
Thomas, Glyn V., and A. M. Silk. 1990. An Introduction to the Psychology of Children's Drawings. New York: New York University Press.
Thomas, J. L. 1990. "The Grandparent Role: A Double Bind." International Journal of Aging and Human Development 31: 169–177.
Thomas, John C., ed. 1825. "Memoirs of Stephen Allen." New York: New York Historical Society.
Thomas, Keith. 1983. Man and the Natural World: Changing Attitudes in England, 1500–1800. London: Allen Lane.
Thompson, Warren S. 1949. "The Demographic Revolution in the United States." Annals of the American Academy of Political and Social Sciences, no. 262.
Thoreau, Henry David. 1962. Walden, or, Life in the Woods. New York: Time.
Thornberry, Terence P., Carolyn A. Smith, and Gregory J. Howard. 1997. "Risk Factors for Teenage Fatherhood." Journal of Marriage and the Family 59: 505–522.
Thorne, Barrie. 1993. Gender Play: Girls and Boys in School. New Brunswick, NJ: Rutgers University Press.
Thrasher, Frederic. 1963. The Gang: A Study of 1,313 Gangs in Chicago. Rev. ed. Chicago: University of Chicago Press.
Tobin, Joseph. 2000. "Good Guys Don't Wear Hats": Children Talk about the Media. New York: Teachers College Press.
Toll, Robert C. 1974. Blacking Up: The Minstrel Show in Nineteenth-Century America. New York: Oxford University Press.
———. 1976. On with the Show! The First Century of Show Business in America. New York: Oxford University Press.
Tomes, Nancy. 1998. The Gospel of Germs: Men, Women, and the Microbe in American Life. Cambridge, MA: Harvard University Press.
Torr, James, ed. 2000. Alcoholism. San Diego, CA: Greenhaven Press.
Townsend, John Rowe. 1971. A Sense of Story: Essays on Contemporary Writers for Children. Philadelphia: J. B. Lippincott.
Trattner, Walter I. 1970. Crusade for the Children: A History of the National Child Labor Committee and Child Labor Reform in America. Chicago: Quadrangle Books.
Triay, Victor Andres. 1998. Fleeing Castro: Operation Pedro Pan and the Cuban Children's Program. Gainesville: University Press of Florida.
"The Trout Brook." 1847. Spirit of the Times 17, June 19.
Tuan, Yi-Fu. 1984. Dominance and Affection: The Making of Pets. New Haven: Yale University Press.
Tuhy, Carrie. 1981. "The Star Wars Generation Takes on Inflation." Money 11 (July): 88–96.
Turkle, S. 1984. The Second Self: Computers and the Human Spirit. New York: Simon and Schuster.
Tuttle, William M. 1993. "Daddy's Gone to War": The Second World War in the Lives of America's Children. New York: Oxford University Press.
Twain, Mark (Samuel Clemens). 1946. The Adventures of Tom Sawyer. New York: Grosset and Dunlap.
———. 1980. Roughing It. Reprint, New York: New American Library.
Ulrich, Laurel Thatcher. 1999. "Sheep in the Parlor, Wheels on the Common: Pastoralism and Poverty in Eighteenth-Century Boston." Pp. 182–200 in Inequality in Early America. Edited by Carla Gardina Pestana and Sharon V. Salinger. Hanover: University Press of New England.
Uncle John. 1848. Boys' Own Book of Sports, Birds, and Animals. New York: Leavitt and Allen.
U.S. Census Bureau. 2000. Statistical Abstract of the United States: 1999. Washington, DC: Government Printing Office.
U.S. Census Bureau. "Marital Status and Living Arrangements," http://www.census.gov/population/www/socdemo/ms-la.html.
U.S. Congress. House of Representatives. Committee on Immigration and Naturalization. 1939. Admission of German Refugee Children. Hearings before the Committee on Immigration and Naturalization, House of Representatives, 76th Congress, 1st Session on H.J. Res. 165 and H.J. Res. 168, Joint Resolutions to Authorize the Admission to the United States of a Limited Number of German Refugee Children. May 24–June 1, 1939. Washington, DC: Government Printing Office.
U.S. Department of Education. 2000. Career Clusters: Adding Relevancy to Education. Pamphlet. Washington, DC: U.S. Department of Education.
———. 2000. "The Federal Role in Education," http://www.ed.gov/offices/OUS/fedrole.html (accessed March 28).
U.S. Department of Education, National Center for Education Statistics. 1998. Violence and Discipline Problems in U.S. Public Schools: 1996–1997. NCES 98-030. Washington, DC: U.S. Government Printing Office.
U.S. Departments of Education and Justice. 2000. Indicators of School Crime and Safety, 2000. NCES 2001-017/NCJ 184176. Washington, DC: U.S. Government Printing Office.
U.S. Immigration and Naturalization Service. 1999. 1997 Statistical Yearbook. Washington, DC: Government Printing Office.
U.S. Immigration Commission. 1911. Abstracts of Reports of the Immigration Commission. Washington, DC: Government Printing Office.
U.S. Lawn Tennis Association. 1931. Fifty Years of Lawn Tennis in the United States. New York: USLTA.
———. 1972. Official Encyclopedia of Tennis. New York: Harper and Row.
U.S. Scouting Service Project. 2000. "BSA Declaration of Religious Principle," http://www.usscouts.org/aboutbsa/rp.html (accessed May 29, 2000).
U.S. Tennis Association. 1995– . Tennis Yearbook. Lynn, MA: H. O. Zimman.
Uys, Errol Lincoln. 1999. Riding the Rails: Teenagers on the Move during the Great Depression. New York: TV Books.
Venezky, Richard L. 1992. "Textbooks in School and Society." Pp. 436–461 in Handbook of Research on Curriculum. Edited by Philip W. Jackson. New York: Macmillan.
Vey, Shauna. 1998. "Protecting Childhood: The Campaign to Bar Children from Performing Professionally in New York City, 1874–1919." Ph.D. diss., City University of New York.
Vickers, Daniel. 1994. Farmers and Fishermen: Two Centuries of Work in Essex County, Massachusetts, 1630–1850. Chapel Hill: University of North Carolina Press.
Videogames.com. "The History of Video Games," http://www.videogames.com/features/universal/hov/ (accessed December 27, 2000).
Viken, James P. 1978. "The Sport of Drag Racing and the Search for Satisfaction, Meaning, and Self." Ph.D. diss., University of Minnesota.
"Voices from the Combat Zone: Game Grrrlz Talk Back." 1998. Pp. 328–341 in From Barbie to Mortal Kombat: Gender and Computer Games. Edited by Henry Jenkins and Justine Cassell. Cambridge, MA: MIT Press.
Volling, B. L., and Jay Belsky. 1992. "The Contribution of Mother-Child and Father-Child Relationships to the Quality of Sibling Interaction: A Longitudinal Study." Child Development 63: 1209–1222.
Vorrasi, Joseph A., John J. Eckenrode, and Charles V. Izzo. 2000. Intergenerational Transmission of Marital Violence: A Gender-Similarity Hypothesis. Paper presented at the Fifth International Conference on the Victimization of Children and Youth, Durham, NH.
Voyer, Daniel, Susan Voyer, and M. P. Bryden. 1995. "Magnitude of Sex Differences in Spatial Abilities: A Meta-Analysis and Consideration of Critical Variables." Psychological Bulletin 117: 250–270.
Wager-Fisher, Mary. 1880. "The Philadelphia Newsboys." Wide Awake 11, no. 1 (July): 16, 18.
Wagner, Bob. 1992. The Nationals and How They Grew in Kalamazoo. Kalamazoo, MI: J-B Printing.
Wagner, Carolyn Ditte. 1979. "The Boy Scouts of America: A Model and a Mirror of American Society." Ph.D. diss., Johns Hopkins University.
Wagner, Mazie E., Herman J. P. Schubert, and Daniel S. P. Schubert. 1979. "Sibship-Constellation Effects on Psychological Development, Creativity, and Health." Advances in Child Development and Behavior 14: 57–148.
Waldorf, D. 1994. "Drug Use and HIV Risk among Male Sex Workers: Results of Two Samples in San Francisco." Pp. 114–131 in The Context of HIV Risk among Drug Users and Their Sexual Partners. Edited by R. J. Battjes, Z. Sloboda, and W. C. Grace. NIDA Research Monograph. Rockville, MD: National Institute on Drug Abuse.
Walett, Francis G., ed. 1974. The Diary of Ebenezer Parkman, 1703–1782. Worcester, MA: American Antiquarian Society.
Walker, Bonnie L., ed. 1996. Injury Prevention for Young Children: A Research Guide. Westport, CT: Greenwood Press.
Walker, Williston, Richard A. Norris, David W. Lotz, and Robert T. Handy. 1985. A History of the Christian Church. 4th ed. New York: Charles Scribner's Sons.
Wall, Helena. 1990. Fierce Communion: Family and Community in Early America. Cambridge, MA: Harvard University Press.
Wallace, Anthony. 1966. Religion: An Anthropological View. New York: Random House.
Wallerstein, J. S., and J. B. Kelly. 1980. Surviving the Break-up: How Children and Parents Cope with Divorce. New York: Basic Books.
Wallerstein, J. S., S. B. Corbin, and J. M. Lewis. 1988. "Children of Divorce: A 10-Year Study." In Impact of Divorce, Single Parenting, and Stepparenting on Children. Edited by E. M. Hetherington and Josephine D. Arasteh. Hillsdale, NJ: Erlbaum.
Walsh, Mark. 2000. "Hazing Is Widespread, Student Survey Shows." Education Week 20, no. 1 (September 6): 14.
Walters, Pamela Barnhouse, and Phillip J. O’Connell. 1988. “The Family Economy, Work, and Educational Participation in the United States 1890–1940.” American Journal of Sociology 93: 1116–1152. Walworth, Arthur. 1938. School Histories at War. Cambridge: Harvard University Press. Ward Platt, M. P., and R. A. Little. 1998. Injury in the Young. Cambridge, UK: Cambridge University Press. Ward, Paul. 1875. “Street Arabs: Bootblacks and Newsboys.” Oliver Optic’s Magazine 18 (December): 949. Ward, Winifred. 1958. Theatre for Children. Anchorage, KY: Children’s Theatre Press. Warner, Beth S., Mark D. Weist, and Amy Krulak. 1999. “Risk Factors for School Violence.” Urban Education 34: 52–68. Washington, Booker T. 1900. The Story of My Life and Work. In The Booker T. Washington Papers. Vol. 1, The Autobiographical Writings. Edited by Louis R. Harlan. Chicago: J. L. Nichols. 1972. Reprint, Urbana: University of Illinois Press. ———. 1901. Up from Slavery: An Autobiography. In The Booker T. Washington Papers. Vol. 1, The Autobiographical Writings. Edited by Louis R. Harlan. New York: Doubleday, Page. 1972. Reprint, Urbana: University of Illinois Press. Watkins, T. H. 1999. The Hungry Years: A Narrative History of the Great Depression in America. New York: Henry Holt. Webb, Lester Austin. 1958. “The Origins of Military Schools in the United States Founded in the Nineteenth Century.” Ph.D. diss., School of Education, University of North Carolina.
Webber, Thomas L. 1978. Deep Like the Rivers: Education in the Slave Quarter Community, 1831–1865. New York: W. W. Norton. Weinstein, Neil. 1987. Taking Care: Understanding and Encouraging SelfProtective Behavior. New York: Cambridge University Press. Weir, La Vada. 1977. Skateboards and Skateboarding. New York: Pocket Books. Weisberg, D. K. 1985. Children of the Night: A Study of Adolescent Prostitution. Lexington: D.C. Heath. Weisman, Adam M., and Kaushal K. Sharma. 1997. “Parricide and Attempted Parricide: Forensic Data and Psychological Results.” In The Nature of Homicide: Trends and Changes. Washington, DC: U.S. Department of Justice, Office of Justice Programs, National Institute of Justice. Weisman, Mary Lou. 1994. “When Parents Are Not in the Best Interests of the Child.” Atlantic Monthly 274, no. 1 (July): 42–63. Wekerle, Christine, and David A. Wolfe. 1996. “Child Maltreatment.” In Child Psychopathology. Edited by E. J. Mash and R. A. Barkley. New York: Guilford Press. Wellesley College Center for Research on Women. 1992. The AAUW Report: How Schools Shortchange Girls—A Study of Major Findings on Girls and Education. Washington, DC: AAUW Educational Foundation. Wellman, Henry, and Susan Gelman. 1992. “Cognitive Development: Foundational Theories of Core Domains.” Annual Review of Psychology 43: 337–376. Welsh, Ralph S. 1976. “Severe Parental Punishment and Delinquency: A Developmental Theory.” Journal of Clinical Child Psychology 5, no. 1: 17–21.
Werner, Emmy E. 1998. Reluctant Witnesses: Children's Voices from the Civil War. Boulder, CO: Westview Press.
———. 2000. Through the Eyes of Innocents: Children Witnesses of World War II. Boulder, CO: Westview Press.
Wertham, Fredric. 1953. Seduction of the Innocent. New York: Rinehart.
———. 1996. "The Psychopathology of Comic Books." American Journal of Psychotherapy 50, no. 4 (Fall): 472–490.
Wessel, Thomas, and Marilyn Wessel. 1982. 4-H: An American Idea 1900–1980. Chevy Chase: National 4-H Council.
West, Candace, and Don Zimmerman. 1987. "Doing Gender." Gender and Society 1, no. 2.
West, Elliott. 1983. "Heathens and Angels: Childhood in the Rocky Mountain Mining Towns." Western Historical Quarterly (April): 145–164.
———. 1989. Growing Up with the Country: Childhood on the Far-Western Frontier. Albuquerque: University of New Mexico Press.
———. 1996. Growing Up in Twentieth-Century America: A History and Reference Guide. Westport, CT: Greenwood Press.
West, Elliott, and Paula Petrik, eds. 1992. Small Worlds: Children and Adolescents in America, 1850–1950. Lawrence: University Press of Kansas.
West, Richard. 1987. Television Westerns: Major and Minor Series, 1946–1978. Jefferson, NC: McFarland.
Westbrook, Robert. 1987. "Lewis Hine and the Two Faces of Progressive Photography." Tikkun 2 (April–May): 24–29. Reprinted in Leon Fink, ed. 2001. Major Problems in the Gilded Age and Progressive Era. 2d ed. Boston: Houghton Mifflin.
Wheeler, Raymond, and Francis Perkins. 1932. Principles of Mental Development. New York: Thomas Y. Crowell.
Wheeler, T. T., S. P. McGorray, L. Yorkiewicz, S. D. Keeling, and C. J. King. 1994. "Orthodontic Treatment Demand and Need in Third- and Fourth-Grade Schoolchildren." American Journal of Orthodontics and Dentofacial Orthopedics 106, no. 1: 22–33.
Wheeler, Tom. 1990. American Guitars: An Illustrated History. New York: Harper.
"Where Is Young Life?" 2001. http://www.younglife.org.
Whipple, Edward G., and Eileen G. Sullivan, eds. 1998. "New Challenges for Greek Letter Organizations: Transforming Fraternities and Sororities into Learning Communities." New Directions for Student Services, no. 81. San Francisco: Jossey-Bass.
Whisnant, David E. 1971. "Selling the Gospel News, or the Strange Career of Jimmy Brown the Newsboy." Journal of Social History 5, no. 3: 269–309.
Whitbeck, Les B., and Dan R. Hoyt. 1999. Nowhere to Grow: Homeless and Runaway Adolescents and Their Families. New York: Aldine de Gruyter.
White, Jerry. 1983. The Church and the Parachurch: An Uneasy Marriage. Portland: Multnomah Press.
White, Phillip, and James Gillett. 1994. "Reading the Muscular Body: A Critical Decoding of Advertisements in Flex Magazine." Sociology of Sport Journal 11: 18–39.
White, Ryan, and Ann Marie Cunningham. 1991. Ryan White: My Own Story. New York: Dial Press.
White, Shane, and Graham J. White. 1999. Stylin': African American Expressive Culture, from Its Beginnings to the Zoot Suit. Ithaca: Cornell University Press.
White, Timothy. 1990. Rock Lives: Profiles and Interviews. New York: Holt.
Whitfield, H. N., J. D. Frank, G. Williams, and J. A. Vale, eds. 1999. "Circumcision: BJU Supplement 1." BJU International 83, no. 1 (January).
Whitney, Irene, and Peter K. Smith. 1993. "A Survey of the Nature and Extent of Bullying in Junior/Middle and Secondary Schools." Educational Research 31, no. 1: 3–25.
Whitton, Blair. 1981. American Clockwork Toys, 1862–1900. Exton, PA: Schiffer Publishing.
Widmer, Edward L. 1998. Young America: The Flowering of Democracy in New York City. New York: Oxford University Press.
Wienke, Chris. 1998. "Negotiating the Male Body: Men, Masculinity, and Cultural Ideals." The Journal of Men's Studies 6, no. 2: 255–282.
Wilder, Laura Ingalls. 1961. Farmer Boy. New York: HarperCollins.
Wilentz, Sean. 1984. Chants Democratic: New York City and the Rise of the American Working Class, 1788–1850. New York: Oxford University Press.
Wilks, Corinne, and Catherine Melville. 1990. "Grandparents in Custody and Access Disputes." Journal of Divorce and Remarriage 13: 36–42.
Williams, Paul N. 1972. "Boys Town, America's Wealthiest City?" Sun Newspapers of Omaha. Special report, March 30.
Wilmeth, Don, with Tice L. Miller. 1996. Cambridge Guide to American Theatre. Cambridge: Cambridge University Press.
Wilson, Douglas L., ed. 1989. Jefferson's Literary Commonplace Book. Princeton: Princeton University Press. In Julian P. Boyd et al., eds. 1950– . The Papers of Thomas Jefferson. Princeton: Princeton University Press.
Wilson, Edward O. 1978. On Human Nature. Cambridge, MA: Harvard University Press.
Winkler, Karl Tilman. 1996. "Reformers United: The American and the German Juvenile Court, 1882–1923." Pp. 235–274 in Institutions of Confinement: Hospitals, Asylums, and Prisons in Western Europe and North America 1550–1900. Edited by Norbert Finzsch and Robert Juette. Cambridge: Cambridge University Press.
Winston, R. B., Jr., William R. Nettles III, and John H. Opper Jr., eds. 1987. "Fraternities and Sororities on the Contemporary College Campus." New Directions for Student Services, no. 40. San Francisco: Jossey-Bass.
Winters, Paul, ed. 1997. Teen Addiction. San Diego, CA: Greenhaven Press.
Wojciechowska, Maia. 1964. Shadow of a Bull. New York: Simon and Schuster.
Wolfenstein, Martha. 1954. Children's Humor: A Psychological Analysis. Bloomington: Indiana University Press.
Wolfgang, Marvin, and Franco Ferracuti. 1967. The Subculture of Violence: Toward an Integrated Theory in Criminology. London: Tavistock.
Wood, Robin. 1986. Hollywood from Vietnam to Reagan. New York: Columbia University Press.
Woodward, Calvin M. 1889. "The Results of the St. Louis Manual Training School." Journal of Proceedings and Addresses. Session of the year 1889, held in Nashville, TN, National Education Association.
Woolery, George. 1983. Children's Television: The First Twenty-Five Years. Metuchen, NJ: Scarecrow.
Worrell, Estelle Ansley. 1980. Children's Costume in America 1607–1910. New York: Charles Scribner's Sons.
Wright, Esmond, ed. 1989. Benjamin Franklin: His Life as He Wrote It. Cambridge, MA: Harvard University Press.
Wyatt-Brown, Bertram. 1982. Southern Honor: Ethics and Behavior in the Old South. New York: Oxford University Press.
Yablonsky, Lewis, and Jonathan Brower. 1979. The Little League Game: How Kids, Coaches, and Parents Really Play It. New York: New York Times Books.
Yep, Laurence. 1975. Dragonwings. New York: HarperCollins.
YMCA of the USA. 2001. "YMCA's at a Glance," http://www.ymca.net (accessed May 14, 2001).
"Yo, It's Time for Braces." 2001. http://tqjunior.thinkquest.org/5029/ (accessed March 2001).
Yoshimi, Jeff. 1997. "Hapas at a Los Angeles High School: Context and Phenomenology." Amerasia Journal 23, no. 1: 130–148.
Young America Admires the Ancients. 1783–1840. Library of Congress Prints and Photographs Division.
Young America!: The Organ of the National Reform Association. Formerly the Workingman's Advocate. 1844–1845. Library of Congress Newspapers and Periodical Division.
Young, Brian M. 1990. Television Advertising and Children. Oxford: Oxford University Press.
Young, Jacob. 1857. Autobiography of a Pioneer. Cincinnati: Jennings and Pye; New York: Eaton and Mains.
Youth Theatre Journal. 1986– . American Association of Theatre for Youth.
Zainaldin, Jamil S. 1979. "The Emergence of a Modern American Family Law: Child Custody, Adoption and the Courts." Northwestern University Law Review 73 (February): 1038–1089.
Zald, Mayer N. 1970. Organizational Change: The Political Economy of the YMCA. Chicago: University of Chicago Press.
Zaslow, Martha J. 1988. "Sex Differences in Children's Response to Parental Divorce: 1. Research Methodology and Postdivorce Family Forms." American Journal of Orthopsychiatry 58, no. 3: 355–378.
Zelizer, Viviana A. 1985. Pricing the Priceless Child: The Changing Social Value of Children. New York: Basic Books.
Zhou, M. 1999. "Coming of Age: The Current Situation of Asian American Children." Amerasia Journal 25, no. 1: 1–27.
Zimring, Franklin E. 1982. The Changing Legal World of Adolescence. New York: Free Press.
Zipes, Jack. 1983. Fairy Tales and the Art of Subversion: The Classical Genre for Children and the Process of Civilization. New York: Wildman Press.
Zopf, Paul E., Jr. 1992. Mortality Patterns and Trends in the United States. Westport, CT: Greenwood.
Zoske, J. 1998. "Male Circumcision: A Gender Perspective." Journal of Men's Studies 6, no. 2: 189–208.
Zuboff, Shoshana. 1988. In the Age of the Smart Machine. New York: Basic Books.
Zucchi, John E. 1992. The Little Slaves of the Harp: Italian Child Street Musicians in Nineteenth-Century Paris, London, and New York. Montreal: McGill-Queen's University Press.
Index
A Minor Consideration, 498 Abbott, Jacob, 99 Abuse, 1–6, 725–726 boys as perpetrators, 1, 3 bullying, 130–136 juvenile delinquency and, 411 labor exploitation, 2, 3 legal issues, 4 mothers and, 725 prostitution and, 540 punishment at school, 727–728 societal tolerance of, 5 Society for the Prevention of Cruelty to Children (SPCC), 272, 474, 494–496, 717 See also Child labor; Discipline; Violence Accidents, 6–10 firearms and, 332 frontier boys and, 291 leading causes, 9 sports injuries, 9–10 Acker, Joan, 428 Action figures, 707–708 Actors and performers, 493–498. See also Theater Adams, Andy, 109 Adams, John, 10–14 Adams, William Taylor, 100, 232 Addams, Jane, 402–403, 495–496, 668 Adolescence, 14–19 advertiser targeting, 18 age segmentation, 15, 16 Hall on, 15–16 premarital sex and, 16, 18–19 scouting and, 119–120 teen pregnancy, 246–249 transitions through, 709–713 Adoption, 20–25, 366
adoption rights movement, 24–25 gender preferences, 22–23 reforms, 21–22 transracial, 23–24 Advertising, 18 African American boys, 25–36 adoption of, 23–24 allowances, 43 apprenticeships, 52 Asian Americans and, 61 basketball and, 80–81 Big Brothers, 86, 87 blackface minstrelsy, 461–462 Booker T. Washington, 749–752 cartoons and, 685 children’s literature and, 108, 109 civil rights era, 32–33 Civil War and, 29, 162–164, 166 contemporary society, 33–35 Depression era, 31–32 disease and mortality, 207 early republic and, 226 education and, 532 foster care, 272, 274 Frederick Douglass’s experiences, 213–217, 624, 628–629 gangs and, 309, 730 grandparents and, 323 Great Depression and, 327, 329 hip-hop and rap culture, 34–35 intelligence testing, 373–374 jobs, nineteenth century, 387–388, 390 jobs, seventeenth and eighteenth centuries, 384 orphanages and, 482, 532
play activities, 29, 303 play areas and, 643 post–Civil War era, 29–31 poverty and, 532 public schools and, 587, 589, 591 racial violence and, 728 reading, 1600s and 1700s, 97 scouting and, 119 sexuality, 598, 599 sharecropping, 30, 395 smoking prevalence, 632 suicide and, 648 teen fathers, 246 television and, 676, 678, 681–687 tennis and, 695–696 W. E. B. Du Bois, 749–752 World War II and, 757 YMCA and, 762 See also Slavery Aggression. See Abuse; Bullying; Competition; Violence Aid to Dependent Children (ADC), 326, 481, 484, 533–534 Aid to Families with Dependent Children (AFDC), 273, 326, 484–485, 533–534 AIDS/HIV, 24, 541–542, 601, 603, 605–606 Alcohol use, 360, 541, 630, 632–637 educational approaches, 635 sexual activity and, 601 suicide and, 651 temperance movement, 103, 175 Aldrich, Thomas Bailey, 100 Alexander, Lloyd, 109 Alger, Horatio, Jr., 36–39, 100, 103, 363, 474, 539, 564–565 Allowances, 40–44
Almshouses, 531–532 America Goals 2000, 592 American Camping Association (ACA), 147–148 American Junior Golf Association, 647 American Legion, 645 American Sunday School Union (ASSU), 101, 490, 589, 653, 656–657 American Youth for Democracy (AYD), 418 American Youth Soccer Organization, 645 Amusement and theme parks, 45–49 Anarchism, 417 Andy Hardy, 253 Anger, 232–233, 234 Apprenticeship, 49–53, 173, 282–283, 384, 389 Aretino, Pietro, 521 Arnold, Benedict, 564 Art and artists, 53–57 graffiti, 315–319 portraiture, 524–529 Ashe, Arthur, 696 Asian American boys, 58–63, 153–158 homosexuality, 62 intergenerational conflicts, 59, 157–158 interracial relations, 61–62 model minority stereotype, 60–61 television portrayals, 676 World War II and, 757 Association of Junior Leagues of America, 698 Athens, Lonnie, 735–741 Athletic activities. See Sports and athletic activities; specific sports Atlas, Charles, 89 Attention deficit disorder (ADD), 235, 413, 415 Attention deficit hyperactivity disorder (ADHD), 413, 415 Baden-Powell, Robert, 116, 117, 119, 147 Bar mitzvah, 65–69 Barbour, Ralph Henry, 106 Barnum, P. T., 716 Barrie, J. M., 699 Bartholomew, Freddie, 251–252, 496
Baseball, 69–73, 305, 639–640, 642, 645, 665 clubs, 178–179 Latino boys and, 442 Baseball cards, 71, 73–77, 298–299 Basketball, 17, 77–82, 175, 642, 645–646, 667, 668, 760 Bat mitzvah, 68–69 Batman, 183, 658, 660, 661 Baum, L. Frank, 105 Beard, Daniel Carter, 144, 301 Beecher, Henry Ward, 586 Behavioral standards, manners and gentility, 421–424 Bestiality, 597 Bicycles, 82–85, 149, 646 Big Brothers, 86–88, 177 Big Sisters, 86, 87 Binet scales, 372 Birth control, 600 Birth order, 607–608 Black Entertainment Television (BET), 685 Black Power consciousness, 33 Blackface minstrelsy, 461–462 Board games, 305 Boarding schools, 579–582 Bodies, 88–94 puberty changes, 710 Books and reading boys’ schools in, 580–581 comics and superheroes, 107–108, 180–185, 658–662 cultural diversity in literature, 108, 109–110 early U.S. boys’ experiences, 225 1800s, 98–103 hunting, 350 juvenile periodicals, 101–102, 105 1900–1960, 103–108 pulp fiction, 658–659 racial/ethnic stereotypes, 103 schoolbooks, 100, 573–578 since 1960, 108–113 1600s and 1700s, 94–98 See also specific authors Boone, Daniel, 349 Bowling, 644 Boxing, 114–116, 639, 642, 644 Boy Scouts, 4, 92, 116–122, 177, 178, 301, 350 camping, 120, 144, 147 coeducation, 179 discipline and, 203–204
firearms and, 332 gays and, 122 local racism, 119 older adolescents and, 17 religion and, 491 sexuality and, 119–120 sports and, 120 urban outreach, 121 World War II and, 754 YMCA sponsorship, 762 The Boy with Green Hair, 253 Boyce, William D., 118 Boys and Girls Clubs of America, 177 Boys’ choruses, 122–126 Boys Clubs of America, 188 Boys’ fiction. See Books and reading Boys’ Life, 105 Boys Town, 127–130, 491, 550, 565 Boys Town Choir, 124 Brace, Charles Loring, 484, 508, 509, 512, 564. See also Children’s Aid Society Braces and orthodontics, 485–488 Brothers and sisters, 606–612. See also Siblings Brown, John George, 474 Brown v. Board of Education of Topeka, Kansas, 591 Bullying, 130–136, 334, 728 gender differences, 134–135 sexual aggression and teasing, 134–135 Bunyan, John (Pilgrim’s Progress), 225, 659 Burnett, Frances Hodgson, 102, 169–170 California missions, 139–143 Campbell, Bruce, 109 Camping, 120, 144–148 Campus Crusade for Christ, 492 Captain America, 183, 658 Captain Marvel, 183, 478 Captains Courageous, 252 Card games, 299 Career education, 396 Carl D. Perkins Acts, 746 Carnegie, Andrew, 363 Cars, 149–153 drag racing, 217–220 juvenile delinquency and, 408 Cartoons, 670–674 gender considerations, 672–673
Index minorities and, 685 sex and violence messages, 673 Catholic Youth Organization (CYO), 114, 644–645 Chaplin, Charlie, 250–251, 496 Child, Lydia Maria, 103, 174 Child abuse. See Abuse Child advocacy, 3–4 Child labor as abuse, 2, 3 coal mining, 390–391, 392 Great depression and, 394–395 Lewis Hine’s photography and, 504–507 regulation, 391, 392, 393, 476–477 textile industry, 384–385, 390, 393 theatrical performers and, 494–497 See also Apprenticeship; Indentured Servants; Work Child neglect, 2, 272 Child pornography, 523 Child protective services, 4 Child Welfare League of America, 20, 22 Children’s Aid Society, 3, 271, 472, 484, 564 placing out, 508–512 Children’s Bureau, 20 Children’s Defense Fund, 4, 248 Children’s Educational Theatre (CET), 698 Children’s theater, 697–700. See also Theater Chinese American boys, 153–158. See also Asian American boys Chodorow, Nancy, 427 Chorpening, Charlotte, 699 Christian Endeavor Movement, 490–491 Christmas, 339–341 Cigarette smoking, 360, 630 Circumcision, 158–162, 604 Citizen Kane, 252 Civil rights movement, 32–33 Civil War, 162–166 African American boys and, 162–163, 166 black boys and, 29 newsboys and, 472 religion and, 490 Civilian Conservation Corps (CCC), 325, 330
Cleland, John, 521 Clemens, Samuel (Mark Twain), 100, 311, 662, 698 Close Encounters of the Third Kind, 255 Clothing, 166–172, 526 in portraiture, 526–528 Clubs, 173–179 4-H, 177, 178, 275–280 athletic, 643–644 fraternities, 284–288 gangs, 306–311 medieval, 454, 457 Coal mining, 390–391, 392 Cocaine, 357, 359–360, 541, 730 Coeducation, 17, 582 Coleman, Gary, 497, 498 Colonial-era education, 12–13 Comic books, 107–108, 180–185, 478, 658–661, 756 Common school movement, 575, 579–580, 588–589, 727 Communist Children’s Movement, 417 Communist Party, 416, 417, 477 Competition, 185–190. See also Sports Compulsory attendance laws, 590 Computers, 190–195 career choices, 195 education, 191–192 games, 193–194 Internet, 194–195 role models, 192–193 Comstock, Anthony, 522, 659 Condom use, 600 Coney Island, 45–46 Coogan, Jackie, 250, 496 Cooper, Jackie, 252 Cooper, James Fenimore, 99, 349 Cooperative Extension Service, 275, 277, 279 Cormier, Robert, 110 Corporal punishment, 201–204, 548, 551, 727–728. See also Abuse; Discipline Cosby, Bill, 685 Covenant House, 567 Cowboys, 196–199 television Westerns, 687–691 toys, 706–707 Cox, Palmer, 102 Crack, 359, 541, 730 Cricket, 639–641
Criminal activity. See Juvenile delinquency Crockett, Davy, 658 Crusader Rabbit, 672 Crying, 231 Cub Scouting, 116, 120, 121. See also Boy Scouts of America Cubans, 364–365 Culkin, Macaulay, 497–498 Dancing, 459, 460 Darwin, Charles, 372, 501 Dating, 17, 598, 711, 712 Davy Crockett, 688 Dead End Kids, 253 Dewey, John, 745 Diff’rent Strokes, 498 Discipline, 201–204 parental differences, 610 poliomyelitis, 517–520 punishment at school, 727–728 reformatories and, 548, 551 violence and, 2, 733 Disease and death, 205–208 frontier boyhood and, 291 orphanages and, 483 plantations and, 516 prostitution and, 541–542, 603 substance use and, 630–631, 633 sexually transmitted diseases, 541–542, 602–606 Disney, Walt, 255 Disney theme parks, 45, 47, 48 Divorce, 209–212 child adjustment problems, 210–211 fathers and, 245 grandparents and, 324 Dixon, Franklin W., 109 Dr. Seuss, 105, 107 Dogs, 499–504 Domestic violence, 725–726. See also Abuse; Violence Doubleday, Abner, 70 Douglass, Frederick, 213–217, 624, 628–629 Drag racing, 217–220 Drake, Daniel, 585–586 Drawing and painting, 53–57 Drug abuse, 357–361. See also Substance use; specific substances Du Bois, W. E. B., 108, 749, 751–752
Du Bois Clubs, 418 Dyslexia, 413 Early republic, 223–226 preachers in, 535–538 Easter, 337–338 Edison, Thomas, 473 Education African Americans and, 30, 32–33, 532, 587, 589, 591 Brown v. Board of Education of Topeka, Kansas, 591 Chinese American boys, 156 coeducation and, 17, 582–583 colonial era, 12–13 common school movement, 575, 579–580, 588–589, 727 compulsory attendance laws, 476, 590 computer training, 191–192 emotions and, 235 English influence, 580–581 evangelical methods, 655 farm boys, 238–239 frontier children and, 291–292 Gender Equity in Education Act, 592 gender roles, 428 Hall’s adolescence theory and, 16 juvenile delinquency and, 408 learning disabilities, 413–415 left-wing, 416–419 Mexican Americans and, 440–442 military schools, 444–449 Native Americans and, 468, 469, 533 orphanages and, 484 public schools, 586 reformatories and, 547 rural schools, 586 school transitions, 711 schoolbooks, 573–578 schools for boys, 579–583 sex education, 604–605 tracking system, 413 universal school movement, 17–18 vocational education, 742–747 Einstein, Albert, 593 Emerson, Ralph Waldo, 227–230, 465 Emotions, 231–236 controlling, 235
homosexuality and, 234 Empire of the Sun, 256 Enright, Elizabeth, 107 Entertainment Software Ratings Board (ESRB), 721 E.T., the Extra-Terrestrial, 255–256 Eugenics movement, 22 Evolution, 576 Explorer Scouting, 116, 121 Fairy tales, 697, 699–700 Fanny Hill, 521 Farm boys, 237–241 cowboys, 196–199 education of, 238–239, 586 play, 239–240 poverty, 530–531 work, 386–387 World War II and, 754–755 Fathers, 241–245 adolescents as, 246–249 divorce and, 210, 211–212, 245 domestic violence and, 725 siblings’ interactions and, 610 television portrayals, 677 Federal Communications Commission (FCC), 672 Fellowship of Christian Athletes (FCA), 492 Ferrer y Guardina, Francisco, 417 Films, 250–257, 327–328 cars in, 151 child performers, 497–498 drag racing in, 219 horror, 342–347 Mexicans boys and, 441–442 World War II and, 757 Fire companies, 257–261 Firearms. See Guns First Day Society, 654 Fishing, 261–266 Flanagan, Edward, 124, 127–129, 491, 550, 565 Football, 266–270, 305, 641, 645, 668–669 Foster care, 271–275 placing out, 508–512 See also Orphanages 4-H clubs, 177, 178, 275–280 Franklin, Benjamin, 50–51, 97, 280–284, 321, 471, 521, 564, 580 Fraternities, 284–288 Freud, Sigmund, 604
Frontier boyhood, 225, 289–292. See also Cowboys; Farm boys Future Farmers of America, 754 Gambling, 295–300 Games, 301–306 colonial era, 638–639 computers and video games, 193–194, 299, 305–306, 708, 719–723 Depression era, 327 gambling, 295–300 playground movement, 303–305 pre–Civil War black children, 29 See also Sports; Toys Gangs, 306–311, 408 criminal activity, 308–309 gender identity and, 310–311 graffiti, 318 illegal substances and, 730 poverty and, 310 violence and, 729–731 Gay or bisexual boys, 569–573 Asian Americans, 62 Boy Scouts and, 122 empirical studies, 571 oppression of, 572 prostitution and, 541–542 same-sex relationships, 598 suicide and, 571, 649 television cartoons and, 674 Geisel, Theodor S., 105, 107 Gender Equity in Education Act, 592 Gender identity. See Masculinity George Junior Republic, 550 Gerry, Elbridge T., 494–495, 717 GI Joe, 673–674, 707 Gilkey, Dave, 661 Gloria, 255 Glue sniffing, 360 Goffman, Erving, 426, 660 Gold Rush, 311–315 Golf, 647 Gonzales, Elián, 365–366 Goodrich, Samuel, 99 Graffiti, 315–319 Graham, Billy, 492 Grandparents, 320–324 Great Depression, 325–330, 534, 704–705 African Americans and, 31–32 child labor, 394–395
Index Greeley, Horace, 52, 585 Grimes, Nikki, 111 Griswold, M. M., 466 Growth spurt, 710 Gulick, Luther Halsey, 667–668, 692, 760, 761 Guns, 330–334, 350 schools and, 728–729 suicide and, 650–651 toys, 704 Gymnastics, 647 Hairstyles, 172 Hall, G. Stanley, 15–16, 203, 232, 350, 457, 501, 604, 761 Hallam, Lewis, Jr., 493–494 Halloween, 338–339 Hardy Boys, 106–107, 109 Harris, Aurand, 699 Harris, Joel Chandler, 102–103 Heinlein, Robert, 108 High school, 17–18 Highlights for Children, 105 Hine, Lewis, 504–507 Hinton, S. E., 110 Hip-hop and rap music, 34–35, 60, 316, 685 Hispanic boys gangs, 730 Mexican Americans, 439–443 sexuality and, 599 smoking prevalence, 632 teen fathers, 246 television portrayals, 676 Hitler’s Children, 253 HIV/AIDS, 24, 541–542, 601, 603, 605–606 Hockey, 353–357, 646 Holidays, 337–341 Home Alone, 256, 498 Homeless children, 328–329, 564–567. See also Runaway boys Homicide, 332, 410 family member as perpetrator, 725 parricide, 726 Homosexuality. See Gay or bisexual boys; Same-sex relationships Horror films, 342–347 Houdini, Harry, 564 How Green Was My Valley, 253 Huckleberry House, 567 Huffing, 360 Hughes, Thomas, 580, 666, 667 Hull House, 668
Hunting, 347–351 Ice hockey, 353–357, 646 Ice skating, 644 Illegal substances, 357–361 gangs and, 730 juvenile delinquency and, 408 See also Substance use Immigrants, 361–366 adoption of, 366 refugees, 364–365 work, 363–364, 389–390 See also Asian American boys Immunization, 517, 518, 519, 520 Indentured servants, 52, 363, 367–371, 383–384, 508, 509, 514, 531–532, 548 Inhalants, 360 Intelligence testing, 371–375, 441 Inter-Varsity Christian Fellowship, 492 International Workers Order (IWO), 417, 418 Irving, Washington, 99 Jacques, Brian, 111 Jefferson, Thomas, 348, 377–381, 515 Jewish boys bar mitzvah, 65–69 circumcision, 159 Young Men’s Hebrew Association (YMHA), 645, 764–767 Jewish Community Centers, 767 Job corps, 534 Jokes, 397–401 The Jungle Book, 253 Juvenile delinquency, 18, 406–411 boys as criminals, 5 gangs and, 308–309, 408 home violence and, 411 IQ and, 374 legal system and, 401–406 Mexican American boys and, 442 sexual delinquency, 410 status offenses, 408–409, 567 World War II and, 757–758 youth crime rate trends, 405–406 See also Juvenile justice
841
Juvenile justice due process protections, 405, 409 gender discrimination, 410 incarceration, 404 juvenile courts, 401–406, 550 nineteenth-century reformatories, 545–549 parens patriae, 402, 406, 547 probation, 404, 409 serious and violent offenders, 405, 409, 410 social class and, 409 twentieth-century reformatories, 549–553 Juvenile Justice and Delinquency Prevention Act, 552–553, 567 Kanter, Rosabeth Moss, 427–428 Kelley, Florence, 476 Kennedy, John F., 273 Kessler, Suzanne, 429 The Kid, 251, 496 Kinkel, Kip, 726 Knox, Thomas W., 350 Konigsburg, E. L., 110 Kramer vs. Kramer, 255 Labor Youth League, 418 Lacrosse, 665 Lancaster, Joseph, 587 Learning disabilities, 413–415 Lee, Stan, 661 Left-wing education, 416–419 Legal system, 401–406 L’Engle, Madeleine, 109 Lewis, Dorothy, 734 Libertarianism, 417 Literature. See Books and reading Little House books, 107 Little League Baseball, 72, 178, 305, 645, 668–669 Little Lord Fauntleroy, 102, 169–170, 251, 495 London, Jack, 564 The Lone Ranger, 688 Love, 233 Mad, 184, 661 Maltreatment. See Abuse Mann, Horace, 587 Manners and gentility, 421–424 Marble games, 298, 303 Marijuana, 357–359 Masculinity competitiveness, 188
842
Index
Masculinity (continued) emotions and, 231 gangs and, 310–311 jokes and, 400 masculinities, 425–429 physique and, 89–94 portraiture, 526 suicidal behavior and, 650 video games and, 720 violence and, 2 See also Muscular Christianity Masturbation, 16, 430–434, 597, 763 Mather, Cotton, 95 McDowall, Roddy, 253 McFarlane, Leslie, 106 McGuffey’s readers, 98, 100, 575 McMahon, Vincent, 661–662 Means, Florence Crannell, 108 Measles vaccine, 519 Medicaid, 534 Medieval boys’ clubs, 454, 457 Mednick, Sarnoff, 733 Melodrama, 434–439 Men and Religion Forward, 454, 457 Menendez, Eric and Lyle, 726 Mexican American boys, 439–443 Mickey Mouse Club, 707 Migrant families, 329 Military schools, 444–449 Minow, Newton, 672 Moody, D. L., 490 Mortality and illness, 205–208 Mothers, 450–453 abuse of children, 725 divorce and, 210 siblings’ interactions and, 610 television portrayals, 677 Mothers against Drunk Driving (MADD), 633 MTV, 685 Multiracial boys, 62 Muscular Christianity, 72, 92, 145, 454–458, 492, 580, 598, 639, 666–668, 760. See also Sports; Young Men’s Christian Association Muscular physique, 88–94 Music, 459–462 boys’ choruses, 122–126 hip-hop and rap, 34–35, 60, 316, 685 rock bands, 557–559 Myers, Walter Dean, 111
Naismith, James, 77, 175, 667 National Alliance for Youth Sports, 669 National Association for the Advancement of Colored People (NAACP), 682 National Association of Black Social Workers, 23 National Association of Broadcasters (NAB), 672 National Child Labor Committee (NCLC), 476, 504, 505 National Industrial Recovery Act, 478 National Newsboys’ Association (NNA), 178 National Survey of Adolescent Males (NSAM), 599 National Youth Administration (NYA), 325, 330, 443 Nationalism, 465–466 schoolbooks and, 575, 578 Native American boys, 467–470 athletic activities, 80 boarding schools, 469, 533 California Gold Rush and, 313 California missions, 139–143 fathers and, 242 foster care, 274 games, 665 poverty and, 533 reading material, 96–97 suicide rates, 648, 649 television Westerns and, 687, 690 Neglect, 2, 272 New York Children’s Aid Society. See Children’s Aid Society Newbery, John, 94, 96 Newsboy Welfare Committee, 476–477 Newsboys, 178, 391, 471–479 theaters and, 437–438 North American Man-Boy Love Association (NAMBLA), 523 Nurturing behavior, 503 Occupations, vocational education, 742–747 O’Dell, Scott, 109 Orphanages, 4, 481–485, 509 African Americans and, 482, 532 Orthodontics, 485–488
O’Sullivan, John, 465, 466 Our Gang, 251, 496 Packer, Edward, 110 Page, Thomas, 102 Parachurch ministries, 489–493 Parens patriae, 402, 406, 547 Parental relations, divorce, 209–212 Parricide, 726 Pastor, Tony, 715, 716 Patten, Gilbert, 105 Paulsen, Gary, 111 Payne, John Howard, 494 Peck, George, W., 100 People for the American Way, 578 Performers and actors, 493–498 Pestalozzi, Johann Heinrich, 588 Peter Pan, 699 Pets, 239, 499–504 Petting, 598 Photographs, 504–507 Physical appearance, 88–94, 710 Piaget, Jean, 593 Pilgrim’s Progress, 225, 659 Placing out, 508–512 Plantations, 512–517 Plato, 662 Play. See Games; Sports; Toys Playground Association of America, 642, 643 Playground movement, 303–305, 643, 692 Poliomyelitis, 517–520 Polovchak, Walter, 365 Pony League, 645 Pop Warner football, 270, 305, 645, 668–669 Popular culture. See Art and artists; Films; Television Pornography, 520–524 Portraiture, 524–529 Poverty, 529–535 foster care and, 273 gangs and, 310 runaways and, 566–567 See also Great Depression Pregnancy, teenage, 19, 247 Probation, 404, 409 Professional wrestling, 661–662 Prostitution, 538–543, 566 health and, 603 World War II and, 758 Puberty transitions, 15, 710–711. See also Adolescence Public schools, 584–592, 657
Index African Americans and, 587, 589, 591 common school movement, 579–580, 588–589 compulsory attendance laws, 590 sports and, 590 Public Schools Athletic League (PSAL), 188–189, 667, 692–693 Purity leagues, 603 Racism scouting and, 119 textbooks and, 577–578 See also African American boys; Slavery Radio programs, 327, 688, 705, 756 Rap and hip-hop music, 34–35, 60, 316, 685 Rayburn, Jim, 491 Reading. See Books and reading Reasoning, 593–596 Reformatories, 404, 509 discipline in, 548, 551 nineteenth-century, 545–549 twentieth-century, 549–553 Refugees, 364–365 Religion bar mitzvah, 65–69 Boy Scouts and, 491 California missions, 139–143 Civil War and, 490 parachurch ministries, 489–493 preachers in the early republic, 535–538 reformatories and, 546–547 schoolbooks and, 574 YMCA and, 761–763 See also Sunday schools; Young Men’s Christian Association Revolutionary War, 553–556 Riis, Jacob, 476 Rites of passage bar mitzvah, 65–69 hunting, 348, 349 Native American ceremonies, 468 Robinson, Edgar M., 761–762 Rock bands, 557–559 Rock climbing, 647 Roller skating, 646, 703 Rooney, Mickey, 253
Roosevelt, Theodore, 178, 188, 189, 276, 484, 559–563, 692 Rowling, J. K., 112 Runaway boys, 328, 564–567 poverty and, 566–567 prostitution, 566 shelters for, 567 Sachar, Louis, 112 Safe-sex practices, 600 Same-sex relationships, 569–573, 600. See also Gay or bisexual boys School transitions, 711 Schoolbooks, 100, 573–578 bilingual textbooks, 577 nationalism in, 575, 578 racism in, 577–578 religious influences, 574 Schools, boys’, 579–583 Schools, military, 444–449 Schools, public, 584–592, 657. See also Common school movement; Education Schools, Sunday. See Sunday schools Schools, violence at, 727–729 Scientific reasoning, 593–596 Scieszka, Jon, 112 Scooters, 614 Screen Actors Guild (SAG), 498 Seeger, Pete, 419 Sendak, Maurice, 109 Settlement houses, 72, 78, 79, 667–668 Sex education programs, 604–605 Sex roles. See Masculinity Sex therapy, 433 Sex work, 538–543 Sexual aggression, 134 Sexual delinquency, 410 Sexual orientation, 569. See also Gay or bisexual boys; Same-sex relationships Sexuality, 597–602 adolescence theories and, 16, 18–19 bestiality, 597 cartoons and, 673 Hall’s adolescence theory and, 16 masturbation, 430–434, 597, 763 pornography, 520–524 prostitution, 538–543, 598 Puritans and, 597
purity leagues, 603 race/ethnicity and, 598, 599 same-sex relationships, 569–573 scouting and, 119–120 sexual health, 601–602 social purity movement, 433 substance use and, 601 teen fathers, 246 transitions of adolescence, 711–712 unprotected sex, 542 YMCA and, 763 See also Gay or bisexual boys Sexually transmitted diseases, 602–606 prostitution and, 541–542, 603 Shane, 254 Sharecroppers, 30, 387, 395, 532 Shelters, for runaways, 567 Sheppard-Towner Maternity and Infancy Protection Act, 533 The Shining, 255 Shyer, Marlene Fanta, 110 Siblings, 606–612 birth order, 607–608 differential treatment of, 610–611 disabilities and, 607 gender and, 608–609 jealousy of, 234 parents and, 610 Single mothers, television portrayals, 676–677 Sisters and brothers, 606–612 Skateboarding, 614–618, 646 Skiing, 618–622, 646 Slavery, 25–29, 626–630 Booker T. Washington’s experiences, 749–750 Civil War and, 162–163 corporal punishment, 202 early republic and, 226 emotional expression, 231 fathers and, 242–243 Frederick Douglass’s experiences, 213–217, 624, 628–629 games and play activities, 303 indentured labor and, 52, 370–371 interactions with white children, 628 literature about, 102–103
Slavery (continued) plantations and, 512, 514, 515–516 poverty and, 532 slave trade, 622–626 work, 28–29, 384, 387 Smith, Lane, 112 Smith-Hughes Act, 590, 745 Smoking, 360, 630–637 Snow sports, 618–622, 646 Snowboarding, 621–622, 646 Soapbox derbies, 150 Soccer, 641, 645 Social purity movement, 433 Socialist Party of America, 416 Society for the Prevention of Cruelty to Children (SPCC), 272, 474, 494–496, 717 Soul Train, 683–684 Special education learning disabilities, 413–415 standardized testing, 373–374 Special needs children, adoption of, 23 Spiderman, 661 Spielberg, Steven, 255 Spinelli, Jerry, 111 Spock, Benjamin, 160 Sports, 189 adult misbehavior and youth programs, 669 athletic clubs, 643–644 Boy Scouts and, 120 clothing styles, 172 colonial era to 1920, 637–642 discipline, 203 emotions and, 233–234 Fellowship of Christian Athletes, 492 injuries, 9–10 masculine identity and, 92. See also Muscular Christianity 1921 to the present, 643–647 playground movement and, 305, 642, 643 public schools and, 590 teams, 665–670 YMCA and, 17, 72, 77, 78, 92, 175, 188, 203, 454, 457, 641, 645–646, 667, 692, 760 YMHA and, 766 See also specific sports Standardized testing, 371–374 Stanford-Binet Intelligence Scales, 372
Status offenses, 408–409, 567
Steig, William, 110
Stepfathers, 211–212
Steroid abuse, 88
Stine, R. L., 112
Stowe, Harriet Beecher, 699
Stratemeyer, Edward L., 102, 106, 108
Student Nonviolent Coordinating Committee (SNCC), 33
Students against Drunk Driving (SADD), 633
Substance use, 357–361
  educational approaches, 635–636
  gangs and, 730
  health consequences, 630–631, 633
  problem behaviors and, 408, 541, 601, 633–634
  smoking and drinking, 630–637. See also Alcohol use; Tobacco use
  steroids, 88
  suicide and, 651
  See also specific substances
The Sugarland Express, 255
Suicide, 332, 648–653
  causes of, 651–652
  ethnic and gender differences, 648–651
  gays and, 571
  prevention education, 652
Summer camps, left-wing, 418–419
Sunday schools, 175, 456, 490, 589, 653–657
Superheroes, 107–108, 182–183, 478, 658–662
Superman, 182–183
Surfing, 614, 646, 658, 660
Swimming, 647
Syphilis, 604, 605
Team sports, 665–670. See also Sports
Teddy bears, 703
Teenage parents, 246–249
Teenage pregnancy, 19, 247
Television
  cartoons, 670–674
  child performers, 497–498
  domestic comedy and family drama, 675–681
  racial/ethnic issues, 676, 678, 681–687
  toys and, 706–708
  vaudeville and, 718–719
  Westerns, 687–691, 706
Temperance movement, 103, 175
Temple, Shirley, 251
Tennis, 641–642, 647, 692–696
Textile mills, 384–385, 390, 393
Theater, 697–700
  actors and performers, 493–498
  blackface minstrelsy, 461–462
  melodrama, 434–439
  newsboys and, 437–438
  vaudeville, 715–719
Thoreau, Henry David, 348
To Kill a Mockingbird, 254
Tobacco use, 360, 630–637
  educational approaches, 635–636
Tom Swift Series, 106
Toys, 701–708
  action figures, 707–708
  bicycles, 82–85
  cowboy, 706–707
  Depression era, 704–705
  farm boys and, 239
  frontier boys and, 291
  in portraiture, 528–529
  sex stereotyping, 703–704
  television and, 706–708
  weapons, 704, 705–706
  See also Games
Track and field, 647
Trains, 704
Transgendered persons, 541
Transitions through adolescence, 709–713
Truancy, 407
Tunis, John, 107
Uncle Tom’s Cabin, 699
United Boys’ Brigades of America, 454, 457
United Nations Children’s Fund (UNICEF), 4
United Nations Declaration of Human Rights, 4
Upward Bound, 534
Vaccination, 7, 517, 518, 519, 520
Vaudeville, 715–719
Venereal diseases. See Sexually transmitted diseases
Video games, 193–194, 299, 305–306, 708, 719–723
  ratings system, 721
  violence and, 720
Violence, 374
  action figure toys and, 708
  bullying, 130–136, 728
  cartoons and, 673
  child-on-parent, 726
  comics and pulp fiction, 658, 660
  crime, 405, 409, 410
  gangs and, 729–731
  guns, 330–334
  history of, 724–731
  masculinity and, 2
  poverty and, 535
  prostitution and, 542
  racial, 728
  school settings, 727–729
  societal tolerance and celebration of, 5
  suicide, 648–653
  theories of, 733–741
  toy weapons, 704, 705–706
  video games and, 720
  See also Abuse; Discipline
Violentization theory, 735–741
Virginia Manual Labor School, 551
Vocational education, 742–747
Volleyball, 645, 667, 760
Voluntary societies, 490
War on Poverty, 534
WarGames, 722
Washington, Booker T., 749–752
Washington, George, 423–424, 656
Washington, MaliVai, 695
Webster, Noah, 94, 97
Wechsler Intelligence Scale for Children (WISC), 372
West, James E., 118
Westerns, 687–691
White, E. B., 107
Whitman, Walt, 539
Whittier State Reform School, 551
Wife beating, 725–726
Wilder, Laura Ingalls, 107
Williams, George, 175, 759
Winter sports, 618–622, 646
Wojciechowska, Maia, 109
Wonder Woman, 660
Woodham, Luke, 726
Woods, Tiger, 695
Woodson, Carter G., 108
Work
  accidents and mortality, 7
  at-home crafts, 386–387
  career education, 396
  computer-associated careers, 195
  cowboys, 196–199
  dangers of, 392–397
  Depression-era African Americans, 31–32
  enslaved boys, 28–29
  farm boys and, 237–241, 386–387
  frontier boys, 290, 291
  Great Depression and, 326–327, 394–395
  hiring out, 383
  illegal activities, 395–396
  immigrants and, 363–364, 389–390, 441
  jobs in the seventeenth and eighteenth centuries, 381–385
  jobs in the nineteenth century, 385–391
  jobs in the twentieth century, 392–397
  placing out and, 508–509
  poor boys, 531–532
  prostitution, 538–543
  slave children and, 384, 387
  transitions of adolescence, 712
  urban youth, 389
  vocational education, 742–747
  World War II and, 753
  See also Apprenticeship; Child labor; Indentured servants; Newsboys
Works Progress Administration (WPA), 443
World War II, 753–758
World Wrestling Federation, 661–662
Wrestling, 639, 661–662
XYY syndrome, 734–735
The Yearling, 253
Yelnats, Stanley, 112
Yep, Laurence, 110
Young America movement, 465–466
Young Communist League (YCL), 418
Young Life, 491
Young Men’s Christian Association (YMCA), 4, 175, 178, 759–764
  anti-obscenity campaign, 522
  athletic activities and, 17, 72, 77, 78, 92, 175, 188, 203, 454, 457, 641, 645–646, 667, 692, 760
  camping, 144, 145–146
  Christian Citizenship Training Program, 764
  racial issues, 762
  recruitment of prepubescent boys, 17
  religious orientation, 146, 761–763
  Scouting and, 118, 762
  sexuality and, 603, 763
  World War II and, 755
  See also Muscular Christianity
Young Men’s Hebrew Association (YMHA), 645, 764–767
Young People’s Socialist League (YPSL), 416
Young Pioneers of America, 417
Youth authority agencies, 552
Youth for Christ, 492
Youth Risk Behavior Surveys (YRBS), 599
About the Editors
Priscilla Ferguson Clement, professor of history at Pennsylvania State University, Delaware County, is the author of Growing Pains: Children in the Industrial Age, 1850–1890. Jacqueline S. Reinier, professor emerita at California State University at Sacramento, is the author of From Virtue to Character: American Childhood, 1775–1850. Both scholars have published widely in children’s history.