The Social Cost of Deindustrialization: Postwar Trenton, New Jersey

Patrick Luckie

Local history is often overlooked and underestimated in social studies classrooms around the country. Think about it—do you have any memory of learning about your own local community in a coordinated school or social studies effort? Big ideas like imperialism, global culture, and other themes of the past and present usually take precedence over learning about one’s own local history in high school. As part of my undergraduate senior research project at Rider University, I grappled with this fact and produced a short study of my own local history, which I used to inform my instruction in the classroom. This article presents that research and ends with a short analysis of how the project has affected my instruction at Ewing High School and how it can change the way we think about teaching local history in American high school social studies classrooms.

These powerful words were written by Dr. Jack Washington, a teacher of social studies in Trenton public schools for over 40 years and author of The Quest for Equality: Trenton’s Black Community 1890-1965, which traces racial struggle and movements for equality over the city’s history.[1] Trenton’s uniqueness, as Washington describes it, is a product of its deep history, rooted in the American Revolution, World War II, and the Civil Rights Movement of the 1960s. Trenton was once a manufacturing powerhouse, home to multiple industries which forged the urban landscape of the state’s capital and produced thousands of union jobs for its inhabitants. These included the mighty John A. Roebling’s Sons Company, which aided in the creation of the Brooklyn Bridge and whose factory in West Chambersburg served as a symbol of innovation and opportunity for decades. Trenton’s pottery industry was also one of the largest and most successful in the nation, alongside its iron, steel, rubber, and textile companies. Together, these industries provided enough stable employment and pay to support a rapidly growing population of mostly first- and second-generation European immigrants from Italy, Ireland, Germany, Poland, and Hungary, to name a few. Trenton’s manufacturing prowess was best showcased in 1917 with the first lighting of the famous “Trenton Makes, The World Takes” sign on the Lower Trenton Bridge, a symbol which still stands today in 2023.

The “golden age” of the city, as historian John T. Cumbler describes it, lasted from around 1850 to 1920, when Trenton established itself as one of the manufacturing capitals of the nation.[2] Almost perfectly situated between two of America’s largest cities, New York and Philadelphia, Trenton used its strategic geographic location along the Delaware River to tap into large markets and supply the massive manufacturing needs of the East Coast. Trenton at this time was truly a symbol of the American dream, and people flocked to the city in search of opportunities. By 1920, the population of the city surpassed 119,000 people, and it was amongst the most densely populated places in the state of New Jersey.[3]

The first signs of the city’s decline came with the weakening of its labor movement. By the 1920s, the age of mechanization had begun, and the economic shift from factory work to mechanized manufacturing began weakening labor unions over time. Workers’ unions and cooperation between owners and workers had been central to the functioning of the local economy and the glue that bound the city together. Over time, businesses could no longer maintain the standards of work they had previously upheld, and conditions within the city slowly deteriorated. From 1910 to 1920, Trenton underwent its largest leap in population within a decade, and shortly thereafter it began experiencing some of its greatest economic struggles. Plants began relocating outside of the city, and unionized jobs became more and more difficult to attain. Economic historians have grappled with this shift in the post-war era, claiming “US corporations aggressively sought to break free of expensive union contracts and to seek out ways to pay lower wages and allied social costs in order to increase profits.”[4] This is a persistent trend in this study. With great increases in population and the changing state of the local and national economy, Trenton suffered meaningful losses in employment and manufacturing output.

With the onset of the Great Depression in 1929 and the outbreak of the Second World War in 1939, Trenton returned to manufacturing and away from addressing the issues surrounding labor which had marked its initial decline. The war meant a massive nationwide mobilization of industry to fuel the war effort, and the wartime economy temporarily revitalized the city. Roebling’s Sons employed droves of new workers, opportunities for overtime became more available, unions strengthened, workers’ pay went up, and the largest wave of black migrants in the city’s history began making their way to Trenton in the 1940s.[5] These migrants came to Trenton and other cities as part of the Great Migration: the movement of millions of African Americans, predominantly from the rural southern states to the urban North and Midwest, between 1910 and 1970.

This temporary boom did not yield long-term progress for Trenton in the post-war period. During the 1950s, many of the city’s largest industries began relocating outside the city limits, and the economy did not adequately support its largest-ever population of over 129,000 people.[6] In 1952, Trenton’s most prominent employer, Roebling’s Sons, was sold to the Colorado Fuel and Iron Company, which over the next decade cut its employment numbers in Trenton and relocated its major manufacturing and business centers outside the city limits. This was the fate of many of the city’s most prominent industries, which sold their shares to larger corporations after WWII, leaving the fate of the city’s economy in the hands of interests that had little to no connection to it. The rubber, steel, iron, and pottery industries which had defined the city of Trenton and produced its “golden age” became shadows of their former selves, and the physical conditions of the city reflected this change. Over time, thousands of industrial jobs were lost; the population of Trenton dropped by 13,382 people from 1950 to 1960 and by an additional 9,381 the following decade.[7] Population decline continued to the year 2000 before stabilizing between 80,000 and 90,000 in the 21st century.

This study seeks to answer two fundamental questions: 1) What were the major effects of deindustrialization on Trenton, NJ in the decades immediately following WWII? 2) How were these effects felt by the people living within the city at this time? In answering these questions, this study will provide a lens through which race and class come to the forefront of the discussion. Trenton’s decline overlaps with the migration of thousands of African Americans to the city in search of economic opportunities. This demographic shift was the largest in the city’s history and was met not with opportunity but with inequality and increased racial tension. The major effects of deindustrialization on Trenton, NJ in the post-war period were economic destabilization, movement to the suburbs, and increased racial tension between white and black Trentonians. Each subsection of this work will examine these effects individually as well as their overall impact on life in Trenton. It is important to recognize that this movement away from manufacturing and its effects were not phenomena restricted to certain areas or regions. Rather, it was a national trend which all rust belt cities like Trenton grappled with in the second half of the 20th century. In addition to deindustrialization broadly, the age of mechanized labor, the shifting of the U.S. economy toward greater support for large corporations, and the social movements of the 1960s all played extremely important roles in shaping American cities in the post-war era.

Secondary source literature on the decline of U.S. cities in the post-WWII period falls into the fields of American urban, economic, and social history. One of the most popular works on these subjects is historian Thomas J. Sugrue’s The Origins of the Urban Crisis: Race and Inequality in Postwar Detroit, which examines the many ways in which American cities began to decline following WWII, with specific focus on racial inequality and division. Sugrue shows that cities like Trenton and Detroit were caught up in a national wave of hundreds of thousands of manufacturing layoffs, driven by the changing state of the U.S. economy and the lack of government spending allocated toward Northern cities.[8] These conditions radically transformed urban environments into almost unrecognizable versions of their industrial heights. Sugrue explores the connections between suburbanization, demographic change, and the racial attitudes of northern whites to produce an all-encompassing case study of the decline of Detroit. At the heart of his argument is the claim that racial segregation and inadequate political responses to signs of crisis determined the fate of the city. The importance of this historical research cannot be overstated. Before this book was originally published in 1996, the stories of Detroit and other American cities that suffered from the consequences of deindustrialization and racial division in the post-war period were largely untold. The Origins of the Urban Crisis continues to be one of the most influential modern studies of American urban history and is without doubt one of the most cited pieces of literature in the field.

Jefferson Cowie and Joseph Heathcott, who together edited Beyond the Ruins: The Meanings of Deindustrialization, built on the historical research of Sugrue by studying the impact of post-war deindustrialization across the nation. The book seeks to move the conversation from historic decline toward modern solutions for urban decay and economic instability. In doing so, it compiles a collection of essays from historians and other professionals to further explore deindustrialization and its impact on American cities.[9] From this perspective, the contributors identify a complex web of causes and effects of urban decline which vary from city to city but share many similarities nationally. The value of this work is in its wide scope. By compiling essays from multiple professionals in a variety of related disciplines, the image of declining cities in the U.S. following WWII becomes clearer than ever.

The most recognized work on post-war deindustrialization in Trenton, New Jersey specifically is historian John T. Cumbler’s A Social History of Economic Decline: Business, Politics, and Work in Trenton. This book outlines a long trajectory of economic conditions in Trenton beginning in the 1920s, with focus on the Great Depression, and traces the changing nature of the city up until the book’s publication in 1989. One of Cumbler’s main arguments is that America experienced a gradual economic shift from civic to national capitalism following the Great Depression, one which empowered large corporations while simultaneously destroying the small businesses that held many industrial cities together.[10] He also explores the rich history of the city’s most impactful industries, politicians, union leaders, and manufacturing workers to provide a comprehensive view of Trenton’s economic and social decline. This work provides the foundation of historical knowledge on Trenton required to produce further research on this topic. However, Cumbler’s history does not extend as far into the social consequences and effects of deindustrialization as one might expect. Nevertheless, virtually any modern historical literature on the city of Trenton cites this work, which points to Cumbler’s enduring credibility as a historian and shows the importance and relevance of his arguments to the continued study of the city’s history.

More recent historical literature on related topics has largely focused on national trends of suburbanization and racial conflict. One such journal article, “The Rural Past-in-Present and Postwar Sub/Urban Progress” by University of Waterloo professor Stacy Denton, studies the shift toward suburbanization following WWII. The author highlights the transformation of previously rural spaces into suburban landscapes and the implications of such transformations for national attitudes and beliefs about race, culture, and class.[11] In a similar vein, economic historian Leah Platt Boustan’s 2007 work “Black Migration, White Flight: The Effect of Black Migration on Northern Cities and Labor Markets” studies the effects of the Great Migration on northern cities and their economies. She also examines the racist attitudes of northern whites, which manifested themselves in movements out of increasingly diversifying cities and into the surrounding suburbs as part of a process termed “white flight.”[12] Both of these works are incredibly valuable to this study of post-war Trenton, for the topics and findings of their research bear directly on some of the greatest effects of deindustrialization on the city.

The research done in this paper will synthesize the secondary source material on the decline of U.S. cities and apply its findings to a specific case study of Trenton, New Jersey. In doing so, it will paint a clearer picture of the more immediate social and economic effects of deindustrialization on the city in the decades following WWII. This will add to the historiography of urban history and Trenton historical study by compiling primary and secondary source documents to more deeply understand the major effects of deindustrialization and economic transformation on the city. These major effects include economic destabilization, massive suburbanization, and increased racial tension. These symptoms of deindustrialization were felt most harshly by the city’s poor ethnic-white and growing black populations. More specifically, economic decline in Trenton coincided with the arrival of black migrants, which compounded racist attitudes and practices within the city. This is most clear in the workplace and housing segregation which new migrants had to face upon their arrival.

Industry leaving Trenton following WWII radically changed the city’s local economy. Unionized factory jobs became harder to attain, poor residents were left with fewer options, and Trenton’s growing black community was segregated in its employment. Longtime union workers, like those in the pottery and steel plants, found themselves in an unfamiliar situation. As Cumbler explained, “Those workers thrown out of work by plant closings had the hardest time finding work and represented the largest number of Trenton’s unemployed.”[13]

The selling of corporations like Roebling’s Sons produced a much weaker focus on the city’s manufacturing growth and output; instead, large corporations sought to relocate facilities and workers outside the city. This left the city’s existing workforce high and dry and decreased options for employment, especially among the lower-income white and minority black populations.

One action taken by the state and local government to fill the gap created by fleeing industry was growth in the employment of state workers and other public jobs. In the 1950s and ’60s, as today, New Jersey state workers were centralized in the capital city of Trenton. Cumbler described this shift from manufacturing to public work as one from “Blue Collar to White Collar and White Smock.”[14] This provided some relief to the city’s unemployment problem, which exceeded the national average through the 1950s and ’60s, but it did not come close to meeting the pay and benefit standards that manufacturing jobs had set just a decade prior. Additionally, the state workers employed at this time were disproportionately white men. Despite these changes, public and state employment was not enough to lift the city out of its economic slump nor to resolve its inherent issues with workplace discrimination.

A large part of the story of economic destabilization in Trenton as a product of deindustrialization was its negative consequences for the city’s black community. Former Trentonian and author Helen Jackson Lee published her autobiography in 1978, charting her experience with racial discrimination as a black woman seeking meaningful employment in the city. Her description of Trenton reads as follows:

In 1940, Trenton was an industrial city with many potteries, steel mills, factories, and a large auto plant, but the production lines were almost solidly white. Black men swept the floors, moved heavy equipment and shipping crates, and performed other burdensome tasks. In the business sections, they were almost invisible except as window cleaners, janitors, or elevator operators. There were no black salespeople in the stores, banks, or business offices. They were hired as maids, package wrappers, or seamstresses. Even the five-and-ten-cent stores refused to hire blacks, except to sweep, dust, or move stock.[15]

Lee’s firsthand experience with racial segregation and inequality in the 1940s reflects the racial attitudes and prejudices of Trenton and other northern cities earlier in the 20th century. Racist attitudes toward black migrants, who largely came from the South, were characteristic of many industrial cities in the U.S. at this time, as Sugrue’s work on Detroit and other rust belt cities highlights. As greater numbers of black migrants entered northern cities, the problem of racial discrimination and inequality intensified, and competition for jobs in short supply fueled racist attitudes. According to Sugrue, a combination of factors, including employer bias, the structure of the industrial workplace, and overarching ideologies of racism and black inferiority, contributed to this workplace segregation.[16] In Trenton, these differences in employment were visible to the observer and significantly impacted the lives of those seeking stable income. With the collapse of industry happening simultaneously with a dramatic increase in the city’s black population, this problem compounded. Black residents were not only excluded from whatever factory jobs were left on the basis of their race but were also labeled as the source of the city’s problems altogether.

In a 1953 study of community services in Trenton, researchers found that the average black resident experienced twice as much unemployment and earned on average 30% less total income than the average white resident, despite only a one-year difference in their average acquired education.[17] These statistics are evidence of income inequality and workplace discrimination and provide insight into the lived experiences of black people in Trenton at this time. Furthermore, research from The Journal of Economic History suggests “black workers were channeled into negro jobs and faced limited opportunities for promotion.”[18] Access to financial resources and meaningful employment were among the largest reasons for black migration to Trenton and other northern cities. Upon their arrival, however, migrants were met with egregious workplace discrimination and given very few opportunities to climb the economic ladder. Black women specifically made up “the least utilized pool of potential industrial labor power having much less than proportionate representation with her white counterpart,” according to a 1950s study titled The Negro in the Trenton Labor Market.[19] Many black women, including Helen Jackson Lee, struggled even more than black men to find employment within the city. These conditions forced economically disadvantaged men and women alike to scramble for jobs and income in order to support themselves and their families.

Changes to the manufacturing economy and workplace discrimination created great instability in Trenton during the 1950s and ’60s. Longtime union workers were suddenly left jobless, and the fruits of their loyal labor to the city’s largest industries were gone. Attempts to revitalize the economy largely failed, and economic decline hit the poor and the minority black population of the city more harshly than anyone else, in the form of unequal pay and limited job opportunities. With this knowledge, it becomes clear that deindustrialization and the exodus of industry destroyed an economy historically forged by large-scale manufacturing and robust labor unions, and that it disproportionately affected the new and growing black community.

Another major consequence of postwar deindustrialization on America’s rust belt cities was the creation of, and migration to, the suburbs. Suburbs are the areas where urban centers like Trenton extend into previously rural environments, where new housing developments, industries, and townships filled with growing numbers of former city dwellers. Historian Kenneth T. Jackson’s work on suburbanization, Crabgrass Frontier: The Suburbanization of the United States, provides the best historical analysis of this phenomenon, which swept the nation in the 20th century. Among many important factors, he claims that the roots of suburbanization can be traced to the boom of the automobile industry in the 1920s, which enabled those who could afford it to move further and further away from the cities in which they worked. Jackson states, “Indeed the automobile had a greater spatial and social impact on cities than any technological innovation since the development of the wheel.” He goes on to explain, “After 1920 suburbanization began to acquire a new character as residential developments multiplied, as cities expanded far beyond their old boundaries, and as the old distinctions between city and country began to erode.”[20]

For Trenton, this shift toward the suburbs was gradual, beginning in the 1920s and peaking during the 1950s, and it continued into the late 20th century in Trenton as in cities across the nation. It coincided with the decline of major industries and jobs. Historical research on suburbanization has also revealed that many white suburbanites moved to the suburbs to create a physical barrier between themselves and their racial counterparts.[21] As a result of these factors, thousands of residents with the financial freedom to do so began expanding into peripheral towns like Hamilton, Ewing, and Lawrence, many of them continuing to work as state workers or in other capacities inside Trenton while living outside the city. These towns saw unprecedented growth in housing developments in the post-WWII years thanks to FHA loans created under the New Deal and VA loans granted to veterans of the war under the 1944 GI Bill.[22] It is important to note that these programs were especially beneficial to white service members, and much historical literature has been written about the exclusionary practices associated with housing loans in relation to African Americans. This is relevant because during and shortly after WWII, the largest wave of black migrants traveled from predominantly southern states to Trenton and other northern cities in search of the employment opportunities associated with the mobilization of industry toward the war effort. This search for opportunity overlapped with the decay of Trenton’s largest industries, leaving many black migrants below the poverty line, working menial jobs as opposed to fruitful unionized jobs, and in some cases out of work completely. Compounding these issues was the inaccessibility of reasonable home loans for members of the black community.

The effects of suburbanization on the local economy of Trenton and its inhabitants can be seen through analysis of the popular media. Pride Magazine was a Trenton-based publication which centered its content on black businesses and black business owners. The magazine concerned itself with the failure of local politicians to enact positive change in the form of urban renewal plans targeted at improving the infrastructure, housing, and employment opportunities within the city. In March of 1972, Pride Magazine issued a publication titled “Black Businesses Need Your Help!” which featured a section written by the magazine’s publisher, Vance Phillips, who received his college education in Trenton. He wrote, “What are we doing to fill the vacuum of the cities which was created by relocation of the established business?” He then goes on to say, “After spending 5 years of planning and developing new programs for structural and economic changes, Trenton Model Cities program has failed to meet the potential growth of new and old businesses in our community.”[23] Phillips, like many black Americans living in Trenton during the 1970s, saw visible signs of the city’s decline in the failure of local businesses. He believed what was needed to fix this problem was a stronger government response along with increased civic action, specifically from the black community.[24]

In this same publication, Phillips expressed his belief that “a person who lives within the city should have preference over persons living outside of the cities in terms of employment.”[25] Here the author is addressing those who lived in the surrounding suburbs but continued to fill job positions within the city limits. This would have been a popular message among Trenton’s black business-owning population given the negative effects that rapid suburbanization had on small businesses within the city. In this article, Phillips touches on a number of topics which are extremely relevant to this study. For one, the instability of small businesses in the wake of mass suburbanization, which he observed, was largely due to the relocation of both industry and people outside the city. Mostly ethnically white Trentonians were leaving the city for the suburbs and taking their spending power with them. With population decline spearheaded by the movement to the suburbs, there simply was not enough money circulating throughout the city to adequately support the small businesses which propped up its local economy.

Another message within this passage is that as most of Trenton’s workforce shifted into the surrounding suburbs, so too did its voting power.[26] This left the black communities who resided within the urban center even more powerless as a minority to change their own political environment. Suburbanization brought with it a massive decrease in the city’s population and tax base. The city, whose population had exceeded 129,000 in 1950, had lost more than 22,000 residents by 1970.[27] This rapid population decrease meant that the tax revenue generated was not enough to effectively grapple with the issues facing the economy and the evolving workforce.

Furthermore, the local culture of the city, forged by America’s largest waves of European immigration in the 19th and early 20th centuries, suffered as a result of deindustrialization and suburbanization. Many of the small businesses and social institutions which had historically characterized Trenton were established by first- and second-generation Italian, Irish, Polish, and Hungarian immigrants, many of whom had traveled from the larger cities of New York and Philadelphia to find industrial jobs in Trenton. Dennis J. Starr’s book The Italians of New Jersey outlines the effects of suburbanization on the “old immigrants” of New Jersey, stating:

The movement to the suburbs and smaller urban places paralleled a major transformation of the state’s urban political economy. Following the war, the state’s largest cities did not participate in the postwar prosperity and economic development. Instead, their industrial bases eroded, their mercantile bases moved to suburban shopping malls and their overall, especially affluent white, populations shrank.[28]

The effect of suburbanization on the local culture of Trenton’s longest-serving residents is a source of some historical debate. Cumbler notes that, “Despite suburbanization of the more successful Italians and Slavs, many of Trenton’s ethnic neighborhoods seemed as entrenched as ever in the 1950s.”[29] However, the decades that followed would see even more of Trenton’s staple “old immigrant” communities relocating to the suburbs, taking their cultural values and traditions with them. That being said, the cultural diversity created by Trenton’s history as an ethnic melting pot can still be felt today in 2023. Walking the streets of some of its most popular neighborhoods, like Chambersburg, one can still see and feel the Italian influence in the area’s churches, social clubs, and bar-restaurants. The main point here is that local culture did suffer as a result of suburbanization and population decline, but it did not die; rather, it faded into a less visible version of its former self.

Looking at suburbanization as a major effect of postwar deindustrialization provides valuable insight into Trenton’s rise and decline as a manufacturing powerhouse. Like many other rust belt cities of this period, Trenton saw the trend of suburbanization cause unprecedented changes to its local economy and demographics. The loss of unionized industry jobs encouraged many Trentonians to relocate to the surrounding towns, which had recently seen great increases in housing development. In the process, those who left unintentionally left Trenton high and dry. Money from the pockets of those who moved to the suburbs was desperately needed to support small businesses in the city, and their tax dollars could have been used to make meaningful change to the city’s failing infrastructure. As previously discussed, the local culture of the city also suffered as a result of these consequences, which only compounded with each decade of further suburbanization and relocation away from the city. With a decreasing population, an aging workforce, and a new wave of migrants without sufficient employment opportunities, the city declined into a version of itself unrecognizable from its “golden age” of the 1920s.

Trenton’s deindustrialization and its history of racism and inequality are inextricably linked. In 1986, historian Dennis J. Starr published History of Ethnic and Racial Groups in Trenton, New Jersey, 1900-1960, which stands as one of the foremost pieces of historical literature on race relations in Trenton. This research clearly establishes a link between deindustrialization and increased racial tensions by claiming:

As industries closed down or reduced their work force it became harder for Afro-American migrants to get a toe hold on the traditional ladder of social mobility–a factory job. Meanwhile the city’s sizable Italian, Polish and Hungarian communities became fearful lest their jobs be eliminated, their neighborhoods integrated. A siege mentality developed in light of the population shifts and exodus of industries, commercial businesses, colleges and government offices.[30]

This “siege mentality” was amplified over time by the overcrowding of black communities in Trenton and the extension of black-owned or rented residences into shrinking ethnically white neighborhoods.

Between 1950 and 1960, Trenton’s black population rose to 22.8 percent of the total population. As discussed earlier, Trenton was a historically segregated city, but in the 1950s and ’60s this racial division took on a new intensity given the increases in population and the decreases in economic opportunity and industry.[31] Trenton historian Jack Washington described the city following WWII, stating, “That the 1950s was a period of benign neglect for the Black community is an understatement, for Black people were forgotten while their economic and political troubles continued to mount.”[32] These economic troubles can be seen most clearly through examination of housing segregation in the city and its continued influence on the lives of Trentonians. Along with housing and workplace discrimination, ethnically white residents used black migrants as scapegoats for their city’s economic misfortunes and decline.

Housing in Trenton in the postwar years can be characterized as both segregated and worse for wear. Following the largest influx of black migrants to the city in the late 1940s and early ’50s, this new population was largely forced to live in the Coalport and Five Points areas on the city’s interior.[33] Housing opportunities for black residents were few and far between and in most cases aged and deteriorated. Starr shed light on this inequality, revealing, “By 1957 over 80 per cent of the city’s housing was over 50 years old and 20 percent of all housing units were dilapidated or had deficient plumbing.”[34] This was a problem for all city dwellers and stood as a marker of the city’s decline following deindustrialization. For the black community, the problem was especially acute given that the neighborhoods with the worst physical damage and infrastructure were the areas in which they settled. A 1950s survey of the city titled Negro Housing in Trenton found that “the percentage of substandard housing among the Negro population is four times higher than that for the general population.”[35] Black Trentonians were limited not only in their occupations but also in the location and quality of their housing. The same study concluded that 1,200 new residential spaces would have to be erected in order to meet the needs and standards of the city. These spaces were not created, and public housing efforts did not meet the requirements of the new, growing population.[36]

With few options for housing, a lack of policy action to create new housing, and increases in population, black migrants had no choice but to expand into Trenton’s old ethnically white neighborhoods. In the eyes of many in the white majority, black migrants were a corrupting force acting to take down their beloved city. Declining social and economic conditions in the city paired with old racist tendencies to produce conflict between ethnic groups. Cumbler eloquently explains this clash, stating:

The decline of their industrial base narrowed the boundaries of choice for both white and black Trentonians, and in doing so it intensified conflict between them. Increasingly, Trenton’s problems became defined by the city’s white residents in terms of growth of its black population. Actually, its problems had other sources: the loss of its tax base with the closing down of factories, dilapidation of the existing housing stock, and the declining income of its citizens of whatever color.[37]

This excerpt captures the situation in Trenton during the 1950s and ’60s in terms of race relations and the overall decline of the city. Racist attitudes were not a new trend in Trenton, but they were compounded by the arrival of large populations of black migrants. From the white perspective, black migrants were aiding in the destruction of the city. From the black perspective, Trenton did not provide the resources they had traveled north in search of in the first place.

The 1960s and the Civil Rights era marked the historical boiling point for racial tensions and division in Trenton. The influence of the NAACP and other organizations for the advancement of racial equality, along with intense riots, brought race and class to the forefront of Trenton’s post-industrial issues. Most impactful were the race riots that exploded in early April of 1968 following the assassination of Dr. Martin Luther King Jr. These riots lasted for multiple days and resulted in fires erupting around the city as well as over $7 million in damage to more than 200 businesses in Trenton. During the chaos, around 300 mostly young black men were arrested by Trenton police. The devastating damage to the downtown section of the city caused many to flee and abandon it altogether in the years that followed.[38] It would be unfair to say that these riots were a direct result of deindustrialization in postwar Trenton. However, the city’s history of racial inequality and the compounding forces of racial tension that accompanied deindustrialization created fertile ground for public outrage. Of course, the assassination of Dr. Martin Luther King Jr. served as the catalyst for the riots, but the broader history of discrimination and inequality in Trenton suggests an intense decades-long buildup to the events that unfolded in April of 1968.

Trenton’s rise and fall as an American industrial city is truly a fascinating case study of the post-war era in U.S. history. What was once a manufacturing powerhouse along the Delaware River, strategically placed between the two large cities of New York and Philadelphia, was reduced to a shadow of its former glory by the 1950s and ’60s. The causes of this decline can be found in the removal of industry from the city after the war, and signs of economic decline can be traced as far back as the 1920s. The effects of this shift, however, remain the most significant in the broader history of the city. Rapid deindustrialization meant that wages and opportunities were significantly limited for all Trentonians, but especially for its segregated black community. Many of those who could afford to elected to move to the surrounding suburbs, bringing with them their tax dollars, their votes, and their culture. Lastly, deindustrialization and the consequences of a radically transformed Trenton increased racial tensions in the form of housing and workplace discrimination.

These effects offer new insights into the Trenton of today. Trenton now has a black majority, and interestingly, the same interior areas which housed black migrants in the 1950s remain, today in 2023, sites of high unemployment and limited opportunity. Walking the streets of Trenton, one is quickly reminded of its rich history, with many of its houses and abandoned factories still standing as reminders of the city’s complicated past. A hopeful message could be that a greater understanding of Trenton’s post-war history might provide the insight necessary to create better living conditions and opportunities for all its residents. For now, however, Trenton remains a city in an intense state of recovery from its industrial past. Historical research has shown that urban renewal plans have largely failed to revitalize the city’s economy in the 20th and 21st centuries, and issues such as crime, poverty, drug abuse, and poor infrastructure continue to loom over the once prosperous city.

Today, the “Trenton Makes, The World Takes” sign on the Lower Trenton Bridge still shines brightly, but its meaning has drastically changed since the last century. What was once a beacon of promise and stability is now a constant reminder of how far the city has fallen from its industrial and manufacturing heights.

Upon completing this research on Trenton, I gave a lesson to high school world history students at Ewing High School as part of my undergraduate co-teaching field work. Ewing is one of the border towns of the city of Trenton and was one of the most popular destinations for suburbanites who left the city in the 20th century, at least in part because of deindustrialization and the city’s overall decline. The proximity of the topic and the familiarity students had with popular street names, businesses, and buildings in the city created a feeling of relevance that sparked engagement. Students were surprised to be learning about a topic so close to home, and they responded with passionate discussion and meaningful connections, developed through a mix of group and whole-class discussions.

For social studies teachers, this successful shift from world history topics to a more grassroots approach to teaching local history can be used as a template for future lessons. Topics frequently come up during different units throughout the school year which deeply relate to the local history of wherever kids go to school. For Ewing students, Trenton’s decline as an industrial city directly related to their lived experiences; many of my students had lived in or around Trenton for most of their lives. This practice of teaching local history is neither overwhelming nor undoable. The same effort it takes to create a lesson in a world history or AP class can be channeled into research dealing with one’s own local environment and history.

This template for teaching local history can be used to generate a kind of engagement in the classroom unlike that of any other topic. Once students are given the opportunity to learn and ask questions about their own town, city, or home, they begin to view the world through a more historical lens, which is the goal of many, if not all, high school social studies teachers. Overall, my experience with this approach was overwhelmingly positive, and I encourage any and all educators to shift their focus for at least one day of the year toward exploring their own local history and connecting it to larger themes within our discipline.

“Black Businesses Need Your Help!” Pride Magazine. Trenton Public Library. March 1972. https://www.trentonlib.org/trentoniana/microfilm-newspapers/

Dwyer, William. This Is The Task. Findings of the Trenton, New Jersey Human Relations Self-Survey (Nashville: Fisk University, 1955).

Lee, Helen J. Nigger in the Window. Library of Congress, Internet Archive 1978.

Negro Housing in Trenton: The Housing Committee of the Self Survey. Trenton Public Library. Trentoniana Collection. Ca. 1950.

“Negro in the Trenton Labor Market,” Folder: Community Services in Trenton, Box: Trenton Council on Human Relations, Trentoniana Collection, Trenton Public Library.

“Study of Community Services in Trenton,” Folder: Community Services in Trenton, Box: Trenton Council on Human Relations, Trentoniana Collection, Trenton Public Library.

Trenton Council of Social Agencies, Study of Northeast Trenton: Population, Housing, Economic, Social and Physical Aspects of the Area. Folder: Study of Northeast Trenton. Box 1: African American Experience. Trentoniana Collection. Trenton Public Library. 1958.

Boustan, Leah Platt. “Black Migration, White Flight: The Effect of Black Migration on Northern Cities and Labor Markets.” The Journal of Economic History 67, no. 2 (2007): 484–88. http://www.jstor.org/stable/4501161.

Cowie, Jefferson, and Joseph Heathcott, eds. Beyond the Ruins: The Meanings of Deindustrialization. Cornell University Press, 2003.

Cumbler, John T. A Social History of Economic Decline: Business, Politics, and Work in Trenton (New Brunswick: Rutgers University Press, 1989).

Denton, Stacy. “The Rural Past-in-Present and Postwar Sub/Urban Progress.” American Studies 53, no. 2 (2014): 119–40. http://www.jstor.org/stable/24589591.

Division of Labor Market and Demographic Research. New Jersey Population Trends 1790 to 2000 (Trenton, NJ: New Jersey State Data Center, August 2001).

Gibson, Campbell. U.S. Bureau of the Census: Population of the 100 Largest Cities and Other Urban Places in the United States: 1790 – 1990, (Washington D.C.: U.S. Bureau of the Census, 1998).

Jackson, Kenneth T. Crabgrass Frontier: The Suburbanization of the United States. Oxford University Press, 1985.

Leynes, Jennifer B. “Three Centuries of African-American History in Trenton.” Trentoniana Collection. Trenton Historical Society. 2011.

Starr, Dennis J. “History of Ethnic and Racial Groups in Trenton, New Jersey, 1900-1960.” Trentoniana Collection, 1986.

Starr, Dennis J. The Italians of New Jersey: A Historical Introduction and Bibliography. Newark, NJ: New Jersey Historical Society, 1985.

Strangleman, Tim, James Rhodes, and Sherry Linkon. “Introduction to Crumbling Cultures: Deindustrialization, Class, and Memory.” International Labor and Working-Class History, no. 84 (2013): 7–22. http://www.jstor.org/stable/43302724.

Sugrue, Thomas J. The Origins of the Urban Crisis: Race and Inequality in Postwar Detroit. Rev. ed. Princeton University Press, 2005. Originally published 1996.

Washington, Jack. The Quest for Equality: Trenton’s Black Community 1890-1965. Africa World Press. 1993.


[1] Jack Washington, The Quest for Equality: Trenton’s Black Community 1890-1965, Africa World Press, 1993, 56.

[2] John T. Cumbler, A Social History of Economic Decline: Business, Politics, and Work in Trenton, (New Brunswick: Rutgers University Press, 1989), 9.

[3] Division of Labor Market and Demographic Research, New Jersey Population Trends 1790 to 2000 (Trenton, NJ: New Jersey State Data Center, August 2001), 23.

[4] Tim Strangleman, James Rhodes, and Sherry Linkon, “Introduction to Crumbling Cultures: Deindustrialization, Class, and Memory.” International Labor and Working-Class History, no. 84 (2013), 19.

[5] Cumbler, A Social History, 132-133.

[6] Campbell Gibson, U.S. Bureau of the Census: Population of the 100 Largest Cities and Other Urban Places in the United States: 1790 – 1990, (Washington D.C.: U.S. Bureau of the Census, 1998).

[7] Division of Labor, New Jersey Population Trends, 26.

[8] Thomas J. Sugrue, The Origins of the Urban Crisis: Race and Inequality in Postwar Detroit, (Revised Ed.), Princeton University Press, 2005, Originally published 1996, 128.

[9] Jefferson Cowie and Joseph Heathcott, eds., Beyond the Ruins: The Meanings of Deindustrialization (Cornell University Press, 2003), 1-3.

[10] Cumbler, A Social History, 93-95.

[11]Stacy Denton, “The Rural Past-in-Present and Postwar Sub/Urban Progress,” American Studies 53, no. 2 (2014): 119.

[12]Leah P. Boustan, “Black Migration, White Flight: The Effect of Black Migration on Northern Cities and Labor Markets.” The Journal of Economic History 67, no. 2 (2007): 484-485.

[13] Cumbler, A Social History, 147-148.

[14] Cumbler, A Social History, 145.

[15] Helen J. Lee, N—-r in the Window, Library of Congress, Internet Archive 1978, 131.

[16] Sugrue, Urban Crisis, 93-94.

[17] “Study of Community Services in Trenton,” Folder: Community Services in Trenton, Box: Trenton Council on Human Relations, Trentoniana Collection, Trenton Public Library, 8.

[18] Leah P. Boustan, “Black Migration, White Flight,” 485-486.

[19] “Negro in the Trenton Labor Market,” Folder: Community Services in Trenton, Box: Trenton Council on Human Relations, Trentoniana Collection, Trenton Public Library, 33-34.

[20] Kenneth T. Jackson, Crabgrass Frontier: The Suburbanization of the United States (Oxford University Press, 1985), 188.

[21] Stacy Denton, “The Rural Past-in-Present,” 119.

[22] Cumbler, A Social History, 139.

[23] “Black Businesses Need Your Help!,” Pride Magazine, Trenton Public Library, March 1972, 5.

[24] Black Businesses, Pride Magazine, 6.

[25] Black Businesses, Pride Magazine, 6-7.

[26] Black Businesses, Pride Magazine, 6-7.

[27] Gibson, U.S. Bureau of the Census, 43.

[28] Dennis J. Starr, The Italians of New Jersey: A Historical Introduction and Bibliography, New Jersey Historical Society, Newark, NJ 1985, 54.

[29] Cumbler, A Social History, 148-150.

[30] Dennis J. Starr, “History of Ethnic and Racial Groups in Trenton, New Jersey, 1900-1960,” Trentoniana Collection, 1986, 16-17.

[31] Cumbler, A Social History, 153.

[32] Washington, The Quest for Equality, 136.

[33] Trenton Council of Social Agencies, Study of Northeast Trenton: Population, Housing, Economic, Social and Physical Aspects of the Area, Folder: Study of Northeast Trenton, Box 1: African American Experience, Trentoniana Collection, Trenton Public Library, 1958, 53-54.

[34] Starr, Ethnic and Racial Groups in Trenton, 15.

[35] Negro Housing in Trenton: The Housing Committee of the Self Survey, Trenton Public Library, Trentoniana Collection, ca. 1950, 63.

[36] Negro Housing, Housing Committee, 67.

[37] Cumbler, A Social History, 156.

[38] Jennifer B. Leynes, “Three Centuries of African-American History in Trenton,” Trentoniana Collection, Trenton Historical Society. 2011, 3-4.


Forgotten Trails: Unmasking the Legacy of Native American Removal and its Contemporary Implications

Once, in the vast and untamed lands of what is now known as the United States, there thrived a multitude of Native American communities. These diverse and vibrant nations had cultivated rich cultures, deep-rooted traditions, and an intricate understanding of their surroundings. However, as the 19th century unfolded, a dark cloud loomed over these indigenous peoples. In the late 19th century, following a series of conflicts and broken treaties, Native American communities faced complete forced removal from their ancestral lands. Government policies aimed at assimilation and expansion systematically uprooted these communities, displacing them from their homes and severing their ties to their traditions, and 1890 marked a turning point in this history of removal. This displacement was not merely an isolated event but rather part of a broader pattern of marginalization that had persisted for centuries and continues to persist. Yet, despite its undeniable significance, this chapter of American history has largely been forgotten or intentionally overlooked.

The historical marginalization of and lack of mainstream attention to the forced removal of Native American communities in U.S. history after 1890 have had profound effects on their social, economic, and political development in contemporary society. This study aims to explore how the neglect and amnesia surrounding the forced removals have contributed to ongoing disparities, underrepresentation, and challenges faced by Native Americans. By relegating this significant chapter of American history to obscurity, society unintentionally perpetuates the cycle of neglect and underrepresentation experienced by Native Americans. The absence of acknowledgment and understanding of the removal policies and their consequences has hindered the recognition of indigenous rights, cultural contributions, and the unique challenges faced by these communities. This research seeks to shed light on this historical oversight and highlight its implications for present-day disparities within Native American communities. By recognizing the impact of historical marginalization, it becomes possible to address current challenges effectively and foster development within these marginalized communities. Through an exploration of relevant literature, primary sources, and historiography, this research will provide a comprehensive understanding of how historical amnesia has shaped the experiences of Native Americans today. By uncovering the underlying causes of ongoing disparities, underrepresentation, and challenges they face, this study aims to contribute to broader efforts toward achieving equity and justice for Native American populations.

The study of the removal of Native Americans after 1890 has long been approached from various perspectives, often reflecting prevailing societal attitudes and biases. Traditional approaches to this topic have tended to focus on a few main ideas, namely the notion that Native Americans desired urbanization and the belief that non-Native Americans were providing assistance in their transition. One common argument put forth by traditional studies is that Native Americans willingly sought relocation to urban areas. Proponents of this perspective suggest that indigenous communities recognized the benefits of modernity and sought opportunities for economic advancement through urbanization.
        Another idea frequently emphasized in traditional approaches is the assumption that Native Americans were uneducated or culturally deficient compared to non-Native Americans. This perspective suggests that native cultures were inherently inferior and needed intervention from more advanced societies to progress. Consequently, it portrays non-Native American efforts as benevolent attempts to elevate indigenous populations through education, religious conversion, and exposure to Western technologies. In these traditional interpretations, non-Native American involvement was often depicted as an act of assistance rather than forced displacement. Advocates argue that government policies such as the Dawes Act of 1887 aimed at breaking up tribal landholdings into individual allotments were well-intentioned steps toward promoting private property ownership among Native Americans. Similarly, boarding schools designed to eradicate indigenous languages and cultural practices were presented as educational endeavors meant to “civilize” Native American children.

The final common approach seen in the study of Native Americans on a broader scale is the idea that Native American history stopped after 1890. Traditional approaches have often treated Native American history as if it came to a standstill after the infamous Wounded Knee Massacre in 1890, perpetuating a skewed and incomplete narrative. This historical tunnel vision neglects the rich and complex tapestry of Native American experiences and contributions beyond that point. It wrongly reinforces the notion that Native Americans exist solely in a historical context, overlooking their vibrant and evolving cultures, traditions, and communities. This approach inadvertently marginalizes contemporary Native voices and their ongoing struggles, creating an inaccurate portrayal of their identity and relevance in modern America.

Overall, the approaches described above are problematic because they contribute to historical amnesia surrounding the removal of Native Americans by perpetuating a narrative that downplays the systemic injustices and challenges faced by Native communities during the process of urbanization and relocation. These traditional approaches tend to obscure the agency and resistance of Native Americans, portraying them as passive actors who willingly embraced modernity and external intervention. By emphasizing the supposed benefits of urbanization and the alleged cultural deficiencies of Native cultures, these narratives silence the historical reality of forced displacement, loss of land, and the violation of treaties. They fail to acknowledge the broader context of Native American history, including their resilience and efforts to preserve their cultures in the face of relocation and its effects.

With all of these ideas in mind, imagine having your land taken away, your culture suppressed, and your way of life disrupted. This is the harsh reality that Native Americans faced following the tumultuous period of removal and relocation, particularly after 1890. As the dust settled on a nation rapidly expanding westward, it became increasingly clear that indigenous communities were bearing the brunt of this progress. Following this period marked by forced removals and relocations, these indigenous peoples found themselves grappling with an array of disparities that persisted long after their displacement. To begin, this study will delve into the disparities experienced by Native Americans as a consequence of the forced removal and relocation policies implemented during the late 19th century.

The late 19th century marked a pivotal period in the history of Native Americans in the United States, a time when government policies and actions began to create enduring disparities within indigenous communities. At the forefront of these policies was the Dawes Act of 1887, legislation with far-reaching consequences. Aimed at assimilating Native Americans into American society, the Dawes Act symbolizes the significant inequities and unjust policies imposed on them. Under its allotment system, the legislation had devastating consequences for indigenous groups, removing essential tribal lands necessary for survival, cultural practices, and economic stability.[1] As a result of inadequate inheritance of land resources, many families suffered economic challenges that led to loss or dispossession of land over time. Traditional languages and customs were also disrupted through mandatory enrollment in boarding schools designed to wipe out native identities entirely. Furthermore, the act crippled governance structures within tribes, complicating Native communities’ efforts to advocate for their rights. These legacies manifest today in the disparities experienced daily by Native Americans, including persistently high poverty levels, lack of access to quality healthcare and education, and political underrepresentation, all core consequences of the Dawes Act.

Continuing, the Dawes Act’s repercussions would later shape the relocation policies of the 1950s. One of its main outcomes was the relinquishment of valuable tribal lands, frequently transferred to non-Native settlers, which limited Native Americans’ access to their traditional domains. This deprivation contributed significantly to the economic challenges numerous indigenous communities faced for years afterward. Decreased land ownership produced struggling tribal economies that left Native Americans vulnerable to hardship. The Dawes Act and the relocation policies of the 1950s shared assimilation as a central concept. The former pursued it through land ownership while the latter aimed for urbanization, but their underlying goal was the same: making Native Americans conform to the ideals of mainstream American society. This reflected how federal authorities sought to alter identities and lifestyles within Indigenous communities. In essence, the Dawes Act set the stage for the economic fragility and property deprivation that led some policymakers to find urban relocation policies more desirable in the 1950s. By taking land from Native Americans and interfering with their customary way of life, the act established a foundation for the inequalities and difficulties these communities encountered, ultimately making them easier targets for the urbanization and relocation initiatives of the 1950s.

Moving forward, in the 1950s, Native Americans were coerced into relocating to urban areas in the name of economic self-sufficiency and assimilation into mainstream society. Launched under Bureau of Indian Affairs commissioner Dillon S. Myer, the relocation program aimed to move reservation-based Native Americans to urban environments, promising educational and occupational opportunities, transportation services, housing, and everyday necessities. Although the program lured over thirty thousand participants, inadequate funding led to poor execution, which left many relocatees facing inferior living conditions and gender-segregated, low-level jobs that eventually forced them back home.[2] Despite these shortcomings, it cannot be ignored that some relocated Native Americans thrived in cities, securing upward socioeconomic mobility by proactively organizing and establishing themselves. These Native Americans were able to advocate for better livelihoods on reservations, but such success was not common. Ultimately, the 1950s relocation policies failed to fulfill their objectives, as many individuals lacked the necessary skills for city life due to an emphasis on quantity over quality during recruitment. Consequently, relocatees experienced racial discrimination and limited job opportunities while residing in low-income neighborhoods, and the meager benefits of relocation favored those who arrived with job expertise.[3] This historic instance highlights the disparities Native American communities encountered through government policies that lacked adequate support, which would eventually lead them to develop pan-Indian social institutions amidst harsh living situations. These occurrences are consistent with the historical experiences of Native Americans in urban environments, illustrating the overlooked complexities they faced across the development of these regions.

Furthermore, after the failure of the relocation program and the mounting problems caused by its attempt to force urbanization onto Native Americans, the disparities they faced only increased. After the relocation policies of the 1950s, the 1960s brought new hope to Native Americans with the emergence of the Civil Rights movement. Yet despite the promises of social and political change during the Civil Rights era, Native American communities continued to face significant challenges. The termination policy, which aimed to assimilate Native Americans into mainstream society, led to the loss of tribal sovereignty and the dispossession of lands. This policy resulted in economic instability and the erosion of traditional cultural practices. Additionally, the forced relocation of many Native American families from reservations to urban areas disrupted their social fabric and often led to poverty and social marginalization. These challenges and disparities persisted even as conditions improved for other minority groups, as the “Longest Walk” protest makes evident. In the 1970s, Native American activists staged a protest in Washington, D.C. called the “Longest Walk,” which brought to light the longstanding disparities faced by their communities. These inequalities were largely driven by governmental policies and legislation that threatened fundamental rights such as land ownership, access to water and fishing resources, and the integrity of treaties and reservation systems. The protesters understood that these legal provisions were not mere abstractions but were intricately woven into the cultural identity and economic sustenance on which indigenous peoples’ survival depended.[4] Though peaceful, the demonstration highlighted many unaddressed issues rooted in historical wrongdoings toward Indigenous peoples. It serves as evidence of an ongoing struggle against oppression in which multifaceted disparities persist, from gaps in educational attainment to unequal healthcare opportunities, largely because race-based discrimination endures today. The fact that such legislative proposals were still being considered as late as the 1970s emphasizes that even in modern times, Native Americans grapple with legislative threats that can perpetuate their marginalization, illustrating that these disparities remain relevant and pressing issues in the present day.

The disparities outlined in this section strongly demonstrate how the neglect and historical amnesia surrounding the forced removals have played a pivotal role in perpetuating the ongoing challenges faced by Native Americans. The Dawes Act of 1887 and the 1950s relocation policies, both driven by the goal of assimilation into mainstream American society, inflicted lasting damage on indigenous communities. These policies resulted in the loss of tribal lands, economic instability, cultural erosion, and social marginalization, creating a foundation of inequality that continues to shape Native American experiences. The subsequent civil rights era did not bring significant relief, as termination policies persisted, further undermining tribal sovereignty and land ownership. The “Longest Walk” protest of the 1970s highlighted the enduring disparities related to land, resources, and cultural identity that continue to plague Native communities. These long-neglected historical injustices have left a lasting imprint, contributing to the disparities in education, healthcare, and political representation that Native Americans still experience today and underscoring that acknowledging and addressing this historical legacy is crucial to addressing these ongoing challenges.

In the previous section, we delved into the significant disparities that Native Americans experience across various domains, including healthcare, education, and socioeconomic status. However, it is important to recognize that these disparities are not isolated incidents but rather part of a larger pattern of underrepresentation faced by Native Americans in contemporary society. This section aims to shed light on this critical issue and explore how Native Americans continue to be marginalized and overlooked within the systems that shape their lives. By examining aspects of underrepresentation such as inadequate political representation, limited media visibility, and exclusion from decision-making processes, we can gain a comprehensive understanding of the multifaceted challenges Native American communities face today. Through an analysis of these underrepresented perspectives, we can contribute to ongoing efforts toward achieving greater equity and inclusivity for all individuals in our diverse society, including Native Americans.

To start, the history of Native Americans has suffered from a consistent pattern of marginalization and misrepresentation in dominant societal narratives. This regrettable circumstance has produced prevalent misunderstandings, misconceptions, and knowledge gaps concerning the rich cultural heritage that defines each tribe’s unique traditions and experiences. This persistent, systemic underrepresentation is not limited to historical matters; it also encompasses the immediate threat facing endangered indigenous languages, rituals, and customary practices. As a consequence, there is a critical risk of losing the traditional elements integral to Native identity in modern times, making preservation efforts necessary to combat cultural erasure and to safeguard the ancient customs that uniquely define those who still maintain them today. It is also important to acknowledge that the underrepresentation of Native Americans in modern discourse and media coverage pertains not only to historical injustices but also to the contemporary challenges Indigenous communities face. These adversities include poverty, healthcare disparities, and political obstacles, which are often disregarded or downplayed in public discussions. The failure to adequately report on these matters impedes progress toward effective policy changes and support systems for Native American peoples who continue to suffer from systemic marginalization.

An emblematic example of this broader issue is the state of Pennsylvania’s lack of acknowledgment of the Native American communities within its borders. The historical denial of the existence of Native Americans in Pennsylvania serves as a noteworthy example of underrepresentation perpetuated by public institutions. This denial means Native American communities receive no official recognition or acknowledgment, rendering them largely invisible within the state’s records and narratives. The absence of official status places these groups at a disadvantage, lacking the legal rights, resources, and opportunities that come with full acknowledgment, and further contributes to their underrepresentation. Denying their cultural contributions creates an even greater disconnection from history, amplifying this invisibility in public awareness of Pennsylvania’s past.[5] It is also distressing to recount how societal pressure forced many members of Native American tribes in Pennsylvania to conceal their ancestry, leading to the erasure of cultural identity and creating yet another form of ongoing underrepresentation.

Moving forward, we will focus primarily on the political underrepresentation of Native Americans. However, it is important to understand that the underrepresentation of Native Americans is a multifaceted issue that transcends politics and extends deep into various aspects of American society. While the lack of political representation is a significant concern, it is just one facet of a broader pattern of systemic inequity and marginalization that Native American communities grapple with. Recognizing that underrepresentation is not confined solely to the political arena, it is crucial to adopt a thorough approach that addresses these interconnected issues.

Continuing on, the late 19th and early 20th centuries were a time of significant political transformation for the United States. As the nation grappled with industrialization, urbanization, and the expansion of democratic ideals, various marginalized groups strove to gain representation within the political arena. One group that often remains overlooked in this narrative, however, is Native Americans. Despite a rich cultural heritage deeply intertwined with the American landscape, Native Americans found themselves systematically excluded from meaningful participation in the political process. This section examines how Native Americans experienced political underrepresentation during this crucial period. By shedding light on this lesser-known aspect of American history, we can better understand the complexities surrounding democracy’s development and confront enduring issues related to Indigenous rights and representation.

To truly understand the development of the intense political underrepresentation of Native Americans, we have to step back in time, particularly to 1878, when the Washington Constitutional Convention convened. The Washington Constitutional Convention of 1878 stands as a pivotal moment in American history, particularly concerning the political underrepresentation of Native Americans. Held at a time when the nation was grappling with issues of equality and inclusion, the convention shed light on the deep-rooted injustices faced by indigenous communities. Its proceedings not only highlighted the systemic marginalization of Native Americans but also sparked conversations that would shape future legislation and advocacy efforts aimed at rectifying these longstanding disparities. During this era, Native Americans across the United States were consistently denied their basic rights to political participation. Discriminatory policies and practices had effectively silenced their voices and hindered their ability to influence decisions that directly impacted their lives. This disenfranchisement was acutely felt in Washington, where tribal nations faced numerous challenges in asserting their political power.

At the convention, the framers of Washington’s constitution made significant choices that directly affected Native American political involvement and representation. One such choice was excluding non-citizens from voting, which affected many Natives because their tribal affiliations rendered them ineligible for citizenship. This exclusion barred a large portion of Native Americans from participating until the Indian Citizenship Act was passed in 1924. Furthermore, although Indigenous representatives were present at the convention, they had no power to vote, so Native perspectives were inadequately considered during the drafting of the constitution, leading to underrepresentation within political processes across the region. In addition, the 1878 constitution confirmed the limited sovereignty of Washington’s indigenous tribes by placing them under a strict jurisdiction that could undermine self-governance, and voting restrictions imposed disproportionate property requirements on Natives, impeding fair opportunities for meaningful participation or political representation.[6] Notably, these decisions continue to shape local policies and governance today, with these communities still facing challenges in asserting their rightful political rights and maintaining sufficient influence over local affairs.

Furthermore, the convention of 1878 set off a significant period of political underrepresentation in the United States, characterized by a combination of legal, cultural, and socio-political factors that marginalized Native American voices in the national political landscape. As mentioned previously, after the passage of the Dawes Act in 1887, Native American lands were dramatically reduced through allotment, often leading to the loss of tribal communal ownership and self-governance. The imposition of citizenship and land ownership requirements for voting further disenfranchised Native Americans, as many were deemed unfit to vote due to their tribal affiliations or lack of individual property. For example, various state constitutions, such as North Dakota’s in 1889, introduced clauses demanding that Native Americans sever tribal ties to be eligible to vote. This effectively disconnected them from their tribal communities and cultural identities, affecting not only their involvement in tribal governance but also their political representation in state and national politics. Additionally, the federal government’s policies of forced assimilation and the establishment of Indian boarding schools, which aimed to eradicate Native cultures and languages, dealt a serious blow to the political representation of Native Americans. This cultural assault hindered Native Americans’ political participation by disconnecting them from their traditional forms of governance and communal decision-making. Native Americans were also denied equal opportunities for education and employment, which further limited their political influence.[7] The Indian Reorganization Act of 1934 represented a partial shift in federal policy, allowing tribes to reconstitute their governments and regain some measure of self-determination. This positive shift, however, would not be long lived; following the established pattern, it would be undermined by the changes of the 1940s.

The 1940s marked a critical turning point in Native American policy in the United States, heralding a shift that significantly deepened the political underrepresentation of Indigenous peoples. This era was characterized by a series of policy changes and legislative actions that not only neglected the voices and interests of Native American communities but actively marginalized them. During the 1940s, government policy toward Native Americans underwent a significant transformation: tribal sovereignty and autonomy were reduced as the government began considering terminating its responsibilities to these communities.[8] Influential members of Congress advocated assimilating Native Americans into mainstream society while seeking to shift decision-making authority away from them. Simultaneously, states pressured federal authorities to withdraw their obligations to indigenous populations. The overarching objective was economic and social rehabilitation; however, such policies often disregarded the unique cultural and political needs of these communities.[9] This era marked a pivotal shift in Native American policy that had long-lasting consequences for their political representation and self-determination.

This shift continued through the 1950s with the relocation policies discussed above. As the 1960s began, however, another shift occurred with the emergence of the African American Civil Rights movement. The Civil Rights Movement brought about a significant change in Native American political representation. Although initially aimed at addressing the rights of African Americans, its principles resonated with other marginalized groups, including Native Americans, who also sought equal treatment and non-discrimination. The passage of two legislative acts, the Civil Rights Act of 1964 and the Voting Rights Act of 1965, facilitated greater access to voting for minorities by eliminating discriminatory practices like literacy tests and poll taxes that had long plagued Native communities. Inspired by these changes, activists emerged from within local tribes seeking self-determination, which ultimately led to increased participation in politics and greater engagement at the local, state, and federal levels.

Although the Voting Rights Act of 1965 aimed to eradicate racial discrimination in voting and grant Native Americans full participation in elections, their communities still faced political underrepresentation due to various challenges. These obstacles included gerrymandering, voter identification requirements, and limited access to polling places on reservations and in rural areas, all of which undermined Native American voters’ capacity to exercise their democratic rights effectively.[10] Furthermore, a lack of representation in both state legislatures and at the federal level persisted through subsequent elections, underscoring an ongoing struggle toward inclusive politics that continues today. Even with the advancements made through the Voting Rights Act, these barriers demonstrate how deep-seated inequities continue to deny fair political representation to Indigenous peoples across America. A prime example is the campaigns of the 1972 election, which underscore the persistent lack of political representation for Native Americans even in the aftermath of the Voting Rights Act. Native American concerns remained marginalized as both presidential candidates, George McGovern and Richard Nixon, focused primarily on broader national issues like foreign policy and economic reform, neglecting specific Native American issues. The campaigns further highlight a historical pattern of unfulfilled promises and pledges of support, indicating that Native American voices were not adequately heard or represented in the political discourse. Nixon’s decision to reduce the Bureau of Indian Affairs (BIA) budget by nearly $50 million exemplifies a lack of commitment to addressing the unique challenges and needs of Native American communities. Additionally, the campaigns drew attention to the historical trust-based relationship between the United States and Native Americans, which has often gone unfulfilled and been marked by neglected promises.[11] Overall, despite the enactment of the Voting Rights Act of 1965, the political underrepresentation of Native Americans persisted, as demonstrated by the government’s ongoing failure to address their specific concerns and needs in the 1972 election campaigns.

Before bringing this study to an end, and in order to provide a more comprehensive picture, it is important to acknowledge the movement for Native American rights that began to develop toward the end of the period discussed here. Serving as a culmination of the enduring disparities and underrepresentation Native Americans had faced for centuries, the Red Power Movement developed in the late 1960s and 1970s. Emerging as a response to these long-standing injustices, the movement sought to address issues such as tribal sovereignty, land rights, cultural preservation, and political activism. The Red Power Movement played a crucial role in raising awareness and advocating for the rights of Native Americans in contemporary American society. While it paved the way for significant progress, Indigenous communities continue to grapple with ongoing challenges, including poverty, healthcare disparities, and political obstacles.[12] These disparities persist, emphasizing the need for continued advocacy and change. However, a comprehensive examination of the Red Power Movement and its contemporary implications lies beyond the scope of this study, which focuses primarily on the historical context and challenges Native Americans faced during earlier periods.

Ultimately, the underrepresentation, both political and general, detailed in this section shines a light on how the neglect and amnesia surrounding the forced removals of Native Americans have played a significant role in perpetuating the disparities and challenges these communities face. The historical narrative reveals how Native Americans have consistently been excluded from meaningful participation in various aspects of American society, including politics, despite their rich cultural heritage and contributions to the nation. This exclusion extends to the denial of basic rights, voting restrictions, and the erosion of tribal sovereignty. Even after legislative efforts like the Voting Rights Act of 1965, which aimed to ensure equal political participation, barriers such as gerrymandering and limited polling access persisted, demonstrating ongoing obstacles to representation. The focus of the 1972 election campaigns serves as a poignant example of how Native American concerns have been marginalized in national politics. This pattern culminated in the emergence of the Red Power Movement in the late 1960s and 1970s, which pressed the need for advocacy and change in response to deep-rooted disparities. This historical underrepresentation and discrimination reinforce the argument that acknowledging and addressing past injustices and the pattern of underrepresentation are crucial steps toward rectifying the ongoing challenges Native American communities face and achieving greater equity and inclusivity.

In conclusion, the involuntary displacement of Native American communities from their traditional lands during the late 19th century, and the subsequent ignorance about this period in U.S. history, have had significant repercussions that still impact Indigenous people today. The marginalization and lack of acknowledgment these events received have contributed to ongoing inequalities, limited representation, and hardships for Native Americans, and neglecting past injustices has continued a pattern of disregard for indigenous peoples’ rights that perpetuates further neglect and subordination.

Furthermore, the research has highlighted that government policies such as the Dawes Act and the relocation policies of the 1950s had profound and lasting effects on Native American communities. These policies, aimed at assimilation and urbanization, disrupted traditional ways of life, eroded tribal sovereignty, and contributed to economic instability. Their consequences, including high poverty levels, limited access to quality healthcare and education, and political underrepresentation, affect Native Americans even today. The research has also shed light on the political underrepresentation Native Americans have faced throughout history. From the exclusionary policies adopted at the Washington Constitutional Convention of 1878 to the harmful transformations in federal policy during the 1940s, Native Americans have been systemically oppressed within political processes. Even after the Voting Rights Act of 1965, obstacles such as gerrymandering and voter identification requirements continue to hinder their political influence.

Additionally, the research has highlighted how the underrepresentation of Native Americans extends beyond politics into various aspects of American society, including education, healthcare, employment, and media representation. Recognizing the impact of historical amnesia and underrepresentation makes clear that addressing current challenges and fostering development within Native American communities is essential. By shedding light on these historical oversights and their implications for present-day disparities, this research aims to contribute to broader efforts toward achieving equity and justice for Native American populations. Acknowledging Native Americans’ rich cultural heritage, enduring resilience, and ongoing struggle for rights and representation is a crucial step toward rectifying past injustices and building a more inclusive, equitable society for everyone.

Burt, Larry W. “Roots of the Native American Urban Experience: Relocation Policy in the 1950s.” American Indian Quarterly 10, no. 2 (1986): 85–99. https://doi.org/10.2307/1183982.  

“Dawes Act of 1887.” National Archives Catalog, 2016. https://catalog.archives.gov/id/5641587.

Jacobs, Michelle R. Indigenous Memory, Urban Reality: Stories of American Indian Relocation and Reclamation. New York: New York University Press, 2023.

Legislative Review 1, no. 12 (1972). https://jstor-org.rider.idm.oclc.org/stable/community.28145368.   

Minderhout, David, and Andrea Frantz. “Invisible Indians: Native Americans in Pennsylvania.” Human Organization 67, no. 1 (2008): 61–67. http://www.jstor.org.rider.idm.oclc.org/stable/44127040.

“Resolution Regarding Native Americans Adopted at the Washington Territory Constitutional Convention, July 17, 1878.” University of Washington Libraries, Special Collections Division, Washington Territory Records, Accession No. 4284-001, Box 3. Accessed September 26, 2023. https://search.ebscohost.com/login.aspx?direct=true&db=edsbas&AN=edsbas.73849FE4&site=eds-live&scope=site.

Treuer, David. The Heartbeat of Wounded Knee: Native America from 1890 to the Present. New York: Riverhead Books, 2019.

Tyler, S. Lyman. A History of Indian Policy. Washington, D.C.: United States Department of the Interior, Bureau of Indian Affairs, 1973.

Wolfley, Jeanette. “Jim Crow, Indian Style: The Disenfranchisement of Native Americans.” American Indian Law Review 16, no. 1 (1991): 167–202. https://doi.org/10.2307/20068694.

“‘Longest Walk,’ Protest March to Oppose Abrogation of All Native American Treaties and the Genocide of Indian People.” Accessed September 26, 2023. https://jstor.org/stable/community.34557616.


[1] “Dawes Act of 1887,” National Archives Catalog, 2016, https://catalog.archives.gov/id/5641587.

[2] Larry W. Burt, “Roots of the Native American Urban Experience: Relocation Policy in the 1950s,” American Indian Quarterly 10, no. 2 (1986): 85–99, https://doi.org/10.2307/1183982.

[3] Michelle R. Jacobs, Indigenous Memory, Urban Reality: Stories of American Indian Relocation and Reclamation (New York: New York University Press, 2023).

[4] “‘Longest Walk,’ Protest March to Oppose Abrogation of All Native American Treaties and the Genocide of Indian People,” accessed September 26, 2023, https://jstor.org/stable/community.34557616.

[5] David Minderhout and Andrea Frantz, “Invisible Indians: Native Americans in Pennsylvania,” Human Organization 67, no. 1 (2008): 61–67, http://www.jstor.org.rider.idm.oclc.org/stable/44127040.

[6] “Resolution Regarding Native Americans Adopted at the Washington Territory Constitutional Convention, July 17, 1878,” University of Washington Libraries, Special Collections Division, Washington Territory Records, Accession No. 4284-001, Box 3, accessed September 26, 2023.

[7] S. Lyman Tyler, A History of Indian Policy (Washington D.C.: United States Department of the Interior, Bureau of Indian Affairs, 1973).

[8] David Treuer, The Heartbeat of Wounded Knee: Native America from 1890 to the Present (New York: Riverhead Books, 2019).

[9] S. Lyman Tyler, A History of Indian Policy (Washington D.C.: United States Department of the Interior, Bureau of Indian Affairs, 1973).

[10] Jeanette Wolfley, “Jim Crow, Indian Style: The Disenfranchisement of Native Americans,” American Indian Law Review 16, no. 1 (1991): 167–202, https://doi.org/10.2307/20068694.

[11] Legislative Review 1, no. 12 (1972), https://jstor-org.rider.idm.oclc.org/stable/community.28145368.

[12] David Treuer, The Heartbeat of Wounded Knee: Native America from 1890 to the Present (New York: Riverhead Books, 2019).

New York’s Education Wars a Century Ago Show How Content Restrictions Can Backfire

Bill Greer

 Reprinted with permission from https://historynewsnetwork.org/article/185878

Matthew Hawn, a high school teacher for sixteen years in conservative Sullivan County, Tennessee, opened the 2020-21 year in his Contemporary Issues class with a discussion of police shootings.  White privilege is a fact, he told the students.  He had a history of challenging his classes, which led to lively discussions among those who agreed and disagreed with his views.  But this day’s discussion got back to a parent who objected.  Hawn apologized – but didn’t relent.  Months later, with more parents complaining, school officials reprimanded him for assigning “The First White President,” an essay by Ta-Nehisi Coates, which argues that white supremacy was the basis for Donald Trump’s presidency.  After another incident in April, school officials fired him for insubordination and unprofessional behavior.

Days later, Tennessee outlawed his teaching statewide, placing restrictions on what could be taught about race and sex.  Students should learn “the exceptionalism of our nation,” not “things that inherently divide or pit either Americans against Americans or people groups against people groups,” Governor Bill Lee announced.  The new laws also required advance notice to parents of instruction on sexual orientation, gender identity, and contraception, with an option to withdraw their children.

Over the past three years, at least 18 states have enacted laws governing what is and is not taught in schools. Restricted topics mirror Tennessee’s, focusing on race, gender identity, and sexual orientation.  In some cases, legislation bans the more general category of “divisive concepts,” a term coined in a 2020 executive order issued by the Trump administration and now promoted by conservative advocates.  In recent months, Florida has been at the forefront of extending such laws to cover political ideology, mandating lessons on how communism could lead to the overthrow of the US government.  Even the teaching of mathematics has not escaped Florida politics, with 44 books banned for infractions like using race-based examples in word problems.

In a sense the country is stepping back a century to when a similar hysteria invaded New York’s schools during the “Red Scare” at the end of World War I, when fear of socialism and Bolshevism spread throughout the US.  New York City launched its reaction in 1918 when Mayor John Francis Hylan banned public display of the red flag.  He considered the Socialist Party’s banner “an insignia for law hating and anarchy . . .  repulsive to ideals of civilization and the principles upon which our Government is founded.”

In the schools, Benjamin Glassberg, a teacher at Commercial High School in Brooklyn, was cast in Matthew Hawn’s role.  On January 14, 1919, his history class discussed Bolshevism.  The next day, twelve students, about one-third, signed a statement that their teacher had portrayed Bolshevism as a form of political expression not nearly so black as people painted it.  The students cited specifics Glassberg gave them – that the State Department forbade publishing the truth about Bolshevism; that Red Cross staff with first-hand knowledge were prevented from talking about conditions in Russia; that Lenin and Trotsky had undermined rather than supported Germany and helped end the war.  The school’s principal forwarded the statement to Dr. John L. Tildsley, Associate Superintendent of Schools, who suspended Glassberg, pending a trial by the Board of Education.

Glassberg’s trial played out through May.  Several students repeated the charges in their statement, while others testified their teacher had said nothing disrespectful to the US government.  Over that period, the sentiments of school officials became clear.  Dr. Tildsley proclaimed that no person adhering to the Marxian program should become a teacher in the public schools, and if discovered should be forced to resign.  He would be sending to everyone in the school system a circular making clear that “Americanism is to be put above everything else in classroom study.”  He directed teachers to correct students’ opinions contrary to fundamental American ideas. The Board of Education empowered City Superintendent William Ettinger to undertake an “exhaustive examination into the life, affiliations, opinions, and loyalty of every member” of the teachers union.  Organizations like the National Security League and the American Defense Society pushed the fight against Bolshevism across the country.

After the Board declared Glassberg guilty, the pace picked up.  In June, the city’s high school students took a test entitled “Examination for High Schools on the Great War.”  The title was misleading.  The first question was designed to assess students’ knowledge of and attitude toward Bolshevism.  The instructions to principals said this question was of greatest interest and teachers should highlight any students who displayed an especially intimate knowledge of that subject.  The results pleased school officials when only 1 in 300 students showed any significant knowledge of or leaning toward Bolshevism.  The “self-confessed radicals” would be given a six-month course on the “economic and social system recognized in America.”  Only if they failed that course would their diplomas be denied.

In September, the state got involved.  New York Attorney General Charles D. Newton called for “Americanization,” describing it as “intensive instruction in our schools in the ideals and traditions of America.”  Also serving as counsel to the New York State Legislative Committee to Investigate Bolshevism, commonly known as the Lusk Committee after its chairman, Newton was in a position to make it happen.  In January 1920, Lusk began hearings on education.  Tildsley, Ettinger, and Board of Education President Anning S. Prall all testified in favor of an Americanization plan.

In April, the New York Senate and Assembly passed three anti-Socialist “Lusk bills.”  The “Teachers’ Loyalty” bill required public school teachers to obtain from the Board of Regents a Certificate of Loyalty to the State and Federal Constitutions and the country’s laws and institutions.  “Sorely needed,” praised the New York Times, a long-time advocate for Americanization in the schools.  But any celebration was premature.  Governor Alfred E. Smith had his objections.  Stating that the Teacher Loyalty Bill “permits one man to place upon any teacher the stigma of disloyalty, and this even without hearing or trial,” he vetoed it along with the others.  Lusk and his backers would have to wait until the governor’s election in November when Nathan L. Miller beat Smith in a squeaker.  After Miller’s inauguration, the Legislature passed the bills again.  Miller signed them in May despite substantial opposition from prominent New Yorkers.

Over the next two years, the opposition grew.  Even the New York Times backed off its unrelenting anti-Socialist stance.  With the governor’s term lasting only two years, opponents got another chance in November, 1922, in a Smith-Miller rematch.  Making the Lusk laws a major issue, Smith won in a landslide.  He announced his intention to repeal the laws days after his inauguration.  Lusk and his backers fought viciously but the Legislature finally passed repeal in April.  Calling the teacher loyalty law (and a second Lusk law on private school licensing) “repugnant to the fundamentals of American democracy,” Smith signed their repeal.

More than any other factor, the experience of the teachers fueled the growing opposition to the Teachers’ Loyalty bill.  After its enactment, state authorities administered two oaths to teachers statewide.  That effort didn’t satisfy Dr. Frank P. Graves, State Commissioner of Education.  In April 1922, he established the Advisory Council on Qualifications of Teachers of the State of New York to hear cases of teachers charged with disloyalty.  He appointed Archibald Stevenson, counsel to the Lusk committee and arch-proponent of rooting out disloyalty in the schools, as one member.  By summer the Council had earned a reputation as a witch hunt.  Its activities drew headlines such as “Teachers Secretly Quizzed on Loyalty” and “Teachers Defy Loyalty Court.”  Teachers and principals called before it refused to attend.  Its reputation grew so bad that New York’s Board of Education asked for its abolition, and the President of the Board told teachers that they need not appear if summoned.

A lesson perhaps lies in that experience for proponents of restrictions on what can be taught today.  Already teachers, principals, and superintendents risk fines and termination for violating laws that are ambiguous about what is and is not allowed.  The result has been a chilling environment in which educators simply avoid controversial issues altogether.  Punishing long-time and respected teachers – like Matthew Hawn, whom dozens of his former students defend – will put faces on the fallout from the laws being passed.  How long before a backlash rears up, as it did in New York over Teachers’ Loyalty?


“Just a Few Thousand” – the Moral Questions Facing New Teachers

Mark Pearcy

I taught for nineteen years in public schools before joining higher education, and I can honestly say that I was never more shocked than I was in my second year, during a class in U.S. history. That year, I had a student named Chris. Likable, athletic (a pitcher on the baseball team), Chris wasn’t particularly gifted or hardworking, content with regular C’s and the occasional B. He didn’t talk much in class, except to girls, and rarely participated in class discussions. This changed when we started our unit on the Holocaust.

            All the students knew the basic history of the topic, some more informed than others—but all students were thoroughly engaged when we talked about the death camps, the experiments, and the usual round of questions: “Why didn’t more fight back?” “Did they ever catch the ones who did it?” “How many died?”

            It was the last question that brought Chris into the discussion. A student had called out the question, and another had spontaneously answered: “Millions.” Chris raised his hand; surprised, I called on him.

            “Actually, I heard it was different than that,” he said.

            “Well, that’s true,” I responded. Privately, I was delighted he was taking part—while the Holocaust is a grim subject, it usually serves the pedagogical purpose of getting quiet students off the sideline and into the argument. “The total number killed in the Holocaust was around eleven million. Jewish victims made up six million of those.”

“No, actually I heard it was less.”

            “Really?”

            “Yeah, I heard it was just a few thousand.” He nodded in response to my surprised look. “I heard they got the number ‘six million’ by adding up all the generations of kids that would have been born to the actual victims.”

I was stunned. This was not only patently, demonstrably absurd—it was also directly from the rhetoric of neo-Nazis and Holocaust deniers. Trying hard to maintain composure, I asked him: “Where did you hear that?”

            He shrugged again.  “My father.”

The question facing a new teacher like me was difficult—should I have corrected Chris? Should I have told him his father was flatly wrong? Or worse, should I have told him that his father was repeating nauseating rhetoric that had been zoned off for the worst, vilest purveyors of bigotry? Incidentally, to make matters more complicated, I knew Chris’ father—like his son, an amiable, likeable man, who certainly didn’t seem to me the type of person who would repeat wildly inaccurate beliefs about the Holocaust. But what should be done?

I corrected Chris. Quickly, and bluntly, in front of the class. “That’s wrong,” I told him, and proceeded to drill him with the facts and evidence in my corner. I’m certain there are many teachers that would dispute my decision, and say that dealing with Chris’ error in that manner was too direct; or, even more likely, that dealing with it at all, especially in the second year of my career, was skirting the possibility of professional suicide, especially today, when the pressure and scrutiny aimed at teachers is worse than ever before.

All this would be reasonable criticism. Certainly, I make no grand claims to courage, seeing as how I was teaching in an era of educator independence which, nowadays, we can seemingly only remember through the misty lens of nostalgia. My reaction was instantaneous precisely because I didn’t think about professional consequences. In fact, I had only one thought about Chris at the moment—“I can’t let him go on believing that.”

The lesson of Chris, and “just a few thousand,” is one of which new teachers should be aware. There is a moral component to what we do in the classroom, one that applies to all subject areas. When we teach, we not only want to foster academic skills and achievement, we want to help children develop into good people. This is a concept of which many teachers are leery, and it’s hard to blame them—since for many, both in the classroom and out, it can sound quite a lot like indoctrination. But when we, through our schools, produce adults who are incapable of critically analyzing the issues of the world and their own lives—that would be the product of indoctrination. Instead, our goal, as Nel Noddings puts it, is invested in “a commitment to building a world in which it is both possible and desirable for children to be good—a world in which children are happy” (Noddings, 2003, p. 2).

Certainly, helping students find a worthwhile and lucrative career is important, as is helping them to acquire the habits of mind that accompany any field of study. But all teachers, in all disciplines, will sooner or later face situations where students believe an idea, or adopt a behavior, which endangers the successful achievement of the goal we seek, a world in which children can be “good.”

But how do we know what that means, to be “good”? Isn’t this a matter of debate, and isn’t it dangerous for teachers to put themselves in the midst of such debate?

Of course. But that’s part of the job, as much as helping students learn to multiply and divide, or write clear sentences, or construct a logical argument. As teachers, we are representatives of a broader culture, one committed to a series of values that, as a community, we’ve deemed worth promoting and defending. Yes, there are gray areas, but far more often, the answers we have are clearer than we might want to accept.

Supreme Court Justice Potter Stewart, in the 1964 case Jacobellis v. Ohio, offered a succinct definition of obscenity—“I know it when I see it.” When confronted with morally impermissible views, teachers are a bulwark of civilization and morality—and though very often there may be debate about whether or not we should intervene, often (perhaps too often) there is no debate at all. We know it when we see it, and we should have the courage of our own convictions, and faith in the goals of our profession, to act.

References

Noddings, N. (2003). Happiness and education. Cambridge, UK: Cambridge University Press.

United States Supreme Court. (1964). Jacobellis v. Ohio. Retrieved from http://www.law.cornell.edu/supct/html/historics/USSC_CR_0378_0184_ZC1.html

New Jersey’s Slavery Past

Deborah P. Carter

The Howe House on Claremont Avenue in Montclair

Reprinted with permission from New Jersey Monthly, “Montclair’s Howe House a Testament to NJ’s Uncomfortable and Dark Past,” https://njmonthly.com/articles/towns-schools/history/montclair-howe-house/

In 1831, James Howe was deeded 6 acres and a small house on Claremont Avenue in Montclair. That house still stands. For many years, the worn clapboard house was known locally as the slave house. James Howe was owned by Nathaniel Crane. A member of one of the town’s founding families, Crane left the property to Howe (rumored to be his son) upon his death.

American slavery began in 1619 and eventually spread to all 13 colonies. By the late 1700s, Garden State neighbors like Pennsylvania, New Hampshire, and Massachusetts, followed 20 years later by New York, began adopting policies to abolish legal human bondage. New Jersey, however, was slow to outlaw the practice and adopted brutal laws restricting rights, including reading, writing, and ownership of firearms and property, for the nearly 12,000 enslaved Africans who lived here at the turn of the 19th century. After 185 years of slavery in New Jersey, in 1804 the state passed the Act for the Gradual Abolition of Slavery. The mandate required enslaved men born after July 4, 1804, to serve 25 years, and enslaved women, 20 years before manumission. By the start of the Civil War in 1861, records indicate slavery in New Jersey had dwindled, but remained legal. In 1866, the state ratified the Thirteenth Amendment to the Constitution, making it the last Northern state to end slavery.

Today, historically significant properties like the Howe House bear witness to New Jersey’s past. The nonprofit Friends of Howe House (FHH) are seeking historic landmark status and recently rallied support to purchase the building. “We are forming a steering committee and seeking community input to determine the next steps for Howe House,” says committee member Kimberly Latortue, adding turning it into a house museum is an option. The town “prides itself on being the epitome of diversity,” says Aminah Toler, a Montclair native and founding member of FHH. “We want to ensure that the Howe House remains to tell the story of the African American history that shaped this town and this country.”

IBM and Auschwitz: New Evidence

Edwin Black

Reprinted with permission from https://historynewsnetwork.org/article/1035

Edwin Black is author of IBM and the Holocaust: The Strategic Alliance Between Nazi Germany and America’s Most Powerful Corporation (Crown Publishers 2001 and Three Rivers Press 2002). This article is drawn from Mr. Black’s just-released and updated German paperback edition. The new edition includes the discovery of hard evidence linking IBM to Auschwitz. The evidence, detailed here, will be appended to his English-language editions at the next reprinting in the near future.

The infamous Auschwitz tattoo began as an IBM number. In August 1943, a timber merchant from Bendzin, Poland, arrived at Auschwitz. He was among a group of 400 inmates, mostly Jews. First, a doctor examined him briefly to determine his fitness for work. His physical information was noted on a medical record. Second, his full prisoner registration was completed with all personal details. Third, his name was checked against the indices of the Political Section to see if he would be subjected to special punishment. Finally, he was registered in the Labor Assignment Office and assigned a characteristic five-digit IBM Hollerith number, 44673.

The five-digit Hollerith number was part of a custom punch card system devised by IBM to track prisoners in Nazi concentration camps, including the slave labor at Auschwitz.

The Polish timber merchant’s punch card number would follow him from labor assignment to labor assignment as Hollerith systems tracked him and his availability for work, and reported the data to the central inmate file eventually kept at Department DII. Department DII of the SS Economics Administration in Oranienburg oversaw all camp slave labor assignments, utilizing elaborate IBM systems.

Later in the summer of 1943, the Polish timber merchant’s same five-digit Hollerith number, 44673, was tattooed on his forearm. Eventually, during the summer of 1943, all non-Germans at Auschwitz were similarly tattooed. Tattoos, however, quickly evolved at Auschwitz. Soon, they bore no further relation to Hollerith compatibility for one reason: the Hollerith number was designed to track a working inmate—not a dead one. Once the daily death rate at Auschwitz climbed, Hollerith-based numbering simply became outmoded. Soon, ad hoc numbering systems were inaugurated at Auschwitz. Various number ranges, often with letters attached, were assigned to prisoners in ascending sequence. Dr. Josef Mengele, who performed cruel experiments, tattooed his own distinct number series on “patients.” Tattoo numbering schemes ultimately took on a chaotic incongruity all their own as an internal, Auschwitz-specific identification system.

However, Hollerith numbers remained the chief method Berlin employed to centrally identify and track prisoners at Auschwitz. For example, in late 1943, some 6,500 healthy, working Jews were ordered to the gas chamber by the SS. But their murder was delayed for two days as the Political Section meticulously checked each of their numbers against the Section’s own card index. The Section was under orders to temporarily reprieve any Jews with traces of Aryan parentage.

Sigismund Gajda was another Auschwitz inmate processed by the Hollerith system. Born in Kielce, Poland, Gajda was about 40 years of age when he arrived at Auschwitz on May 18, 1943. A plain paper form, labeled “Personal Inmate Card,” listed all of Gajda’s personal information. He professed Roman Catholicism, had two children, and his work skill was marked “mechanic.” The reverse side of his Personal Inmate Card listed nine previous work assignments. Once Gajda’s card was processed by IBM equipment, a large indicia in typical Nazi Gothic script was rubber-stamped at the bottom: “Hollerith erfasst,” or “Hollerith registered.” Indeed, that designation was stamped in large letters on hundreds of thousands of processed Personal Inmate Cards at camps all across Europe. The Extermination by Labor campaign itself depended upon specially designed IBM systems that matched worker skills and locations with labor needs across Nazi-dominated Europe. Once a prisoner was too exhausted to work, he was murdered by gas or bullet. Exterminated prisoners were coded “six” in the IBM system.

The Polish timber merchant’s Hollerith tattoo, Sigismund Gajda’s inmate form, and the victimization of millions more at Auschwitz live on as dark icons of IBM’s conscious 12-year business alliance with Nazi Germany. IBM’s custom-designed prisoner-tracking Hollerith punch card equipment allowed the Nazis to efficiently manage the hundreds of concentration camps and sub-camps throughout Europe, as well as the millions who passed through them. Auschwitz’ camp code in the IBM tabulation system was 001.

Nearly every Nazi concentration camp operated a Hollerith Department known as the Hollerith Abteilung. The three-part Hollerith system of paper forms, punch cards, and processing machines varied from camp to camp and from year to year, depending upon conditions. In some camps, such as Dachau and Storkow, as many as two dozen IBM sorters, tabulators, and printers were installed. Other facilities operated punchers only and submitted their cards to central locations such as Mauthausen or Berlin. In some camps, such as Stutthof, the plain paper forms were coded and processed elsewhere. Hollerith activity, whether paper, punching, or processing, was frequently, but not always, located within the camp itself, consigned to a special bureau called the Labor Assignment Office, known in German as the Arbeitseinsatz. The Arbeitseinsatz issued the all-important, life-sustaining daily work assignments and processed all inmate cards and labor transfer rosters.

IBM did not sell any of its punch card machines to Nazi Germany. The equipment was leased by the month. Each month, often more frequently, authorized repairmen, working directly for or trained by IBM, serviced the machines on-site, whether in the middle of Berlin or at a concentration camp. In addition, all spare parts were supplied by IBM factories located throughout Europe. Of course, the billions of punch cards continually devoured by the machines, available exclusively from IBM, were extra.

IBM’s extensive technological support for Hitler’s conquest of Europe and genocide against the Jews was extensively documented in my book, IBM and the Holocaust, published in February 2001 and updated in a paperback edition. In March of this year, The Village Voice broke exclusive new details of a special IBM wartime subsidiary set up in Poland by IBM’s New York headquarters shortly after Hitler’s 1939 invasion. In 1939, America had not entered the war, and it was still legal to trade with Nazi Germany. IBM’s new Polish subsidiary, Watson Business Machines, helped Germany automate the rape of Poland. The subsidiary was named for IBM’s president, Thomas J. Watson.

Central to the Nazi effort was a massive 500-man Hollerith Gruppe, installed in a looming brown building at 24 Murnerstrasse in Krakow. The Hollerith Gruppe of the Nazi Statistical Office crunched all the numbers of plunder and genocide that allowed the Nazis to systematically starve the Jews, meter them out of the ghettos and then transport them to either work camps or death camps. The trains running to Auschwitz were tracked by a special guarded IBM customer site facility at 22 Pawia in Krakow. The millions of punch cards the Nazis in Poland required were obtained exclusively from IBM, including one company print shop at 6 Rymarska Street across the street from the Warsaw Ghetto. The entire Polish subsidiary was overseen by an IBM administrative facility at 24 Kreuz in Warsaw.

The exact addresses and equipment arrays of the key IBM offices and customer sites in Nazi-occupied Poland have been discovered. But no one has ever been able to locate an IBM facility at, or even near, Auschwitz. Until now. Auschwitz chief archivist Piotr Setkiewicz finally pinpointed the first such IBM customer site: a huge Hollerith Büro. It was situated in the I.G. Farben factory complex, housed in Barracks 18, next to German Civil Worker Camp 7, about two kilometers from Auschwitz III, also known as Monowitz Concentration Camp. Setkiewicz explains, “The Hollerith office at IG Farben in Monowitz used the IBM machines as a system of computerization of civil and slave labor resources. This gave Farben the opportunity to identify people with certain skills, primarily skills needed for the construction of certain buildings in Monowitz.”

By way of background, what most people call “Auschwitz” was actually a sprawling hell composed of three concentration camps, surrounded by some 40 subcamps, numerous factories, and a collection of farms in a surrounding captive commercial zone. The original Auschwitz became known simply as Auschwitz I and functioned as a diversified camp for transit, labor, and detention. Auschwitz II, also called Birkenau, became the infamous extermination center, operating gas chambers and ovens. Nearby Auschwitz III, known as Monowitz, existed primarily as a slave labor camp. Monowitz is where IBM’s bustling customer site functioned.

Many of the long-known paper prisoner forms stamped “Hollerith Erfasst,” or “registered by Hollerith,” indicated the prisoners were from Auschwitz III, that is, Monowitz. Now Auschwitz archivist Setkiewicz has also discovered about 100 Hollerith machine summary printouts of Monowitz prisoner assignments and details generated by the I.G. Farben customer site. For example, Alexander Kuciel, born August 12, 1889, was deployed in 1944 as a slave carpenter, skill coded 0149; his Hollerith printout is marked “Sch/P,” the Reich abbreviation for Schutzhäftling/Pole, meaning “Polish political prisoner.” The giant Farben facilities, also known as “I.G. Werk Auschwitz,” maintained two Hollerith Büro staff contacts, Herr Hirsch and Herr Husch. One key man running the card index systems was Eduard Müller, a fat, aging, ill-kempt man with brown hair and brown eyes. Some said, “He stank like a polecat.” A rabid Nazi, Müller took special delight in harming inmates from his all-important position in camp administration.

Comparison of the new printouts to other typical camp cards shows the Monowitz systems were customized for the specific coding Farben needed to process the thousands of slave workers who labored and died there. The machines were probably also used to manage and develop manufacturing processes and ordinary business applications. The machines almost certainly did not maintain extermination totals, which were calculated as “evacuations” by the Hollerith Gruppe in Krakow. At press time, the diverse Farben codes and range of machine uses are still being studied. It is not known how many additional IBM customer sites researchers will discover in the cold ashes of the expansive commercial Auschwitz zone.

A Hollerith Büro, such as the one at Auschwitz III, was larger than a typical mechanized concentration camp Hollerith Department. A Büro generally comprised more than a dozen punching machines, a sorter, and a tabulator. Leon Krzemieniecki was a compulsory worker who operated a tabulator at the IBM customer site at the Polish railways office in Krakow that kept track of trains going to and from Auschwitz. He recalls, “I know that trains were constantly going from Krakow to Auschwitz, not only passenger trains but cargo trains as well.” Krzemieniecki, who worked for two years with IBM punchers, card sorters, and tabulators, estimates that a punch card operation for so large a manufacturing complex as Farben “would probably require at least two high-speed tabulators, four sorters, and perhaps 20 punchers.” He added, “The whole thing would probably require 30-40 persons, plus their German supervisors.”

The new revelation of IBM technology in the Auschwitz area constitutes the final link in the chain of documentation surrounding Big Blue’s vast enterprise in Nazi-occupied Poland, supervised at first directly from its New York headquarters, and later through its Geneva office. Jewish leaders and human rights activists were again outraged.

“This latest disclosure removes any pretext of deniability and completes the puzzle that has been put together about IBM in Poland,” declared Malcolm Hoenlein, vice president of the New York-based Conference of Presidents of Major Jewish Organizations. “The picture that emerges is most disturbing. IBM must confront this matter honestly if there is to be any closure.”

Marek Orski, state historian of the museum at Poland’s Stutthof Concentration Camp, has distinguished himself as that country’s leading expert on the use of IBM technology at Polish concentration camps. “This latest information,” asserts Orski, “proves once more that IBM’s Hollerith machines in occupied Poland were functioning in the area of yet another concentration camp, in this case Auschwitz-Monowitz, something completely unknown until now. It is yet another significant revelation in what has become the undoubted fact of IBM’s involvement in Poland. Now we need to compile more documents identifying the exact activity of this Hollerith Büro in Auschwitz-Monowitz.”

Krzemieniecki is convinced obtaining such documents would be difficult. “It would be great to have access to those documents,” he said, “but where are they?” He added, “Please remember, I witnessed in 1944, when the war front came closer to Poland, that all the IBM machines in Krakow were removed. I’m sure the Farben machines were being moved at the same time. Plus, the Germans were busy destroying all the records. Even still,” he continues, “what has been revealed thus far is a great achievement.”

Auschwitz historians were originally convinced that there were no machines at Auschwitz and that all the prisoner documents were processed at a remote location, primarily because they could find no trace of the equipment in the area. They even speculated that the stamped forms from Auschwitz III were actually punched at the massive Hollerith service at Mauthausen concentration camp. Indeed, the Farben Hollerith documents had been identified some time ago at Auschwitz, but were not understood to be IBM printouts. That is, not until the Hollerith Büro itself was discovered. Archivists only found the Büro because it was listed in the I.G. Werk Auschwitz phone book, on page 50; the phone extension was 4496. “I was looking for something else,” recalls Setkiewicz, “and there it was.” Once the printouts were reexamined in the light of IBM punch card revelations, the connection became clear.

Setkiewicz says, “We still need to find more similar identification cards and printouts, and try to find just how extensive was the usage in the whole I.G. Farben administration and employment of workers. But no one among historians has had success in finding these documents.”

In the current climate of intense public scrutiny of corporate subsidiaries, IBM’s evasive response has aroused a renewed demand for accountability. “In the day of Enron and Tyco,” says Robert Urekew, a University of Louisville professor of business ethics, “we now know these are not impersonal entities. They are directed by people with names and faces.” Prof. Urekew, who has studied IBM’s Hitler-era activities, continued, “The news that IBM machines were at Auschwitz is just the latest smoking gun. For IBM to continue to stonewall and hinder access to its New York archives flies in the face of the focus on accountability in business ethics today. Since the United States was not technically at war with Nazi Germany in 1939, it may have been legal for IBM to do business with the Third Reich and its camps in Poland. But was it moral?”

Even some IBM employees are frustrated by IBM’s silence. Michael Zamczyk, for example, is a long-time IBM employee in San Jose, California, working on business controls. A loyal IBMer, Zamczyk has worked for the company for some 28 years. He is also probably the only IBM employee who survived the Krakow ghetto in 1941 and 1942. Since revelations about IBM’s ties to Hitler exploded into public view in February 2001, Zamczyk has been demanding answers, and an apology, from IBM senior management.

“Originally,” says Zamczyk, “I was just trying to determine if it was IBM equipment that helped select my father to be shipped to Auschwitz, and if the machines were used to schedule the trains to Auschwitz.”

Zamczyk started writing letters and emails, but to no avail. He could not get any concrete response about IBM’s activities during the Hitler era. “I contacted senior management, all the way up to the president, trying to get an answer,” states Zamczyk. “Since then, I have read the facts about IBM in Poland, about the railroad department at 22 Pawia Street in Krakow, and I read about the eyewitnesses. Now I feel that IBM owes me, as an IBM employee, an apology. And that is all I am looking for.”

Zamczyk was met by stony silence from IBM executives. “The only response I got,” he relates, “was basically telling me there would be no public or private apology. But I am still waiting for that apology and debating what to do next.”

Repeated attempts to obtain IBM’s reaction to the newest disclosure were rebuffed by IBM spokeswoman Carol Makovich. I phoned her more than a dozen times, but she neither responded nor granted me permission to examine Polish, Brazilian, and French subsidiary documents at the company’s Somers, New York archives. Nor has the company been forthcoming to the numerous Jewish leaders, consumers, and members of the media who have demanded answers.

At one point, Makovich quipped to a Reuters correspondent, “We are a technology company, we are not historians.”

Local History: The American Revolution in the Finger Lakes

Reprinted from New York Almanack based on an essay from the National Park Service’s Finger Lakes National Heritage Area Feasibility Study. https://www.newyorkalmanack.com/2023/09/american-revolution-finger-lakes/#more-98398

Initially, the Haudenosaunee Confederacy (Iroquois) claimed neutrality during the conflict between Britain and the colonists, seeing the disagreement as a civil war and valuing loyalty to their families and to their lands above all else. When the political discontent erupted into the American Revolutionary War, the member nations of the Haudenosaunee Confederacy split their support between the British and newly formed American forces. The majority of nations and individual members supported the British under the belief that those nations would be more likely to keep their relative independence and land under continued British rule, while the Oneida and Tuscarora backed the American Colonists.

As with many American families, allegiance was not clear-cut, and in some cases it was split on a person-by-person basis, which destabilized the clan-based society. What had started as a European civil war on North American soil soon turned the Confederacy against itself, undermining the social unity and political stability that the Six Nations had enjoyed for centuries. In 1778, Loyalists and members of the British-backed nations participated in destructive raids that crippled Continental forces and destroyed frontier settlements in New York and Pennsylvania. Fearing that the New York frontier would be pushed east to the Hudson River if decisive action was not taken, General George Washington ordered General John Sullivan to lead four brigades of men, a sizable portion of the Continental Army, on a scorched-earth campaign that would limit the Haudenosaunee’s ability to attack in the future.

Washington tasked Sullivan with launching a terror campaign to destroy the food supply of the Cayuga and Seneca Nations in the heart of the Finger Lakes and to reduce the Cayuga and Seneca’s forces. Smaller expeditions were tasked with destroying Seneca settlements in western Pennsylvania and Onondaga settlements in Central New York. General Sullivan and his second-in-command, General James Clinton, met in Tioga near the Pennsylvania-New York border and began their campaign by destroying the Munsee Delaware settlement of Chemung in present-day Chemung County. Instead of deploying the guerrilla tactics that had long served the Haudenosaunee well, Confederacy war chiefs and the meager British forces available to counterattack decided to risk a standing battle.

The Battle of Newtown on August 29, 1779, ended in a British and Indian retreat and destroyed morale among the British-allied Confederacy nations, who now chose to flee preemptively to other nearby settlements. For the next two weeks, Sullivan’s forces moved from Seneca Lake to Canandaigua Lake to Chenussio, a Seneca stronghold near present-day Leicester in Livingston County that included 128 multi-family longhouses. By the end of the campaign, Sullivan’s men had destroyed more than 40 Haudenosaunee villages, at least 160,000 bushels of corn, and countless pounds of stored vegetables and fruit, while suffering only 40 casualties.

While the American forces did not take Haudenosaunee prisoners, the Sullivan Campaign destroyed the nations’ capacity to wage war. By the end of September 1779, more than 5,000 nation members had arrived at the British Fort Niagara expecting food, clothing, and shelter in the face of their catastrophic losses at the hands of the Americans. Instead of lessening the threat to frontier settlements, the Sullivan Campaign increased the animosity of Natives and British alike, laying the groundwork for fierce fighting on the New York frontier as British-backed Indian raids continued through the 1780s.

Local History: The Great Depression in New York City

Reprinted from New York Almanack based on an article from the Blackwell’s Almanac, a publication of the Roosevelt Island Historical Society. https://www.newyorkalmanack.com/2023/09/great-depression-in-new-york-city/

As the 1920s advanced, the economy soared. But with that dramatic expansion came irrational exuberance and unchecked speculation: stock prices reached levels that had no basis in reality; margin purchases were rampant; banks handed out loans lavishly and imprudently; and overheated production resulted in a vast oversupply of goods. On Tuesday, October 29, 1929, it all came crashing down. This is the story of the Great Depression in New York City.

After an erratic week in which stocks, including blue chips, mostly declined, waves of panicked investors sold off their shares, driving the market ever downward. On that one day, now known as Black Tuesday, the market lost $14 billion in value; over the ensuing week, it erased another $30 billion, eventually suffering a staggering loss of 89.2% from its peak in early September.

Bank failures and business bankruptcies followed, presaging a decade of unprecedented economic hardship. New York City came to be viewed as “the symbolic capital of the Depression, the financial capital where it had started, and the place where its effects were most keenly felt.” Many residents lost their savings, their jobs and their homes. By 1932, half the city’s factories were closed, almost one-third of New Yorkers were unemployed (vs. one-quarter of the rest of the country and over one-half in Harlem), and some 1.6 million residents were on relief. Those who remained employed and therefore ineligible for the dole were often forced to take severe pay cuts.

At the time of the crash, under Mayor Jimmy Walker, there were few centralized municipal services that could be tapped for jobs or rescue: there was no central traffic, highway, or public works department; street-cleaning was a function of individual boroughs; there were five separate parks departments; unemployment insurance was non-existent; and, in the beginning, the Department of Public Welfare had no funds available. New York City, like most cities, was dependent on charitable institutions and almshouses to succor the poor, the homeless, and the hungry. Yet these organizations publicly admitted their inability to meet the heavy demands being made of them.

In March 1930, 35,000 out-of-work protesters marched toward City Hall as part of International Unemployment Day, organized by the Communist Party. They were met with violent attack by the New York Police Department. Several years later, anger boiled over among the city’s Black and Latino residents, who, in addition to being jobless, faced blatant discrimination, including exclusion from more than 24 of the city’s trade unions and rejection at public work sites. A furious Harlem crowd vandalized white-owned stores; some 4,000 individuals took part, inflicting over $2 million in damages, in events that resulted in 30 hospitalizations and several deaths. While an investigation into discriminatory practices was launched, little came of it and the situation continued unchanged.

Riots in New York flared and petered out. What didn’t peter out was the sheer fight to survive: for the hungry, the need to eat; for the homeless, the need to find shelter. Breadlines and soup kitchens were one aspect of the fight. People lined up daily in long, snaking queues outside bakeries or pantries to score a ration of day-old bread or thin soup. To hide their humiliation from neighbors, many would leave their homes dressed up as if they were going to work. Once on the line, they just stared straight ahead, refusing to interact with their downtrodden peers and, in fact, refusing to admit to themselves where they were.

Thousands evicted from their homes took to living in shacks in parks or on backstreets. As more and more homeless joined these camps, they grew into little shantytowns nicknamed “Hoovervilles” in condemnation of President Herbert Hoover’s failure to remedy the situation. The largest such settlement was located next to the Reservoir in Central Park. Ironically, many of the Hooverville men were construction tradesmen (bricklayers, stone masons, carpenters) who had helped build the luxury buildings surrounding the park and who now set to building their own shanties out of scavenged materials. Despite the skill and artistry with which these abodes were constructed, they were illegal; so both local and federal authorities regularly raided the settlements, destroying the shelters and scattering their inhabitants.

Conditions were dire, and pleading letters from city officials and residents alike piled up in the Mayor’s office. Finally, in October 1930, Jimmy Walker created the Mayor’s Official Committee for Relief of the Unemployed and Needy, and things started to happen. By November there were:

  • a City Employment Bureau, which obviated the problem of job-seekers having to pay private employment firms;
  • a stop to the eviction of poor families for rent arrears;
  • a large-scale investigation by the police to determine needs in all 77 precincts;
  • a windfall of contributions to unemployment relief from police and other city employees;
  • an expansion of city lodging facilities; and
  • a special Cabinet Committee to deal with questions of food, clothing and rent.

In the first eight months of its existence, the Committee raised some $1.6 million. Direct relief funds were paid to 11,000 families, while 18,000 tons of food, including kosher food, were given out to almost a million families. (Night patrolmen spent a good part of their shifts packing and wrapping these food parcels.) The money also paid for coal, shoes, and clothing. Another city agency, the Welfare Council, disbursed over $12 million for relief and emergency work wages. These funds too came from voluntary donations. Private citizens contributed; sports teams organized exhibition matches (for example, Notre Dame football vs. the New York Giants); and Broadway staged special benefit performances.

For a while spirits rose and hopes of normalcy returned. But by April 1931, it was clear that private welfare measures and one-off City actions could not keep up with the growing distress. Help was needed, and it came from a now-familiar individual, Franklin Delano Roosevelt, not as president but as Governor of New York State. Despairing of any constructive efforts by the federal government, Roosevelt, alone among governors in accepting responsibility for his constituents, declared: “upon the State falls the duty of protecting and sustaining those of its citizens who, through no fault of their own, find themselves… unable to maintain life.” By August 1931, foreshadowing elements of the future New Deal, a robust public works program was in effect to reduce unemployment. The state income tax was increased by 50%, and the Comptroller authorized the issuance of revenue bonds at both the state and local level. Some would say that New York City was in better shape than many other cities. Yet it was still on the critical list.

It wasn’t until 1932, when Walker resigned amid an investigation for graft and Herbert Hoover was voted out of office, that the way was paved for major innovations. Newly elected President FDR embodied the optimism of his catchy campaign song, “Happy Days Are Here Again.” Within a couple of years, he promulgated the historic, blockbuster New Deal and, working in close partnership with newly elected Mayor Fiorello LaGuardia, transformed both the country and the City. “New Deal” New York, then the most populous American city with almost seven million residents, was the single greatest beneficiary of the New Deal’s Works Progress Administration (WPA) in the entire U.S.

Under the WPA, more than a dozen federal agencies paid for the labor and materials to support hundreds of projects designed to put New Yorkers back to work. The New Deal built housing, schools, courthouses, roads, hospitals and health clinics, libraries, post offices, bridges, and highways. It was the impetus and money behind the Triborough Bridge, LaGuardia Airport, the Lincoln Tunnel, and the East River (FDR) Drive. It also gave the city an extensive system of recreational facilities, including swimming pools, playgrounds, ball fields, hiking trails, and parks.

But construction wasn’t the New Deal’s only beneficiary. FDR, Eleanor Roosevelt, and Harry Hopkins (head of the WPA) recognized that funding culture and the practitioners of culture was just as important. (“Hell, they’ve got to eat just like other people,” Hopkins is reported to have said.) So jobless artists, designers, craftsmen, and photographers were hired to embellish public spaces with murals and sculptures, while posters publicized other WPA programs, and illustrations, photos, and crafts found their way into newly opened galleries and respected museums. Playwrights, writers, actors, and singers were paid to create theatrical shows, even Yiddish and German theater. And out-of-work musicians and composers of all stripes (classical, folk, jazz, light opera) were employed to give concerts indoors and out. At the same time, New Deal legislation began strengthening workers’ rights by allowing them to organize, earn a minimum wage and, as discussed below, obtain unemployment compensation and sign up for Social Security.

When Frances Perkins, a fierce advocate of social justice and economic security, was tapped as Secretary of Labor, she brought a list of proposals for FDR’s approval. Among them were unemployment insurance and what she called “old age” insurance. Both of them knew that the development of such programs would encounter many obstacles, not the least of which would be challenges to their constitutionality.

Be that as it may, in 1935, the enabling legislation passed overwhelmingly and FDR authorized the establishment of unemployment insurance and Social Security. And in 1937, the Supreme Court affirmed the constitutionality of levying taxes to fund both programs. IBM won the bid to create the largest and most complicated data processing system ever built. It even designed novel equipment for the unprecedented task of enrolling some 30 million employers and workers, and registering their contributions into the Social Security system for later retirement payouts. According to Perkins, “Nothing [other than the Great Depression] would have bumped the American people into a social security system except something so shocking, so terrifying, as that depression.”

Above and beyond the homeless, 30% of the City’s housed population lived in deteriorating, squalid tenements, and other slums were deemed “unfit for human habitation.” The National Industrial Recovery Act of 1933 authorized the clearance of slums, the repair of salvageable structures, and the construction of low-cost housing. And the country’s very first “public housing,” a previously unheard-of concept, was built in New York under the newly formed New York City Housing Authority (NYCHA). The first three public projects were: First Houses, between First Avenue and Avenue A, from Second to Third Streets in the East Village; Williamsburg Houses, Scholes to Maujer Streets, Leonard Street to Bushwick Avenue, in Williamsburg, Brooklyn; and Harlem River Houses, Seventh Avenue to Macombs Place, Harlem River Drive, and 151st to 153rd Streets in Harlem. Their public ownership represented a radical step that both created jobs and sheltered people in up-to-date homes. By 1941, nine such projects had been developed in New York City, providing 11,570 units. They are all still with us, and the first three have been designated New York City landmarks.

The sheer range of educational programs implemented by the New Deal was remarkable. From kindergarten to college (for example, Hunter College, Brooklyn College, and the Merchant Marine Academy in the Bronx), new buildings allowed the student population to expand. Thousands of teachers were hired, and adjunct programs such as preschool, work-study programs for young people, and vocational classes for adults were instituted. Community education classes were held in libraries, settlement houses, local facilities, trade union halls, park buildings, and even on the radio. There was no end to what a willing individual could learn, including driving, English, home arts, visual arts, and new vocational skills. Much of the funding secured for New York City can be directly attributed to LaGuardia’s force of personality. According to Roosevelt, he would show up in Washington “and tell me a sad story. The tears run down my cheeks and tears run down his cheeks and the first thing I know he has wrangled another $50,000,000.”

For many City residents, lack of work had devolved into declining health, malnutrition, and increasing rates of infant mortality. New Deal funding produced new hospitals and neighborhood health clinics. The latter were often located in or near public housing developments and provided free medical and dental care, including immunizations, for all ages. The clinic doctors and nurses also visited homes and schools, and gave classes in healthy living. The clinics even sent housekeepers to help out where parents were ill. Access to regular health care was a first for many New Yorkers, and its effects were incontestable: decreased infant mortality, a drop in serious illness, and a decline in the suicides that so darkened the Depression years.

It took entry into the Second World War to completely obliterate the Great Depression. Tens of thousands of men went off to battle, while the rest of the country was galvanized into full employment by the war effort. Still, the New Deal, with its plethora of alphabet-soup agencies, was nothing short of miraculous. It carried the country and New York City through one of the most challenging eras in our history. It transformed the relationship of government to its citizens, embodying a dynamism that has strengthened New York through the years and continues to empower it to this day.

The Trumpist Supreme Court: Off the Rails of Democracy

Norman Markowitz

Rage and confusion over the recent Supreme Court decisions are sweeping the nation. The Roe v. Wade decision (1973) establishing women’s reproductive rights has been overturned. A New York State law prohibiting the carrying of concealed guns, passed in response to escalating shootings and deaths, has been declared unconstitutional. The Court has sharply reduced the regulatory powers of the Environmental Protection Agency, established in 1970, after decades of scientific research showing the dangers of climate change and global warming.

What is the logic behind this? There is a standard used in philosophy which should be applied to the Court’s recent decisions. Statements, or assertions, should be judged by their “validity and reliability.” Are they true statements in terms of logic, reason, and consistency (validity)? Is the evidence (facts, data) used to support the statement true (reliability)? I will use this standard to look at the Court’s rulings.

The Constitution was a political compromise among merchant capitalists, landlords, slaveholders, creditors, and debtors on a variety of issues: slavery, the payment of debts, and the regulation of trade. It cannot be interpreted like the Jewish Torah, the Christian Gospels, or the Muslim Koran, which are treated as sacred, unchanging texts. And the Supreme Court was given no right to strike down legislation passed by Congress or the directives of the president, since the Constitution did not grant the Court the power of judicial review.

However, that power was in effect taken by the Court in 1803 in a brilliant maneuver by Chief Justice John Marshall in Marbury v. Madison. The Court has maintained the power of judicial review for over two centuries, often adjusting its interpretations to major changes in society.

The representatives who drafted and approved the Constitution, as well as the former colonies/states which ratified it, all rejected the principle of universal suffrage. The leaders of the revolution associated the term “democracy” with mob rule. Property qualifications for voting in federal elections were the established rule. If one took original intent seriously, the Court would have the power to restore property qualifications for voting, since there is no constitutional amendment abolishing them, whereas there are constitutional amendments abolishing slavery and giving women the right to vote.

When the Constitution was drafted and enacted, English common law defined life as existing when a fetus could be felt moving or kicking in the mother’s womb, called “quickening.” If the mother claimed that the fetus had been aborted before this “quickening,” she was held harmless. Laws banning abortion and contraception, and barring pamphlets and manuals about both from the mails, were enacted at the state and federal levels in the late 19th century as part of a movement led by Anthony Comstock, organizer of the Society for the Suppression of Vice. These laws were part of a backlash against the growing movement for women’s civil rights, equality under the law, and the right to vote. The women’s rights/women’s liberation movement of the 1960s, following the path of the civil rights/Black liberation movement, led the successful campaign to repeal these laws, which finally resulted in Roe v. Wade, a century after they began to be enacted.

The Court’s decision invalidating a New York state law prohibiting the carrying of concealed handguns is also unreliable. Here the evidence is direct and incontrovertible. The Second Amendment to the Constitution states, “A well-regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.” But in English law and in colonial theory and practice, as Joshua Zeitz in an excellent analysis argues, the amendment never meant that all citizens had the right to bear arms. This right “was inextricably connected to the citizen’s obligation to serve in a militia and to protect the community from enemies domestic and foreign.” And “well-regulated militias” meant militias constituted by legitimate authorities, not private groups like the later KKK, Nazi storm troopers, or self-proclaimed state militias.

Zeitz makes the important point that James Madison, a major author of the Constitution and the Bill of Rights, had earlier drafted legislation in the Virginia legislature barring individuals from openly carrying and displaying guns, much like the New York State law that the Court has now declared unconstitutional. The purpose of the amendment was clearly to prevent a government from doing what Britain did in the aftermath of the Boston Tea Party: disperse the colonial legislature and its militia and in effect declare martial law. Also, the guns in question fired single “balls,” not bullets, and had very limited range and accuracy. Today’s AR-15 rifles, for example, used in recent mass shootings, have greater firepower and accuracy than the assault rifles used during World War II and the Korean War.

The Supreme Court’s other decisions, on the regulatory powers of the Environmental Protection Agency and on the right of a school employee to engage in religious action, are neither valid in their relationship to the Constitution nor reliable in their factual assertions. They repudiate more than a century of law and policy on the federal regulation of industry, and the post–Civil War 14th Amendment’s defense of the civil rights and liberties of citizens against infringement or denial by the states.

The Supreme Court and the judiciary have been the most conservative section of the federal government throughout most of U.S. history. The fact that the justices are not elected and can be removed only through impeachment, resignation, or death explains this.

The courts, both in the past and once more in recent decades, have used the Commerce Clause of the Constitution to declare unconstitutional legislation that regulates business and promotes social welfare. Beginning in the 1880s, they declared corporations “persons” to give them 14th Amendment protections from regulation and taxation by the states, and they have repeatedly used the 10th Amendment to support states’ rights.

The political nature of the Supreme Court from its very inception is indisputable. The Court, for example, represented the interests of the slaveholder class from the administration of George Washington (himself a slaveholder) up to the Civil War. But as the nation changed, industrial capitalism grew, and the anti-slavery movement broadened, the demands of the slaveholders and the actions of their Supreme Court became more extreme. The Dred Scott decision (1857), which in effect repealed the earlier restrictions on the expansion of slavery in the Western territories and endorsed legislation advanced by pro-slavery congresses and presidents, reflected this development. Almost as an afterthought, the slaveholder-dominated Supreme Court claimed that the authors of the Constitution had not intended any Black person, slave or free, to have the rights of an American citizen, an expression of “original intent” which both enraged and strengthened the increasingly militant anti-slavery national coalition.

With the defeat of the Confederacy, slavery was abolished through constitutional amendment in all the states, and the former Confederate states now under Union army occupation had to ratify the amendment to regain admission to the Union. With the support of President Andrew Johnson, a pro-Union former senator from Tennessee (and himself a former slaveholder), they did so while enacting labor codes that in effect declared the former slaves to be unemployed vagrants and returned them to the “custodial care” of their former owners.

In response to these acts, Thaddeus Stevens, Charles Sumner, and other militant anti-slavery leaders of the Republican Party proposed a second constitutional amendment to establish national citizenship and protect the civil rights and civil liberties of the nearly 4 million former slaves. They did this for two reasons. They feared that President Johnson would veto the civil rights legislation they were advancing in Congress. And even if they were able to override his veto, they feared that the Supreme Court, where the now former slaveholders remained a powerful force, would declare such legislation unconstitutional.

The 14th Amendment establishing national citizenship was passed, followed by the 15th, which extended the right to vote. However, the war was a victory for the industrial capitalists and their banker allies, who within a generation betrayed both the former slaves and the workers and farmers who saw Civil War policies like the Homestead Act and the creation of land grant colleges as advancing their class interests.

The Supreme Court and the federal judiciary in the aftermath of the Civil War fiercely defended the interests of “big business” against organized farmers, workers, state governments, and the federal government. In the 1880s, the Supreme Court in a series of decisions invalidated the civil rights acts of the Reconstruction era and gutted the 14th Amendment’s protection of citizenship rights from state government policies. States were permitted to ignore the Civil Rights Act of 1875, which banned exclusion and discrimination in public accommodations. That protection would only be restored by the Civil Rights Act of 1964, after nearly a century of de jure segregation.

In 1896, the Plessy v. Ferguson decision gave states the right to establish segregation by law, using as a cover the principle of “separate but equal,” although it was clear to everyone that the systematic exclusion of African Americans from public schools, public employment, public transportation, and commercial establishments was crudely unequal. The courts also endorsed state laws which denied the overwhelming majority of Black people the right to vote; the convict lease system, a form of slave labor for prisoners; and state “poll taxes,” which primarily discriminated against poor whites (in most places African Americans had already been disenfranchised).

At the same time, the Court in the 1880s took the 14th Amendment’s defense of the rights of “persons” and applied it to business and corporations, declaring state laws regulating business to be unconstitutional. Yet at the time the 14th Amendment was proposed and enacted, everyone understood that the “persons” referred to were the 4 million former slaves, no longer property under the law but not yet full citizens.

But this was just the beginning. An early, modest federal income tax (a surcharge on high incomes) was declared unconstitutional in the Pollock case. The Court also negated the Sherman Anti-Trust Act (1890) by declaring that the federal government and the states could regulate only commerce, not manufacture, under the Constitution. In an industrial society, this made regulation a farce.

Decades later, a constitutional amendment gave the federal government the right to levy income taxes, and Congress passed legislation that, to a limited extent, regulated trade and restructured the banking system. However, the Court routinely declared unconstitutional state laws protecting the right of workers to organize unions, providing for the health and safety regulation of workplaces, and setting minimum wages, as well as the 1916 federal law outlawing child labor.

It was not until the Great Depression of the 1930s, which saw the great upsurge of labor with the Communist Party playing a central role, that the New Deal government enacted the most important labor and social welfare legislation since the abolition of slavery and battled to compel the judiciary to accept these major reforms in the interests of the working class and the whole people.

The struggle for major judicial reform went back to the late 19th century. It sought to de-emphasize precedent, the “dead hand” of previous decisions, and to make the law respond to social changes and realities by connecting the “facts” as they existed in the present with past decisions under the law. Law professor Roscoe Pound and attorney Louis Brandeis were the champions of this approach to law, called “legal realism.” Brandeis especially popularized the doctrine in leading campaigns against corporate monopolistic price fixing and business corruption of public officials, which earned him the name “the People’s Attorney.”

He also developed a legal brief which incorporated social research (the Brandeis brief) in arguing cases. His fame in the early 20th-century Progressive movement led Woodrow Wilson to appoint him to the Supreme Court, where he joined with Justice Oliver Wendell Holmes to represent a minority that supported the regulation of industry, social legislation, and the defense of First Amendment civil liberties. Regarding civil liberties, the minority supported freedom of speech, assembly, and association unless, in Holmes’s language, there was a “clear and present danger” to society, and not just a “dangerous tendency” that certain acts might lead to others, which was the conservative position.

In the 1936 elections, Roosevelt campaigned against the old-guard Court and the “economic royalists” whom it protected, reviving the language of the American Revolution in his and the New Deal’s sweeping victory. Roosevelt then sought to appoint an additional justice for every sitting justice over the age of 70 who did not retire, which could have increased the Court’s size to 15 justices.

Conservatives fought back, wrapping the Court in the Constitution, attacking his court reorganization plan as “court packing.” In the Court fight, conservative Southern Democrats, including many who had worked behind the scenes against the New Deal like senators Tom Connally of Texas and Walter George of Georgia, along with the vice president, John Nance Garner, turned against Roosevelt. The weakened GOP let the Democrats carry the ball, but it was from this court fight that the informal conservative coalition of Southern Democrats and Republicans began to take shape.

Faced with the attack, the Court, which had four Coolidge/Hoover “Business of America is Business” conservatives, three urban liberals, and two moderate conservatives, shifted. In 1936 the Court had voted 5-4 against the New York minimum wage law. But in 1937 the Court upheld by a vote of 5 to 4 a similar Washington State minimum wage law, ruled in favor of the Wagner Act in the Jones and Laughlin Steel case, and upheld the Social Security Act and unemployment insurance. In these rulings, Justice Owen Roberts and Chief Justice Charles Evans Hughes sided with the Roosevelt administration.

By the end of 1937, as the old-guard conservatives began to retire, Roosevelt, though defeated in the reorganization fight, began to replace them with New Dealers, and by the time of the Pearl Harbor attack he had forged a New Deal majority. The new Court moved away from the old doctrines of constitutional original intent associated with the corporate-dominated courts of the post–Civil War era toward a view that the Court must change with changing economic and social conditions. Most of all, the Court retreated from its support for business and its defense of the absolute right of freedom of contract. Instead, a law was to be “presumed constitutional” on questions concerning economic power and government regulation; regulation came to be seen, as one decision put it, as regulation for the “public good.” Economic freedom was no longer the preferred freedom of the Court, and economic activity could no longer be dismissed as purely local and thus beyond regulation.

The Court also upheld the Fair Labor Standards Act’s minimum wage for all workers, where earlier courts had struck down state minimum wage legislation even for women. It refused to apply the anti-trust laws to unions, and while it outlawed the sit-down strike in 1939 (NLRB v. Fansteel Metallurgical Corp.), it did so in a decision that defended and established peaceful picketing.

At the same time, the Court under New Deal leadership began to develop a new doctrine of preferred freedoms, a doctrine that stressed the need to protect the rights of political dissenters and minorities. In late 1937, the Court declared unconstitutional state laws barring speech and assembly that had been used to convict and imprison Communist Party activists like Angelo Herndon in Georgia; it later explicitly defended religious freedom in the case of Jehovah’s Witnesses’ refusal to swear allegiance to the flag, and it revived the clear and present danger criterion to protect free speech and assembly. In 1938 the Court, for the first time since the end of Reconstruction, enforced some civil rights claims when it held that the state of Missouri, by not supplying legal education for Black students, had violated the separate but equal doctrine of Plessy (Missouri had offered instead to pay part of their tuition elsewhere). While the decision didn’t challenge segregation, it pressured Southern states to increase educational programs for African Americans under segregation.

In the Hague case, the Court declared unconstitutional a local Jersey City ordinance against picketing and demonstrations which had been used for mass arrests; subsequently, the protected activity was defined to mean peaceful picketing. In U.S. v. Carolene Products (1938), the majority ruled that the Court would no longer apply “heightened scrutiny” to economic legislation; however, in a footnote, Harlan Fiske Stone added that the Court was obligated to apply a “more exacting judicial scrutiny” in cases where laws or regulations contradicted the Bill of Rights or adversely affected minorities. The famous “footnote 4” had important implications for Bill of Rights freedoms for dissenters and minorities.

Following the recession of 1937 and the business-conservative counterattack and backlash of 1938, the New Deal was politically stalemated in Congress and without a clear program. However, by this time the labor and social welfare program was consolidated, at least for the short term. Further, the great fortress of conservative power protected from the electoral process, the Supreme Court, had been overthrown.

Democratic President Harry Truman’s appointees set back the Court’s support for civil liberties, especially in the 1950–51 Eugene Dennis case, where the Court upheld the convictions and imprisonment of the leadership of the CPUSA under the 1940 Smith Act. The appointments of Earl Warren as Chief Justice and William Brennan by Republican President Dwight Eisenhower, however, greatly strengthened the Court’s progressive majority at a time when Cold War policies moved Congress and the president to the right.

In the Brown decision (1954), the Court declared school segregation unconstitutional. In Yates and other decisions, the Supreme Court also curbed some of the worst aspects of state and federal anti-Communist policies, leading the FBI to establish its secret COINTELPRO program. In the later Gideon and Miranda decisions, the Court limited police power to interrogate and hold suspects without formally charging them and reading them their rights, including the right to legal representation or a court-appointed attorney. The Court also rejected early challenges to the Civil Rights Act of 1964 and the Voting Rights Act of 1965. Although Richard Nixon’s election to the presidency and his appointments moved the Court in a more conservative direction over time, Court decisions in the early 1970s effectively abolished the death penalty in the U.S. and, in Roe v. Wade, legalized abortion.

Even before Ronald Reagan gained the presidency, the Nixon-influenced Court began to move to the right. In 1976, the Court gave states the right to reestablish the death penalty (subsequently, the death penalty would be established at the federal level in a more extensive way than at the state level). In 1980, the Supreme Court upheld a 1976 amendment to Medicaid funding which barred the use of Medicaid funds for abortions, a cruel blow to the rights of low-income and poor women.

Over the following four decades, a series of decisions chipped away at civil rights and civil liberties; weakened the regulation of commerce, industry, and finance; and removed restrictions on the use of money in elections. The Court’s conservative majority became more militantly reactionary, destroying earlier compromise decisions brokered by conservatives. Donald Trump, who gained the presidency in large part because of the deeply undemocratic nature of U.S. politics, failed to implement his far-right domestic policies, which large numbers of Americans and people throughout the world saw as “neofascism.” However, his “success” in appointing three Supreme Court justices is now his “legacy,” in that they are doing what he failed to accomplish.

First, we must understand that a large majority of the people oppose these decisions, just as in 1857 and 1936 a large majority of the people opposed the Supreme Court’s pro-slavery Dred Scott decision and its decisions declaring New Deal regulatory and social legislation unconstitutional. The Republican Party mobilized opposition to the Dred Scott decision to win the 1858 congressional elections. More than 70 years later, the Democratic Party mobilized opposition to the conservative Court’s decisions to propel Roosevelt to an overwhelming victory in the 1936 national elections. The same kind of united opposition must be organized now. We must point out that the present Court has set the nation back and may continue to block progress regarding immediate issues such as inflation, health care, or the cost of energy and transportation. Were the government to attempt, for example, to establish price controls, create a national public health system, and expand public transportation, the Court would not be on the people’s side.

The trade union movement, all civil rights and women’s rights organizations, and all environmental organizations must mobilize supporters and communities throughout the nation to vote against the Republican senators and congresspeople who over decades have created this judiciary. Such an electoral victory is necessary but not in itself sufficient. Many today are calling for an expansion of the Court; Congress and the president have the power to do that, since the number nine is not in the Constitution. We should also begin to think about a larger expansion of the federal judiciary itself. Since the 1980s, the conservative Federalist Society has advanced the doctrine of original intent as a cover for restoring Court rulings that opposed federal regulation of business and social welfare legislation. A government committed to restoring what the Court represented in the New Deal–Great Society era should just as actively appoint to the bench attorneys who support those positions.

Finally, the question of judicial review itself could be formally ended by Congress and the president. As was contended earlier, judicial review is not a part of the Constitution, and there is no evidence that the Constitutional Convention intended it to be established. The Court has acted to strike down and take away from the people major social protections and rights. As such, its power of judicial review can and should be taken away.

Era 5 – Engaging High School Students in Global Civic Education Lessons in U.S. History

New Jersey Council for the Social Studies

www.njcss.org

The relationship between the individual and the state is present in every country, society, and civilization. Questions about individual liberty, civic engagement, government authority, equality and justice, and protection are important for every demographic group in the population. In your teaching of World History, consider the examples and questions provided below, which should be familiar to students from the history of the United States, applied here to the experiences of others around the world.

These civic activities are designed to present civics in a global context, as civic education happens in every country. The design is flexible: a teacher can use a single activity, have students explore multiple activities in groups, or leave an activity as a lesson for a substitute teacher. The lessons are free, although a donation to the New Jersey Council for the Social Studies is greatly appreciated. www.njcss.org

The development of the industrial United States is a transformational period in our history. The United States became more industrial, urban, and diverse during the last quarter of the 19th century. The use of fossil fuels for energy led to mechanized farming; railroads changed the way people traveled and transported raw materials and goods; the demand for labor drew one of the largest migrations in world history to America; and laissez-faire economics provided opportunities for wealth while increasing the divide between rich and poor. During this period, local governments were challenged to meet the needs of large populations in urban areas regarding their health, safety, and education.

The Patrons of Husbandry, or the Grange, was founded in 1867 to advance methods of agriculture, as well as to promote the social and economic needs of farmers in the United States. The financial crisis of 1873, along with falling crop prices, increases in railroad fees to ship crops, and Congress’s reduction of paper money in favor of gold and silver, devastated farmers’ livelihoods and caused a surge in Grange membership in the mid-1870s. Both at the state and national level, Grangers gave their support to reform-minded groups such as the Greenback Party, the Populist Party, and, eventually, the Progressives.

The social turmoil engulfing Western farmers was mainly a result of their complete dependence on outside markets for selling their produce. This meant that they had to rely on corporately owned railroads and grain elevators to transport their crops. To make matters worse, “elevators, often themselves owned by railroads, charged high prices for their services, weighed and graded grain without supervision, and used their influence with the railroads to ensure that cars were not available to farmers who sought to evade elevator service.” Illinois’s new constitution of 1870 allowed the state to set maximum freight rates, but the railroads simply refused to follow the mandates of the state government.

The Grangers became political by encouraging friends to elect only those officials who shared their views. Furthermore, since Republicans and Democrats alike had already been bought off by corporations looking to curry favor with the government, the Grangers vowed to create their own independent party devoted to upholding the rights of the general populace.

On Independence Day, 1873 (known as the Farmers’ Fourth of July), the Grangers read their Farmers’ Declaration of Independence, which cited all of their grievances and in which they vowed to free themselves from the tyranny of monopoly. The Supreme Court’s decision in Munn v. Illinois (1877) held that businesses of a public nature could, in accordance with the federal Constitution, be subject to state regulation. Several pieces of state legislation embodying this principle, collectively known as the Granger Laws, were passed. Unfortunately, many of these laws were later repealed.

Though the organization did not last, the Grange demonstrated the effects that monopolies have on society: monopoly power subjugated farmers to its whims and ultimately drove them to take collective action against it.

The Yellow Vests Protest in France

Donning the now-famous fluorescent waistcoats that are mandatory in French cars, the Yellow Vests staged 52 consecutive weeks of protests against economic hardship, mounting inequality, and a discredited political establishment. They manned roundabouts across the country night and day, took to the streets every Saturday from November 17 on, and at their peak in December even stormed the Arc de Triomphe in central Paris, amid scenes of chaos not witnessed since May ’68. The movement left an indelible mark on France, forcing the government into billions of euros of tax breaks.

“The picture that emerged was that of a movement made up largely of workers and former workers in a situation of financial insecurity, with relatively few unemployed,” said Gonthier. Yellow Vests were present across France but were strongest in small towns and rural areas. They came from all walks of life, though the liberal professions were underrepresented, while small business owners and employees, craftspeople, and care workers formed the bulk of the movement. About two-thirds of respondents earned less than the average wage, and a slightly higher percentage registered as having a “deficit of cultural resources and social links.” This in turn “conditioned the way they defined themselves, and helped distance them from traditional social movements,” Gonthier added.

Another defining feature was the high proportion of women, who made up roughly half the Yellow Vests, whereas social movements traditionally tend to be male-dominated. Gonthier said this reflected the significant mobilization of women in care work, “most notably hospital workers from a public health sector that is plunging deeper into crisis”. They included a high number of single mothers who couldn’t go out and protest, or were scared away by the police’s heavy-handed response, but who supported the movement online.

  1. Are monopolies harmful to a growing economy or are they a necessary ‘evil’?
  2. Is it inevitable that an oppressed people will revolt and attempt to destroy that which has kept them down?
  3. How can governments best address poverty and inequality?
  4. If a significant minority feels oppressed, do they have a right to overthrow their government by protest or violence if they cannot get satisfaction through the process of elections?
  5. Do you support the Grangers, Yellow Vests, both or neither?

The Granger Revolution

The Grange Movement

A Brief Essay on the Grange Movement

Who are France’s Yellow Vest Protestors and What do they Want?

The Yellow Vest Movement Explained

Activity #2: The Munn and Wabash Railroad Cases in Illinois and the Trans-Siberian Railroad in Russia

Route of the Wabash Railroad in the Midwest

The Wabash Railroad Company went bankrupt and was sold, and the new Toledo and Wabash Railroad Company was chartered on October 7, 1858. The Wabash and Western Railroad was chartered on September 27 and acquired the Indiana portion on October 5. On December 15, the two companies merged as the Toledo and Wabash Railway, which in turn merged with the Great Western Railway of Illinois. Decades later, in Wabash, St. Louis & Pacific Railway Co. v. Illinois (1886), the Supreme Court reasoned that the right of continuous transportation from one end of the country to the other is essential in modern times to the freedom of commerce, and that the Commerce Clause of the U.S. Constitution gives Congress the power to regulate commerce among the states and with foreign nations. If Illinois, or any other state through whose territory such traffic passed, were permitted to impose regulations concerning price, compensation, or taxation, or any other restrictive regulation, it would be harmful to commerce between the states.

The Trans-Siberian Railroad in Russia

Trans-Siberian Railroad crossing a large river in Siberia

Construction of the longest railway in the world was launched in April 1891, and a first section was completed in 1894. Three years later, in November 1897, the 772-km section between Vladivostok and Khabarovsk opened. The Central Siberian Railway from the River Ob to Irkutsk, with a length of 1,839 km, was built by 1899. The construction involved more than 100,000 workers, including prisoners, and the work was carried out by hand using shovels, axes, crowbars, and saws. Despite the many challenges of the taiga, mountains, wide rivers, deep lakes, and floods, the tracks were laid with amazing speed, around 740 km per year.

  1. Does protecting the technology that makes commerce efficient justify federal regulation taking precedence over state regulation?
  2. If a corporation is losing money, do they have a right or obligation to raise rates to become profitable?
  3. Do authoritarian governments have an advantage or disadvantage in the construction of large infrastructure projects?

Consolidation of Railroads in Four States

The Supreme Court Strikes Down Railroad Regulation

Interstate Commerce Act (1887)

Construction of the Trans-Siberian Railroad

History of the Trans-Siberian Road

Activity #3: The Panic of 1893 in the United States and Hyperinflation in Germany

No crisis of the Cleveland presidencies exceeded the magnitude of the financial panic that gripped the nation at the start of his second term in 1893, and which presaged a depression that still lingered when he left office in March 1897.

The Constitution grants Congress the power “to coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures” (Article I, Section 8). It likewise grants Congress the power “to borrow money on the credit of the United States” (Article I, Section 8, Clause 2). And the 14th Amendment, Section 4, states that “the validity of the public debt of the United States, authorized by law … shall not be questioned.”

In the century preceding 1893, Congress experimented with two central banks, a national banking system, laws regulating so-called “wildcat banks,” paper money issues, legalized suspension of specie payments, and fixed ratios of gold and silver. Gold and silver rose to prominence as the predominant monies of the civilized world because of their scarcity and value. Under the direction of Alexander Hamilton, the federal government adopted an official policy of bimetallism and a fixed ratio of 15 to 1 in 1792.

In 1875, the newly-formed National Greenback Party called for currency inflation through the issuance of paper money tied, at best, only minimally to the stock of specie. The proposal attracted widespread support in the West and South where many farmers and debtors joined associations to lobby for inflation, knowing that a reduction in the value of the currency unit would alleviate the burden of their debts.

When President Cleveland assumed office on March 4, 1893, the Treasury’s gold reserve stood at a historic low of $100,982,410, slightly above the $100 million minimum required to protect the supply of greenbacks. The Panic of 1893 began when the gold reserve fell below $100,000,000. Stocks fell, factories closed, and many businesses went bankrupt. Unemployment rose to 9.6%, nearly three times the rate for 1892, and by 1894 it was almost 17%. At Cleveland’s urging, Congress repealed the Sherman Silver Purchase Act in support of gold as a stable currency.

Cleveland’s position on sound money was not supported by his own Democratic Party, but it was ultimately vindicated: the Gold Standard Act of 1900 established a stable gold standard, and economic growth followed.

Hyperinflation in Germany

Under the Treaty of Versailles, Germany was forced to make reparations payments in gold-backed Marks. On June 24, 1922, Walter Rathenau, the foreign minister, was assassinated. In January 1923, the French sent their army into the Ruhr to enforce their demands for reparations, and the Germans were powerless to resist. More than inflation, the Germans feared unemployment. A cheaper Mark, they reasoned, would make German goods cheap and easy to export, and they needed the export earnings to buy raw materials abroad. Inflation kept everyone working.

Price increases became dizzying, and menus in cafés could not be revised quickly enough. A student at Freiburg University, for example, ordered a cup of coffee priced at 5,000 Marks. He had two cups, but when the bill came, it was for 14,000 Marks: the price had risen while he drank. When the 1,000-billion Mark note came out, few bothered to collect the change when they spent it. By November 1923, with one dollar equal to one trillion Marks, the breakdown was complete. The currency had lost meaning and value.
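For students who want to work the numbers, here is one reading of the café bill; it assumes (the anecdote does not say) that the first cup was billed at the original menu price: 5,000 Marks for the first cup plus 9,000 Marks for the second equals the 14,000-Mark bill, and (9,000 − 5,000) ÷ 5,000 = 0.80, a price increase of 80% in the time it took to drink one cup of coffee.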

Although the currency was worthless, Germany was still a rich country, with mines, farms, factories, and forests. The new Rentenmark was backed by mortgages on the land and bonds on the factories, even though the factories and land could not actually be turned into cash or used abroad. One Rentenmark was set equal to one trillion of the former Marks, and people who had held the old currency lost their savings and homes.

Questions:

  1. Is a sound currency policy, where the dollar is backed by gold or some other form of credit, always the best policy for governments to follow?
  2. Does the financial debt of a country matter if its economy is growing? Does it matter in times of war or the recovery from a natural disaster?
  3. In a financial crisis or depression, does everyone suffer equally, or are some more affected than others?
  4. Which problem should the government address first: high unemployment of 8% or rising inflation of 5%? Why?
  5. Is foreign investment in a country’s economy necessary to maintain a balance of payments?
  6. Based on the U.S. Constitution, is the debt of our government limited or unlimited?

The Panic of 1893 and the Election of 1896

Price Stability and the Fed

The Weimar Republic

The German Hyperinflation, 1923

Hyperinflation in Germany

Activity #4: Laissez-faire Economics in the Gilded Age

Historians often call the period between 1870 and the early 1900s the Gilded Age. This was an era of rapid industrialization, laissez-faire capitalism, and no income tax. Captains of industry like John D. Rockefeller and Andrew Carnegie made fortunes, and they preached “survival of the fittest” in business.

By the late 1800s, however, monopolies, not competing companies, increasingly controlled the production and prices of goods in many American industries.

Workers’ wages and working conditions were unregulated. Millions of men, women, and children worked long hours for low pay in dangerous factories and mines. There were few work-safety regulations, no worker compensation laws, no company pensions, and no government social security.

Starting in the 1880s, worker strikes and protests increased and became more violent. Social reformers demanded a tax on large incomes and the breakup of monopolies. They looked to state and federal governments to regulate capitalism. They sought legislation on working conditions, wages, and child labor.

Yet government and business were never fully separate: railroad builders accepted grants of land and public subsidies in the 19th century; industries facing strong competition from abroad appealed for higher tariffs; American agriculture benefited from land grants and government support; and state governments helped finance canals, railroads, and roads.

It is difficult to separate government intervention, regulation, and laissez-faire in American history, and likely even more difficult to find the proper balance between government and free enterprise. Perhaps the most notable departures from laissez-faire in this era were the land grants to railroads, the regulation of the rates railroads could charge, the mandating of time zones, and the allowance of paper currency.

  1. Why have limited government and laissez-faire economics remained popular in the United States, both historically and today?
  2. Should the federal government regulate education and schools, or should this be left to local and state governments?
  3. Does laissez-faire economics bridge or widen the income gap between the social classes?
  4. Who benefits the most from increasing government regulation?

Laissez-faire Economics in Practice

Social Darwinism and Laissez-faire Capitalism in America

Defending the Free Market from Laissez-faire?