The Exploitation of Enslaved Women in 18th-Century Colonial America

Logan Stovall

Logan Stovall is an eighth grade student at Montclair Kimberley Academy in Montclair, NJ

The 18th century represents a dark period in American history when the institution of slavery thrived and the exploitation of enslaved Black women flourished. The cruel realities endured by Black women during this time were not only a consequence of their enslavement but were magnified by both their race and gender, perpetuating a cycle of inequality and suffering. Beyond physical captivity, these women endured a complex oppression that involved grueling labor and made them victims of sexual violence. The harshness of this oppression becomes evident when one reflects on how the clothing worn by enslaved Black women served as a physical manifestation of their fragile existence. The clothes they wore were not just rags or pieces of fabric used to cover their bodies; they represented a system that dehumanized and abused them. During the 18th century, an enslaved Black woman’s gender and race profoundly shaped the way she lived and survived in an illiberal society. Understanding the exploitation of enslaved Black women during the American colonial era requires a closer look at the sweat of their daily labor, the sexual abuse they endured, and the clothing they wore that bound them to such a harsh life.

Before any analysis of the exploitation of enslaved Black women can be made, one must first consider that the racial stereotypes and discriminatory practices directed at them during the colonial era were the underlying causes of their mistreatment. The widely accepted racist ideas of Antebellum white slaveholders led them to view their enslaved people as both biologically and culturally inferior. Because of their understanding of the social hierarchy at this time, slaveholders often whipped and physically mistreated enslaved women under their supervision.[i] In addition to the racist beliefs they held, slaveholders also created various stereotypes about enslaved Black women. One such popular stereotype was the “Mammy” caricature, which depicted enslaved Black women as enjoying their servitude, being physically unattractive, and fit only to be domestic workers.[ii]

In contrast to the “Mammy” caricature, slave owners also created a more promiscuous stereotype of enslaved Black women: the “Jezebel” figure. The Jezebel caricature was used during slavery to justify a slaveholder’s objectification and sexual exploitation of enslaved Black women.[iii] The Mammy and Jezebel caricatures, along with various other derogatory stereotypes, heavily influenced how the rest of the White population during the Antebellum period perceived and treated Black women. Sadly, these caricatures endured for decades after colonial times.

With racial stereotyping forming the underlying cause of discrimination against Black women, a significant number of White slave masters subjected Black women to harsh labor conditions. Enslaved women were often forced to work in the fields from sunrise to sunset, where they endured physical and emotional abuse. On larger farms and plantations, for example, women were forced to perform tasks like hoeing and ditching entire fields, the most exhausting and monotonous forms of fieldwork.[iv] Slaveholders also held enslaved women accountable for cleaning communal areas like stables and expected them to spread manure as fertilizer.[v] Moreover, slave owners frequently questioned how much time off enslaved women needed to adequately care for their families and children. When not offered any downtime by their slaveholders, enslaved women had to bring their children with them to the fields and strap them to their backs as they worked tirelessly.[vi]

Black women’s exploitation extended beyond the fields. In many instances, the labor performed by enslaved women was prolonged and complicated. Many enslaved women began working for slaveholders at a very young age, and there was little free time to rest, given that most women worked for their master five to six days a week. This work included keeping the owner’s home clean, cooking food, and washing clothes.[vii] In short, enslaved women were expected to work tirelessly, both in the fields and in the house. Slave masters did not care about the well-being of the women they enslaved and exploited them for their free labor. For Black women, slavery in the southern colonies meant long days performing menial, exhausting tasks, sometimes in the hot, baking sun. After working prolonged, hard days for the slaveholders, these women had to care for their own families, which was often a physical and mental challenge due to the absence of time to rest. When enslaved women did not meet their enslaver’s expectations for their work, they were often sexually or physically assaulted as a form of punishment. Unfortunately, this possibility became a reality for many enslaved Black women.

Indeed, as the slave population in America grew through the importation of slaves, enslaved Black women came to be viewed primarily as reproducers of a valuable labor force rather than merely a part of the labor force. The sexual exploitation of Black women extended from the sexual gratification of their White slaveholders to the reproduction of offspring who would expand the workforce. Though slave owners valued enslaved women as laborers, they were also well aware that female slaves could be used to reproduce new labor (more children who would grow up enslaved) by continuing their role as full-time mothers.[viii] This presented slaveholders with a dilemma, because West African women usually had some prior agricultural experience (such as growing tobacco and rice) that could also be put to use in the fields for the slaveholders’ benefit.[ix]

In 1756, Reverend Peter Fontaine of Charles City County, Virginia, stated that Black females were “far more prolific than…white women.” This form of racial stereotyping made enslaved women extremely vulnerable to physical assault.[x] Many white enslavers raped Black women for sexual pleasure, as well as for their ability to produce children who would become slaves and ultimately increase their owners’ wealth. Rather than perpetuating the stereotype that enslaved Black women were unattractive and fit only to be domestic workers, slaveholders now fed into the stereotype that Black women were promiscuous and desirable for the reproduction of enslaved children who could be used or sold. This form of physical exploitation was pervasive throughout the Antebellum South.

In addition to labor and sexual exploitation, clothing was another form of exploitation that enslaved Black women were forced to endure. While these women often knitted or otherwise made beautiful garments for White women and their children, the fabrics that enslaved Black women wore themselves offered minimal protection from the weather and had to be inexpensive and easy to make.[xi] Their clothing was so cheap in quality that it often fell apart or tore within weeks. As a result, enslaved women often borrowed clothing from one another or even stole clothing from the slave master’s house. They did this to give themselves or their families warm, durable garments and, sometimes, to blend into the free population. Oppressors often made enslaved women wear poor, ragged clothing to symbolize a Black woman’s low status and to cultivate racial stereotypes depicting Black women as inferior. Indeed, one reason enslaved women stole White people’s clothes was to appear as free Black people with higher status.[xii]

Despite being subjected to clothing exploitation, many enslaved women nevertheless tried to remain connected to their former culture by wearing West African garments. Enslaved women working in slaveholders’ homes were expected to cover their heads with lightweight white caps, which other members of the household also wore. To continue the West African tradition, however, many enslaved women also chose to wear brightly colored head wraps that were secured with knots and tucks.[xiii] They also sometimes wore cowrie shells in their hair, which were highly valued and had served as a form of currency in West African trade. Cowrie shells also appeared in spirit bundles and as parts of clothing and jewelry, implying their use as amulets.

Black women not only wore these West African garments to remain connected with their former cultures, but also as a form of resistance against enslavement.[xiv] Enslaved Black women despised their status as slaves but were able to feel proud of and connected to their West African heritage when they wore their cultural headdresses. The significance of these garments likely gave Black women a feeling of strength and empowerment even as they endured frightening abuse from their enslavers.

During the 18th century, the exploitation of enslaved Black women on account of their gender and race greatly influenced the way they survived in a prejudicial society. Enslaved women were exploited in numerous ways and were expected to address the needs of others to the detriment of caring for themselves and their families. They worked extremely hard, both in the house and in the field, and did whatever they were commanded to do while withstanding both physical and emotional abuse. They were often raped and physically assaulted by their masters as punishment and as a means to increase profit from human labor. Still, enslaved Black women were able to resist these acts of exploitation non-violently and create their own peace by wearing and displaying garments distinct to their West African culture. Given all that these enslaved women endured, we should respect and admire their ability to overcome such incredible hardships.

Smithsonian National Museum of African American History and Culture. “Cowrie Shells and Trade Power.” Accessed November 15, 2023. https://nmaahc.si.edu/cowrie-shells-and-trade-power#:~:text=Europeans%20in%20the%2016th%20century,at%20their%20use%20as%20amulets.


[i] LDHI, “Hidden Voices: Enslaved Women in the Lowcountry and U.S. South,” LDHI, accessed November 27, 2023, https://ldhi.library.cofc.edu/exhibits/show/hidden-voices/enslaved-womens-work.

[ii] LDHI, “Hidden Voices,” LDHI.

[iii] LDHI, “Hidden Voices,” LDHI.

[iv] Jennifer Hallam, “The Slave Experience: Men, Women & Gender,” Slavery and the Making of America, accessed November 27, 2023, https://www.thirteen.org/wnet/slavery/experience/gender/history.html.

[v] Emily West, Enslaved Women in America: From Colonial Times to Emancipation (Lanham: Rowman & Littlefield Publishers, 2017), 29.

[vi] West, Enslaved Women, 28.

[vii] LDHI, “Hidden Voices,” LDHI.

[viii] West, Enslaved Women, 28.

[ix] West, Enslaved Women, 29.

[x] West, Enslaved Women, 31.

[xi] Daina Ramey Berry and Deleso A. Alford, eds., Enslaved Women in America: An Encyclopedia, enhanced Credo ed. (Santa Barbara, CA: Greenwood, 2012), 34-35.

[xii] Katherine Gruber, ed., “Clothing and Adornment of Enslaved People in Virginia,” Encyclopedia Virginia, last modified December 7, 2020, accessed November 5, 2023, https://encyclopediavirginia.org/entries/slave-clothing-and-adornment-in-virginia/.

[xiii] Gruber, “Clothing and Adornment,” Encyclopedia Virginia.

[xiv] Smithsonian National Museum of African American History and Culture, “Cowrie Shells and Trade Power,” accessed November 15, 2023, https://nmaahc.si.edu/cowrie-shells-and-trade-power#:~:text=Europeans%20in%20the%2016th%20century,at%20their%20use%20as%20amulets.


Berry, Daina Ramey, and Deleso A. Alford, eds. Enslaved Women in America: An Encyclopedia. Enhanced Credo ed. Santa Barbara, CA: Greenwood, 2012.

Gruber, Katherine, ed. “Clothing and Adornment of Enslaved People in Virginia.” Encyclopedia Virginia. Last modified December 7, 2020. Accessed November 5, 2023. https://encyclopediavirginia.org/entries/slave-clothing-and-adornment-in-virginia/.

Hallam, Jennifer. “The Slave Experience: Men, Women & Gender.” Slavery and the Making of America. Accessed November 27, 2023. https://www.thirteen.org/wnet/slavery/experience/gender/history.html.

LDHI. “Hidden Voices: Enslaved Women in the Lowcountry and U.S. South.” LDHI. Accessed November 27, 2023. https://ldhi.library.cofc.edu/exhibits/show/hidden-voices/enslaved-womens-work.

West, Emily. Enslaved Women in America: From Colonial Times to Emancipation. Lanham: Rowman & Littlefield Publishers, 2017.

Sally Hemings’ Legacy of Freedom and Motherhood

Ms. Aquino is an eighth grade student at Montclair Kimberley Academy in Montclair, NJ

Sally Hemings led an extraordinarily complex life, yet her story inspires thousands of women, myself included. Despite these complexities, she fought against the notion of becoming just another enslaved individual in her family’s generational cycle. Sally sought to change the trajectory of her children’s lives, offering them opportunities beyond enslavement. Instead of securing her own freedom, she made a selfless choice to secure a promise of freedom for her future children—a decision that stands out as a remarkable act of heroism. Sally Hemings’s life, her sacrifices, and her ability to persuade Thomas Jefferson to make her that promise were acts of heroism toward her children. Her story is a testament to the profound strength of a mother’s love and the power of quiet rebellion against an oppressive system.

Born into slavery, Sally began her journey as the maid and caretaker of Polly, Thomas Jefferson’s daughter. Over time, she developed a close relationship with Polly, to whom she may also have been an aunt.[1] During their time in Paris, where Sally accompanied Polly in her studies, Thomas Jefferson expressed reservations about Sally’s ability to care for his daughter because she was so young, only fourteen at the time. Although she was well trained in caring for people, Jefferson wrote that she was “wholly incapable of looking after” his daughter and could not do it “without some superior to direct her.”[2] Despite Jefferson’s doubts about her abilities, Sally gracefully navigated the unfamiliar Parisian landscape and spent twenty-six months in Paris, where she also reunited with her brother James. She contracted smallpox but received proper care, and she was compensated for her work. Sally also learned French during her stay, though her literacy in both languages remains uncertain.[3]

In Paris, at the age of fourteen, Sally became involved in a sexual relationship with Thomas Jefferson, whose wife had died in 1782, and the resulting pregnancy shifted her trajectory dramatically. While accompanying Jefferson’s daughter Polly to Paris, Hemings was caught in a complex web of power dynamics and Jefferson’s unspoken desires, and this fateful encounter forever altered her life. Madison Hemings, Sally Hemings’s son, stated that his mother became Mr. Jefferson’s concubine in France. Because slavery was not legal in France, Sally could have been considered a free person there. Torn between the possibility of freedom in Paris and the promise of a better future for her children, Sally made a heart-wrenching choice. She negotiated an extraordinary deal: freedom for her future children at age twenty-one, sacrificing her own chance at escape. In the face of unimaginable hardship, this selflessness began her quiet rebellion; she did not try to negotiate freedom for herself.[4] The “relationship” continued after their return to Monticello, and it became the subject of public comment: “It is well known that the man whom it delighted the people to honor, keeps, and for many years past has kept, as his concubine, her name is Sally.”[5] Sally was thus openly described as Jefferson’s concubine, his mistress. In the eyes of others, she was just another woman.

After returning to Monticello with Jefferson and his daughters in 1789, Sally became a household servant and lady’s maid.[6] Madison Hemings recalled, “It was her duty, all her life which I can remember, up to the time of father’s death, to take care of his chamber and wardrobe, look after us children and do such light work as sewing.” In addition to serving as a maid, Sally was responsible for Jefferson’s chamber and wardrobe and for sewing. Upon returning to Monticello, her relationship with Jefferson, though shrouded in secrecy, was an undeniable reality. It was well known throughout Monticello, and some of Jefferson’s friends and even political colleagues knew about it. The relationship did not come as a surprise to people; it was, unfortunately, common for white men, and especially enslavers, to have sexual relations with enslaved women. Society could ignore Thomas Jefferson and Sally Hemings as long as he kept the relationship discreet, so he never acknowledged the rumors, and the “relationship” continued.[7] It lasted until Jefferson died on July 4, 1826.

Sally bore Jefferson six children, each carrying the weight of their father’s legacy and the burden of slavery. Only four survived to adulthood: Harriet, Beverly, Madison, and Eston. Despite her duties as a servant and as Jefferson’s “concubine,” Sally nurtured her children with unwavering love and a fierce determination to see them free. Madison Hemings said, “She gave birth to four others, and Jefferson was the father of all. They were Beverly, Harriet, Madison (myself), and Eston – three sons and one daughter.”[8] The oldest, Beverly Hemings, worked as a carpenter for the duration of his enslavement; he was also a musician who played the violin.[9] Harriet Hemings was born a few years after Beverly, in 1801, and grew up enslaved, working as a spinner. Madison, born after Harriet, left the fullest account of his mother’s life and of her relationship with Jefferson. Lastly, there was Eston Hemings, the youngest son, who learned woodworking and was granted freedom in 1829. After Jefferson’s death, his daughter Martha allowed Sally to leave the plantation to live with her younger sons, Madison and Eston, in Charlottesville, Virginia. Madison and Eston took their mother in with open arms and loving hearts. They initially passed as white for the U.S. Census, but Sally was later recorded as a “free mulatto.” Sally lived freely with her sons until she died in 1835.[10]

Throughout her life, Sally Hemings made decisions that transformed her children’s lives and that continue to resonate with women at large. Her selfless act in Paris, negotiating freedom for her unborn children, remains an inspiration to women and their own children. Like many other enslaved women, Sally Hemings bore children fathered by her owner. In the context of an era when enslaved women lacked legal rights,[11] Sally’s story reflects the harsh reality of exploitation. How one characterizes the dynamic between her and Jefferson must take age and consent into consideration: Sally was fourteen, and Jefferson was about forty years old.[12] Enslaved women were often raped and sexually harassed without being able to speak up or say no. Despite these challenges, she rose above her circumstances and stands as a source of motivation for women across the globe.

Sally Hemings’s story is a personal triumph and a beacon of hope for all who fight against injustice. Pulitzer Prize-winning historian Annette Gordon-Reed said, “Though enslaved, Sally Hemings helped shape her life and the lives of her children, who got an almost 50-year head start on emancipation, escaping the system that had engulfed their ancestors and millions of others. Whatever we may feel about it today, this was important to her.” The measures Sally took to ensure emancipation for her children were significant and display the unconditional love she had for them. For a mother to surrender her own freedom, her only chance to escape, for her children was selfless. Her quiet defiance, her unwavering love for her children, and her ability to negotiate freedom within the confines of slavery continue to inspire generations of women and mothers. Her life, her sacrifices, and her ability to persuade Thomas Jefferson to make her a promise were acts of heroism toward her children. While she inspires many women worldwide, her most significant impact was on her children, who proclaimed the great things she did for them, and they were not the only ones who spoke highly of her. Her story carries historical significance and profound lessons about the human spirit’s capacity for resilience and love. She was a woman who defied the odds and shaped the destiny of her children, leaving behind a legacy that continues to resonate with many women and children today.

Hemings, Madison. “Sally Hemings.” Monticello.org. Accessed November 9, 2023. https://www.monticello.org/sallyhemings/.

“The Memoirs of Madison Hemings.” PBS.org. Accessed December 17, 2023. https://www.pbs.org/wgbh/pages/frontline/shows/jefferson/cron/1873march.html.

Adams, William Howard. The Paris Years of Thomas Jefferson.

Gordon-Reed, Annette. The Hemingses of Monticello.

“Life Story: Sally Hemings.” Nyhistory.org. Accessed December 14, 2023. https://wams.nyhistory.org/building-a-new-nation/american-woman/sally-hemings/#:~:text=Sally%20lived%20in%20Paris%20long,together%20when%20they%20reached%20adulthood.

Thorson, David. “Beverly Hemings.” Monticello.org. Accessed December 17, 2023. https://www.monticello.org/research-education/thomas-jefferson-encyclopedia/beverly-hemings-2/.

The University of Virginia. “The Hemings Family.” Monticello.org. Accessed November 6, 2023. https://www.monticello.org/slavery/paradox-of-liberty/enslaved-families-of-monticello/the-hemings-family/.


[1] William Howard Adams, The Paris Years of Thomas Jefferson, 220.

[2] Adams, The Paris Years of Thomas Jefferson, 220.

[3] Madison Hemings, “Sally Hemings,” Monticello.org, accessed November 9, 2023, https://www.monticello.org/sallyhemings/.

[4] Hemings, “Sally Hemings,” Monticello.org.

[5] Hemings, “Sally Hemings,” Monticello.org.

[6] Hemings, “Sally Hemings,” Monticello.org.

[7] “Life Story: Sally Hemings,” nyhistory.org, accessed December 14, 2023, https://wams.nyhistory.org/building-a-new-nation/american-woman/sally-hemings/#:~:text=Sally%20lived%20in%20Paris%20long,together%20when%20they%20reached%20adulthood.

[8] Hemings, “Sally Hemings,” Monticello.org.

[9] David Thorson, “Beverly Hemings,” Monticello.org, accessed December 17, 2023, https://www.monticello.org/research-education/thomas-jefferson-encyclopedia/beverly-hemings-2/.

[10] “Life Story: Sally Hemings,” nyhistory.org.

[11] Hemings, “Sally Hemings,” Monticello.org.

[12] Hemings, “Sally Hemings,” Monticello.org.


The Social Cost of Deindustrialization: Postwar Trenton, New Jersey

Patrick Luckie

Studying local history is something that is often overlooked and underestimated in social studies classrooms around the country. Think about it: do you have any memory of learning about your own local community in a coordinated school or social studies effort? Big ideas like imperialism, global culture, and other themes of the past and present usually take precedence over learning about one’s own local history in high school. As part of my undergraduate senior research project at Rider University, I grappled with this fact and produced a short study of my own local history, which I used to inform my instruction in the classroom. This article presents that research and ends with a short analysis of how my research project on local history has affected my instruction at Ewing High School and how it can change the way we think about teaching local history in American high school social studies classrooms.

These powerful words were written by Dr. Jack Washington, a teacher of social studies in Trenton public schools for over 40 years and author of The Quest for Equality: Trenton’s Black Community 1890-1965, which traces racial struggle and movements for equality over the city’s history. Trenton’s uniqueness, as Washington describes it, is a product of its deep history, rooted in the American Revolution, World War II, and the Civil Rights Movement of the 1960s. Trenton was once a manufacturing powerhouse, home to multiple industries that forged the urban landscape of the state’s capital and produced thousands of union jobs for its inhabitants. These included the mighty John A. Roebling’s Sons Company, which aided in the creation of the Brooklyn Bridge and whose factory in West Chambersburg served as a symbol of innovation and opportunity for decades. Trenton’s pottery industry was also one of the largest and most successful in the nation, alongside its iron, steel, rubber, and textile companies. Together, these industries provided enough stable employment and pay to support a rapidly growing population of mostly first- and second-generation European immigrants from Italy, Ireland, Germany, Poland, and Hungary, to name a few. Trenton’s manufacturing prowess was best showcased in 1917 with the first lighting of the famous “Trenton Makes, The World Takes” sign on the Lower Trenton Bridge, a symbol which still stands today in 2023.

The “golden age” of the city, as historian John T. Cumbler describes it, lasted from around 1850 to 1920, when Trenton established itself as one of the manufacturing capitals of the nation.[2] Almost perfectly situated between two of America’s largest cities, New York and Philadelphia, Trenton industrialists used the city’s strategic geographic location along the Delaware River to tap into large markets and supply the massive manufacturing needs of the East Coast. Trenton at this time was truly a symbol of the American dream, and people flocked to the city in search of opportunities. By 1920, the population of the city surpassed 119,000 people, and it was among the most densely populated places in the state of New Jersey.[3]

The first signs of the city’s decline came with the weakening of its labor movement. By the 1920s, the age of mechanization had begun, and the economic shift from factory work to mechanized manufacturing began weakening labor unions over time. Workers’ unions and cooperation between owners and workers had been central to the functioning of the local economy and were the glue that bound the city together. Over time, businesses could no longer maintain the standards of work they had previously upheld, and conditions within the city started to slowly deteriorate. From 1910 to 1920 Trenton underwent its largest leap in population within a decade, and shortly thereafter it began experiencing some of its greatest economic struggles. Plants began relocating outside of the city, and unionized jobs became more and more difficult to attain. Economic historians have grappled with this shift in the post-war era, claiming “US corporations aggressively sought to break free of expensive union contracts and to seek out ways to pay lower wages and allied social costs in order to increase profits.”[4] This trend persists throughout this study. With great increases in population and the changing state of the local and national economy, Trenton suffered meaningful losses in employment and manufacturing output.

With the Great Depression beginning in 1929 and the outbreak of the Second World War in 1939, Trenton retreated back into manufacturing and away from addressing the labor issues that had marked its initial decline. The war meant a massive nationwide mobilization of industry toward fueling the war effort, and the wartime economy temporarily revitalized the city. Roebling’s Sons employed droves of new workers, opportunities for overtime became more available, unions strengthened, workers’ pay went up, and the largest wave of black migrants in the city’s history began making their way to Trenton beginning in the 1940s.[5] These migrants came to Trenton and other cities as part of the Great Migration, the movement of millions of African Americans, predominantly from the rural southern states to the urban North and Midwest, between 1910 and 1970.

This temporary boom did not yield long-term progress for Trenton in the post-war period. During the 1950s, many of the city’s largest industries began relocating outside the city limits, and the economy could not adequately support its largest-ever population of over 129,000 people.[6] In 1952, Trenton’s most prominent employer, Roebling’s Sons, was sold to the Colorado Fuel and Iron Company, which over the next decade cut its employment numbers in Trenton and relocated its major manufacturing and business centers outside the city limits. This was the fate of many of the city’s major industries, which sold their shares to larger corporations after WWII, leaving the fate of the city’s economy in the hands of interests that had little to no connection to it. The rubber, steel, iron, and pottery industries which had defined the city of Trenton and produced its “golden age” became shadows of their former selves, and the physical conditions of the city reflected this change. Over time, thousands of industrial jobs were lost; the population of Trenton dropped by 13,382 people from 1950 to 1960 and by an additional 9,381 people the following decade.[7] Population decline continued to the year 2000, and the population stabilized between 80,000 and 90,000 in the 21st century.

This study seeks to answer two fundamental questions: 1) What were the major effects of deindustrialization on Trenton, NJ in the decades immediately following WWII? 2) How were these effects felt by the people living within the city at this time? In answering these questions, this study will provide a lens through which race and class come to the forefront of the discussion. Trenton’s decline overlaps with the migration of thousands of African Americans to the city in search of economic opportunities. This demographic shift was the largest in the city’s history and was met not with opportunity but with inequality and increased racial tension. The major effects of deindustrialization on Trenton, NJ in the post-war period were economic destabilization, movement to the suburbs, and increased racial tensions between white and black Trentonians. Each subsection of this work will examine these effects individually as well as their overall impact on life in Trenton. It is important to recognize that this movement away from manufacturing and its effects were not phenomena restricted to certain areas or regions. Rather, it was a national trend which all rust belt cities like Trenton grappled with in the 20th century. In addition to deindustrialization broadly, the age of mechanized labor, the shifting of the U.S. economy toward greater support for large corporations, and the social movements of the 1960s all played extremely important roles in shaping American cities in the post-war era.

Secondary source literature on the decline of U.S. cities in the post-WWII period falls into the fields of American urban, economic, and social history. One of the most influential works on these subjects is historian Thomas J. Sugrue’s The Origins of the Urban Crisis: Race and Inequality in Postwar Detroit, which examines the many ways in which American cities began to decline following WWII, with specific focus on racial inequality and division. In his work, Sugrue shows that Trenton, like Detroit and other rust belt cities of the time, was caught up in the loss of hundreds of thousands of manufacturing jobs nationwide due to the changing state of the U.S. economy and the lack of government spending allocated to Northern cities.[8] These conditions radically transformed urban environments into almost unrecognizable versions of their industrial heights. Sugrue explores the connections between suburbanization, demographic change, and the racial attitudes of northern whites to produce an all-encompassing case study of the decline of Detroit. At the heart of his argument is the claim that racial segregation and inadequate political responses to signs of crisis determined the fate of the city. The importance of this historical research cannot be overstated. Before this book was originally published in 1996, the stories of Detroit and other American cities that suffered from the consequences of deindustrialization and racial division in the post-war period were largely untold. The Origins of the Urban Crisis continues to be one of the most influential modern studies of American urban history and is without doubt one of the most cited pieces of literature in the field.

Jefferson Cowie and Joseph Heathcott, who together produced Beyond the Ruins: The Meanings of Deindustrialization, built on the historical research of Sugrue by studying the impact of post-war deindustrialization across the nation. Their book seeks to move the conversation from historic decline toward modern solutions for urban decay and economic instability. In doing so, it compiles a collection of essays from historians and other professionals to further explore deindustrialization and its impact on American cities.[9] From this perspective, the authors identify a complex set of causes and effects of urban decline which vary from city to city but share many similarities nationally. The value of this work is in its wide scope. By compiling essays from multiple professionals in a variety of related disciplines, the image of declining cities in the U.S. following WWII becomes clearer than ever.

The most recognized work on post-war deindustrialization in Trenton, New Jersey specifically is historian John T. Cumbler’s A Social History of Economic Decline: Business, Politics, and Work in Trenton. This book outlines a long trajectory of economic conditions in Trenton beginning in the 1920s, with a focus on the Great Depression, and traces the changing nature of the city up until the book’s publication in 1989. One of Cumbler’s main arguments is that America experienced a gradual economic shift from civic to national capitalism following the Great Depression, which empowered large corporations while simultaneously destroying the small businesses that held many industrial cities together.[10] He also explores the rich history of the city’s most impactful industries, politicians, union leaders, and manufacturing workers to provide a comprehensive view of Trenton’s economic and social decline. This work provides the foundation of historical knowledge on Trenton required to produce further research on this topic. However, Cumbler’s history of Trenton does not extend as far into the social consequences and effects of deindustrialization as one might expect. Nevertheless, virtually all modern historical literature on the city of Trenton cites this work, which points to Cumbler’s enduring credibility as a historian and shows the importance and relevance of his arguments to the continued study of the city’s history.

More recent historical literature on related topics has largely focused on national trends of suburbanization and racial conflict. One such journal article, “The Rural Past-in-Present and Postwar Suburban Progress” by University of Waterloo professor Stacy Denton, studies the shift toward suburbanization following WWII. The author highlights the transformation of previously rural spaces into suburban landscapes and the implications of such transformations for national attitudes and beliefs about race, culture, and class.[11] In a similar light, economic historian Leah Platt Boustan’s 2007 work “Black Migration, White Flight: The Effect of Black Migration on Northern Cities and Labor Markets” studies the effects of the Great Migration on northern cities and their economies. She also examines the racist attitudes of northern whites, which manifested themselves in movements out of increasingly diversifying cities and into the surrounding suburbs as part of a process termed “white flight.”[12] Both of these works are incredibly valuable to this study of post-war Trenton, for the topics and findings of their research address some of the greatest effects of deindustrialization on the city.

The research done in this paper synthesizes the secondary source material on the decline of U.S. cities and applies its findings to a specific case study of Trenton, New Jersey. In doing so, it paints a clearer picture of the more immediate social and economic effects of deindustrialization on the city in the decades following WWII. This adds to the historiography of urban history and of Trenton by compiling primary and secondary source documents to more deeply understand the major effects of deindustrialization and economic transformation on the city. These major effects include economic destabilization, massive suburbanization, and increased racial tension. These symptoms of deindustrialization were felt most harshly by the city’s poor ethnic-white and growing black populations. More specifically, economic decline in Trenton coincided with the arrival of black migrants, which compounded racist attitudes and practices within the city. This is most clear in the workplace and housing segregation which new migrants had to face upon their arrival.

Industry leaving Trenton following WWII radically changed the city’s local economy. Unionized factory jobs became harder to attain, poor residents were left with fewer options, and Trenton’s growing black community was segregated in its employment. Long-time union workers, like those in the pottery and steel plants, found themselves in an unfamiliar situation. As Cumbler explained, “Those workers thrown out of work by plant closings had the hardest time finding work and represented the largest number of Trenton’s unemployed.”[13]

The selling of corporations like Roebling’s Sons produced a much weaker focus on the city’s manufacturing growth and output; instead, large corporations sought the relocation of facilities and workers outside the city. This left the existing workforce in the city out to dry and decreased options for employment, especially among the lower-income white and minority black populations.

One action taken by the state and local government to fill the gap created by fleeing industry was growth in the employment of state workers and other public jobs. New Jersey state workers were centralized in the capital city of Trenton in the 1950s and 60s, as they still are today. Cumbler described this shift from manufacturing to public work as one from “Blue Collar to White Collar and White Smock.”[14] This provided some relief to the city’s unemployment problem, which exceeded the national average through the 1950s and 60s, but it did not come close to meeting the pay and benefit standards that manufacturing jobs had provided just a decade prior. Additionally, the state workers employed at this time were disproportionately white men. Despite these changes, public and state employment was not enough to lift the city out of its economic slump, nor could it resolve the city’s inherent issues with workplace discrimination.

A large part of the story of economic destabilization in Trenton as a product of deindustrialization was its negative consequences for the city’s black community. Former Trentonian and author Helen Lee Jackson published an autobiography in 1978 charting her experience with racial discrimination as a black woman seeking meaningful employment in the city. Her description of Trenton reads as follows:

In 1940, Trenton was an industrial city with many potteries. Steel mills, factories, and a large auto plant, but the production lines were almost solidly white. Black men swept the floors, moved heavy equipment and shipping crates, and performed other burdensome tasks. In the business sections, they were almost invisible except as window cleaners, janitors, or elevator operators. There were no black salespeople in the stores, banks, or business offices. They were hired as maids, package wrappers, or seamstress. Even the five-and-ten-cent stores refused to hire blacks, except to sweep, dust, or move stock.[15]

Jackson’s firsthand experience with racial segregation and inequality in the city in the 1940s reflects the racial attitudes and prejudices of Trenton and other northern cities earlier in the 20th century. Racist attitudes toward black migrants, who largely came from the South, were characteristic of many industrial cities in the U.S. at this time, as Sugrue’s work on Detroit and other rust belt cities highlights. With greater numbers of black migrants entering northern cities, the problem of racial discrimination and inequality intensified, and the competition for jobs in short supply fueled racist attitudes. According to Sugrue, a combination of factors including employer bias, the structure of the industrial workplace, and the overarching ideologies of racism and black inferiority contributed to this workplace segregation.[16] For Trenton, these differences in employment were visible to the observer and significantly impacted the lives of those seeking stable income. With the collapse of industry happening simultaneously with a dramatic increase in the city’s black population, this problem compounded. Black residents were not only excluded from whatever factory jobs were left on the basis of their race but were also labeled as the source of the city’s problems altogether.

In a 1953 study of community services in Trenton, researchers found that the average black resident experienced twice as much unemployment and earned on average 30% less total income than the average white person, despite only a one-year difference in their average acquired education.[17] These statistics are evidence of income inequality and workplace discrimination and provide insight into the lived experiences of black people in Trenton at this time. Furthermore, research from The Journal of Economic History suggests “black workers were channeled into negro jobs and faced limited opportunities for promotion.”[18] Access to financial resources and meaningful employment were among the largest reasons for black migration to Trenton and other northern cities. Upon their arrival, however, migrants were met with egregious workplace discrimination and were given very few opportunities to climb the economic ladder. Black women specifically made up “the least utilized pool of potential industrial labor power having much less than proportionate representation with her white counterpart,” according to a 1950s study titled The Negro in the Trenton Labor Market.[19] Many black women, including Helen Lee Jackson, struggled even more than black men to find employment within the city. These conditions forced economically disadvantaged men and women alike to scramble for jobs and income in order to support themselves and their families.

Changes to the manufacturing economy and workplace discrimination created great instability in Trenton during the 1950s and 60s. Old union workers were suddenly left jobless, and the fruits of their loyal labor to the city’s largest industries were now gone. Attempts to revitalize the economy largely failed, and economic decline impacted the poor and minority black population of the city more harshly than anyone else in the form of unequal pay and limited job opportunities. With this knowledge, it becomes clear that deindustrialization and the exodus of industry destroyed the economy of Trenton, which had been historically forged by large-scale manufacturing and robust labor unions, and disproportionately affected the new and growing black community.

Another major consequence of postwar deindustrialization on America’s rust belt cities was the creation of, and migration to, the suburbs: the areas where urban centers like Trenton, NJ extended into previously rural environments, where new housing developments, industries, and townships filled with greater and greater numbers of former city-dwellers. Historian Kenneth T. Jackson’s work on suburbanization, Crabgrass Frontier: The Suburbanization of the United States, provides the best historical analysis of this phenomenon, which swept the nation in the 20th century. Among many important factors, he claims that the roots of suburbanization can be traced to the boom of the automobile industry in the 1920s, which enabled those who could afford it to move farther and farther away from the cities in which they worked. Jackson states, “Indeed the automobile had a greater spatial and social impact on cities than any technological innovation since the development of the wheel.” He goes further to explain, “After 1920 suburbanization began to acquire a new character as residential developments multiplied, as cities expanded far beyond their old boundaries, and as the old distinctions between city and country began to erode.”[20]

For Trenton, NJ, this shift toward the suburbs was gradual, beginning in the 1920s and peaking during the 1950s. It is important to note that suburbanization in Trenton, as in cities across the nation, continued gradually into the late 20th century and coincided with a decline in major industries and jobs. Historical research on suburbanization has also revealed that many white suburbanites moved to the suburbs to create a physical barrier between themselves and their racial counterparts.[21] As a result of these factors, thousands of residents with the financial freedom to do so began moving into the towns on the periphery like Hamilton, Ewing, and Lawrence, and many of them continued to work as state workers or in other capacities inside Trenton while living outside the city. These towns saw unprecedented growth in housing developments in the post-WWII years thanks to VA and FHA loans, which were granted to veterans of the war under programs established during Franklin Delano Roosevelt’s administration.[22] It is important to note that these programs were especially beneficial to white service members, and much historical literature has been written about the exclusionary practices associated with housing loans in relation to African Americans. This is relevant because, during and shortly after WWII, the largest wave of black migrants traveled from predominantly southern states to Trenton and other northern cities in search of employment opportunities associated with the mobilization of industry toward the war effort. This search for opportunity overlapped with the decay of Trenton’s largest industries, leaving many black migrants below the poverty line, working menial jobs as opposed to fruitful unionized jobs, and in some cases out of work completely. Compounding these issues was the inaccessibility of reasonable home loans for members of the black community.

The effects of suburbanization on the local economy of Trenton and its inhabitants can be seen through analysis of the popular media. Pride Magazine was a Trenton-based publication which centered its content on black businesses and black business owners. The magazine concerned itself with the failure of local politicians to enact positive change in the form of urban renewal plans targeted at improving the infrastructure, housing, and employment opportunities within the city. In March of 1972, Pride Magazine issued a publication titled “Black Businesses Need Your Help!” which featured a section written by the magazine’s publisher, Vance Phillips, who received his college education in Trenton. He wrote, “What are we doing to fill the vacuum of the cities which was created by relocation of the established business.” He then went on to say, “After spending 5 years of planning and developing new programs for structural and economic changes, Trenton Model Cities program has failed to meet the potential growth of new and old businesses in our community.”[23] Phillips, like many black Americans living in Trenton during the 1970s, saw visible signs of the city’s decline in the failure of local businesses. He believed that what was needed to fix this problem was a stronger government response along with increased civic action, specifically from the black community.[24]

In this same publication, Phillips expressed his belief that “a person who lives within the city should have preference over persons living outside of the cities in terms of employment.”[25] Here the author was addressing those who lived in the surrounding suburbs but continued to fill jobs within the city limits. This would have been a popular message among Trenton’s black business-owning population because of the negative effects that rapid suburbanization had on small businesses within the city. In this article, Phillips touches on a number of topics extremely relevant to this study. For one, he observed that the instability of small businesses in the wake of mass suburbanization was largely due to the relocation of both industry and people outside the city. Mostly ethnically white Trentonians were leaving the city for the suburbs and taking their spending power with them. With population decline being spearheaded by movement to the suburbs, there simply was not enough money circulating throughout the city to adequately support the small businesses that propped up its local economy.

Phillips’s passage also highlights that as most of Trenton’s workforce shifted into the surrounding suburbs, so too did its voting power.[26] This left the black communities who resided within the urban center even more powerless, as a minority, to change their own political environment. Suburbanization brought with it a massive decrease in the city’s population and tax base. The city, which had previously counted more than 100,000 residents, had only around 80,000 inhabitants by 1970.[27] This rapid population decrease meant that the tax revenue generated was not enough to effectively grapple with the issues facing the economy and the evolving workforce.

Furthermore, the local culture within the city, which had been forged by America’s largest waves of European immigration in the 19th and early 20th centuries, suffered as a result of deindustrialization and suburbanization. Many of the small businesses and social institutions which had historically characterized the city of Trenton were established by first- and second-generation Italian, Irish, Polish, and Hungarian immigrants, many of whom had traveled from the larger cities of New York and Philadelphia to find industrial jobs in Trenton. Dennis J. Starr’s book, The Italians of New Jersey, outlines the effects of suburbanization on the “old immigrants” of New Jersey, stating:

The movement to the suburbs and smaller urban places paralleled a major transformation of the state’s urban political economy. Following the war, the state’s largest cities did not participate in the postwar prosperity and economic development. Instead, their industrial bases eroded, their mercantile bases moved to suburban shopping malls and their overall, especially affluent white, populations shrank.[28]

The effect of suburbanization on the local culture of Trenton’s longest-serving residents is a source of some historical debate. Cumbler notes that “despite suburbanization of the more successful Italians and Slavs, many of Trenton’s ethnic neighborhoods seemed as entrenched as ever in the 1950s.”[29] However, the decades that followed would see even more of Trenton’s staple “old immigrant” communities relocating to the suburbs and taking their cultural values and traditions with them. That being said, the cultural diversity of Trenton, New Jersey, created by its history as an ethnic melting pot, can still be felt today in 2023. Walking the streets of some of its most popular neighborhoods, like Chambersburg, one can still see and feel the Italian influence in the churches, social clubs, and bar-restaurants of the area. The main point here is that local culture did suffer as a result of suburbanization and population decline, but it did not die; rather, it faded into a less obvious and less present version of its former self.

Looking at suburbanization as a major effect of postwar deindustrialization on the city of Trenton provides valuable insight into the city’s rise and decline as a manufacturing powerhouse. Like many other rust belt cities of this time period, Trenton saw the trend of suburbanization cause unprecedented changes to its local economy and demographics. The loss of unionized industry jobs encouraged many Trentonians to relocate to the surrounding towns, which had recently seen great increases in housing development. In the process, those who left unintentionally left Trenton out to dry. Money from the pockets of those who moved to the suburbs was desperately needed to support small businesses in the city, and their tax dollars could have been used to make meaningful change to the city’s failing infrastructure. As previously discussed, the local culture of the city also suffered as a result of these consequences, which only compounded with each decade of further suburbanization and relocation away from the city. With a decreasing population, an aging workforce, and a new wave of migrants without sufficient employment opportunities, the city declined into an unrecognizable version of its “golden age” of the 1920s.

Trenton’s deindustrialization and its history of racism and inequality are inextricably linked. In 1986, historian Dennis J. Starr published History of Ethnic and Racial Groups in Trenton, New Jersey: 1900-1960, which stands as one of the most important pieces of historical literature on Trenton race relations. This research clearly establishes a link between deindustrialization and increased racial tensions by claiming:

As industries closed down or reduced their work force it became harder for Afro-American migrants to get a toe hold on the traditional ladder of social mobility–a factory job. Meanwhile the city’s sizable Italian, Polish and Hungarian communities became fearful lest their jobs be eliminated, their neighborhoods integrated. A siege mentality developed in light of the population shifts and exodus of industries, commercial businesses, colleges and government offices.[30]

This “siege mentality” was amplified over time by the overcrowding of black communities in Trenton and the extension of black-owned or rented residences into shrinking ethnically white neighborhoods.

Between 1950 and 1960, Trenton’s black population rose to 22.8 percent of the total population. As discussed earlier, Trenton was a historically segregated city, but in the 1950s and 60s this racial division took on a new character given the increases in population and the decreases in economic opportunities and industry.[31] Trenton historian Jack Washington described Trenton following WWII by stating, “That the 1950s was a period of benign neglect for the Black community is an understatement, for Black people were forgotten while their economic and political troubles continued to mount.”[32] These economic troubles can be seen most clearly through examination of housing segregation in the city and its continued influence on the lives of Trentonians. Along with housing and workplace discrimination, ethnically white residents used black migrants as scapegoats for the city’s economic misfortunes and decline.

Housing in Trenton, NJ in the postwar years can be characterized as both segregated and worse for wear. Following the largest influx of black migrants to the city in the late 1940s and early 50s, this new population was largely forced to live in the Coalport and Five Points areas in the city’s interior.[33] Housing opportunities for black residents were few and far between and were in most cases aged and deteriorated. Starr shed light on this inequality, revealing, “By 1957 over 80 per cent of the city’s housing was over 50 years old and 20 percent of all housing units were dilapidated or had deficient plumbing.”[34] This was a problem for all city-dwellers and stood as a marker of the city’s decline following deindustrialization. For the black community, this problem was especially acute given that the neighborhoods with the worst physical damage and infrastructure were the areas in which they settled. A 1950s survey of the city titled Negro Housing in Trenton found that “the percentage of substandard housing among the Negro population is four times higher than that for the general population.”[35] Black Trentonians were limited not only in their occupations but also in the location and quality of their housing. This same study concluded that 1,200 new residential spaces would have to be erected in order to meet the needs and standards of the city. These spaces were not created, and public housing efforts did not meet the requirements of the new, growing population.[36]

With few options for housing, a lack of policy action to create new housing, and a growing population, black migrants had no choice but to expand into Trenton’s old ethnically white neighborhoods. In the eyes of many in the white majority, black migrants were the corrupting force bringing down their beloved city. Declining social and economic conditions in the city paired with old racist tendencies to produce conflict between ethnic groups. Cumbler eloquently explains this clash:

The decline of their industrial base narrowed the boundaries of choice for both white and black Trentonians, and in doing so it intensified conflict between them. Increasingly, Trenton’s problems became defined by the city’s white residents in terms of growth of its black population. Actually, its problems had other sources: the loss of its tax base with the closing down of factories, dilapidation of the existing housing stock, and the declining income of its citizens of whatever color.[37]

This excerpt captures the situation in Trenton during the 1950s and 60s in terms of both race relations and the overall decline of the city. Racist attitudes were not new in Trenton, but they were compounded by the arrival of large numbers of black migrants. From the white perspective, black migrants were aiding in the destruction of the city. From the black perspective, Trenton failed to provide the resources they had traveled north to find in the first place.

The 1960s and the Civil Rights era marked the boiling point for racial tensions and division in Trenton. The influence of the NAACP and other organizations for the advancement of racial equality, along with intense riots, brought race and class to the forefront of Trenton’s post-industrial issues. Most significant were the race riots that exploded in Trenton in early April of 1968 following the assassination of Dr. Martin Luther King Jr. These riots lasted for multiple days and resulted in fires around the city as well as over 7 million dollars in damage to over 200 different businesses. During the chaos, around 300 mostly young black men were arrested by Trenton Police. The devastating damage to the downtown section of the city caused many to flee and abandon it altogether in the years that followed.[38] It would be unfair to say that these riots were a direct result of deindustrialization in postwar Trenton. However, the city’s history of racial inequality and the compounding racial tensions produced by deindustrialization created fertile ground for public outrage. The assassination of Dr. Martin Luther King Jr. served as the catalyst for the riots, but the broader history of discrimination and inequality in Trenton suggests a decades-long buildup to the events that unfolded in April of 1968.

Trenton’s rise and fall as an American industrial city is a fascinating case study of the postwar era in U.S. history. What was once a manufacturing powerhouse along the Delaware River, strategically placed between the two large cities of New York and Philadelphia, was reduced to a shadow of its former glory by the 1950s and 60s. The causes of this decline can be found in the movement of industry away from the city after the war effort, and signs of economic decline can be traced as far back as the 1920s. The effects of this shift, however, remain the most significant in the broader history of the city. Rapid deindustrialization meant that wages and opportunities were significantly limited for all Trentonians, but especially for its segregated black community. Many of those who could afford it elected to move to the surrounding suburbs, bringing with them their tax dollars, their votes, and their culture. Lastly, deindustrialization and the consequences of a radically transformed Trenton increased racial tensions in the form of housing and workplace discrimination.

These effects offer new insights into the Trenton of today. Trenton now has a black majority, and, tellingly, the same interior neighborhoods that housed black migrants in the 1950s remain, in 2023, sites of high unemployment and limited opportunity. Walking the streets of Trenton, one is quickly reminded of its rich history, with many of its houses and abandoned factories still standing as reminders of the city’s complicated past. A hopeful reading is that a greater understanding of Trenton’s postwar history could provide the insight needed to create better living conditions and opportunities for all its residents. For now, however, Trenton remains a city in an intense state of recovery from its industrial past. Historical research has shown that urban renewal plans have largely failed to revitalize the city’s economy in the 20th and 21st centuries, and issues such as crime, poverty, drug abuse, and poor infrastructure continue to loom over the once prosperous city.

Today, the “Trenton Makes, The World Takes” sign on the Lower Trenton Bridge still stands bright, but its meaning has drastically changed since the last century. What was once a beacon of promise and stability is now a constant reminder of how far the city has fallen from its industrial and manufacturing heights.

Upon completing this research paper on Trenton, I gave a lesson to high school world history students at Ewing High School as part of my undergraduate co-teaching fieldwork. Ewing borders the city of Trenton and was one of the most popular destinations for suburbanites who left the city in the 20th century, at least in part because of deindustrialization and the city’s overall decline. The proximity of the topic and the students’ familiarity with well-known street names, businesses, and buildings in the city created a feeling of relevance that sparked engagement. Students were surprised to be learning about a topic so close to home, and they responded with passionate discussion and meaningful connections developed through a mix of group and whole-class conversations.

For social studies teachers, this successful shift from world history topics to a more grassroots approach to teaching local history can serve as a template for future lessons. Topics frequently come up during different units throughout the school year that relate deeply to the local history of wherever students go to school. For Ewing students, Trenton’s decline as an industrial city related directly to their lived experiences; many of my students had lived in or around Trenton for most of their lives. Teaching local history is neither overwhelming nor impractical. The same effort it takes to create a lesson in a world history or AP class can be channeled into research on one’s own local environment and history.

This template for teaching local history can generate a kind of classroom engagement unlike that of any other topic. Once students are given the opportunity to learn and ask questions about their own town, city, or home, they begin to view the world through a more historical lens, which is the goal of many, if not all, high school social studies teachers. Overall, my experience with this approach was overwhelmingly positive, and I encourage any and all educators to shift their focus, for at least one day of the year, toward exploring their own local history and connecting it to larger themes within our discipline.

“Black Businesses Need Your Help!” Pride Magazine, March 1972. Trenton Public Library. https://www.trentonlib.org/trentoniana/microfilm-newspapers/

Dwyer, William. This Is The Task: Findings of the Trenton, New Jersey Human Relations Self-Survey. Nashville: Fisk University, 1955.

Lee, Helen J. Nigger in the Window. Library of Congress, Internet Archive 1978.

Negro Housing in Trenton: The Housing Committee of the Self Survey. Trentoniana Collection, Trenton Public Library. Ca. 1950.

“Negro in the Trenton Labor Market,” Folder: Community Services in Trenton, Box: Trenton Council on Human Relations, Trentoniana Collection, Trenton Public Library.

“Study of Community Services in Trenton,” Folder: Community Services in Trenton, Box: Trenton Council on Human Relations, Trentoniana Collection, Trenton Public Library.

Trenton Council of Social Agencies. Study of Northeast Trenton: Population, Housing, Economic, Social and Physical Aspects of the Area. Folder: Study of Northeast Trenton, Box 1: African American Experience, Trentoniana Collection, Trenton Public Library, 1958.

Boustan, Leah Platt. “Black Migration, White Flight: The Effect of Black Migration on Northern Cities and Labor Markets.” The Journal of Economic History 67, no. 2 (2007): 484–88. http://www.jstor.org/stable/4501161.

Cowie, Jefferson, and Joseph Heathcott, eds. Beyond the Ruins: The Meaning of Deindustrialization. Ithaca: Cornell University Press, 2003.

Cumbler, John T. A Social History of Economic Decline: Business, Politics, and Work in Trenton. New Brunswick: Rutgers University Press, 1989.

Denton, Stacy. “The Rural Past-in-Present and Postwar Sub/Urban Progress.” American Studies 53, no. 2 (2014): 119–40. http://www.jstor.org/stable/24589591.

Division of Labor Market and Demographic Research. New Jersey Population Trends 1790 to 2000. Trenton, NJ: New Jersey State Data Center, August 2001.

Gibson, Campbell. Population of the 100 Largest Cities and Other Urban Places in the United States: 1790–1990. Washington, D.C.: U.S. Bureau of the Census, 1998.

Jackson, Kenneth T. Crabgrass Frontier: The Suburbanization of the United States. Oxford University Press, 1985.

Leynes, Jennifer B. “Three Centuries of African-American History in Trenton.” Trentoniana Collection. Trenton Historical Society. 2011.

Starr, Dennis J. “History of Ethnic and Racial Groups in Trenton, New Jersey, 1900-1960,” Trentoniana Collection. 1986.

Starr, Dennis J. The Italians of New Jersey: A Historical Introduction and Bibliography. Newark, NJ: New Jersey Historical Society, 1985.

Strangleman, Tim, James Rhodes, and Sherry Linkon. “Introduction to Crumbling Cultures: Deindustrialization, Class, and Memory.” International Labor and Working-Class History, no. 84 (2013): 7–22. http://www.jstor.org/stable/43302724.

Sugrue, Thomas J. The Origins of the Urban Crisis: Race and Inequality in Postwar Detroit. Rev. ed. Princeton: Princeton University Press, 2005. First published 1996.

Washington, Jack. The Quest for Equality: Trenton’s Black Community 1890-1965. Africa World Press, 1993.


[1] Jack Washington, The Quest for Equality: Trenton’s Black Community 1890-1965, Africa World Press, 1993, 56.

[2] John T. Cumbler, A Social History of Economic Decline: Business, Politics, and Work in Trenton, (New Brunswick: Rutgers University Press, 1989), 9.

[3] Division of Labor Market and Demographic Research, New Jersey Population Trends 1790 to 2000 (Trenton, NJ: New Jersey State Data Center, August 2001), 23.

[4] Tim Strangleman, James Rhodes, and Sherry Linkon, “Introduction to Crumbling Cultures: Deindustrialization, Class, and Memory.” International Labor and Working-Class History, no. 84 (2013), 19.

[5] Cumbler, A Social History, 132-133.

[6] Campbell Gibson, U.S. Bureau of the Census: Population of the 100 Largest Cities and Other Urban Places in the United States: 1790 – 1990, (Washington D.C.: U.S. Bureau of the Census, 1998)

[7] Division of Labor, New Jersey Population Trends, 26.

[8] Thomas J. Sugrue, The Origins of the Urban Crisis: Race and Inequality in Postwar Detroit, rev. ed. (Princeton: Princeton University Press, 2005; first published 1996), 128.

[9] Jefferson Cowie and Joseph Heathcott, eds., Beyond the Ruins: The Meaning of Deindustrialization (Ithaca: Cornell University Press, 2003), 1-3.

[10] Cumbler, A Social History, 93-95.

[11] Stacy Denton, “The Rural Past-in-Present and Postwar Sub/Urban Progress,” American Studies 53, no. 2 (2014): 119.

[12] Leah P. Boustan, “Black Migration, White Flight: The Effect of Black Migration on Northern Cities and Labor Markets,” The Journal of Economic History 67, no. 2 (2007): 484-485.

[13] Cumbler, A Social History, 147-148.

[14] Cumbler, A Social History, 145.

[15] Helen J. Lee, N—-r in the Window, Library of Congress, Internet Archive 1978, 131.

[16] Sugrue, Urban Crisis, 93-94.

[17] “Study of Community Services in Trenton,” Folder: Community Services in Trenton, Box: Trenton Council on Human Relations, Trentoniana Collection, Trenton Public Library, 8.

[18] Leah P. Boustan, “Black Migration, White Flight,” 485-486.

[19] “Negro in the Trenton Labor Market,” Folder: Community Services in Trenton, Box: Trenton Council on Human Relations, Trentoniana Collection, Trenton Public Library, 33-34.

[20] Kenneth T. Jackson. Crabgrass Frontier: The Suburbanization of the United States, Oxford University Press, 1985, 188.

[21] Stacy Denton, “The Rural Past-in-Present,” 119.

[22] Cumbler, A Social History, 139.

[23] “Black Businesses Need Your Help!,” Pride Magazine, March 1972, 5.

[24] “Black Businesses,” Pride Magazine, 6.

[25] “Black Businesses,” Pride Magazine, 6-7.

[26] Black Businesses, Pride Magazine, 6-7.

[27] Gibson, U.S. Bureau of the Census, 43.

[28] Dennis J. Starr, The Italians of New Jersey: A Historical Introduction and Bibliography, New Jersey Historical Society, Newark, NJ 1985, 54.

[29] Cumbler, A Social History, 148-150.

[30] Dennis J. Starr, “History of Ethnic and Racial Groups in Trenton, New Jersey, 1900-1960,” Trentoniana Collection, 1986, 16-17.

[31] Cumbler, A Social History, 153.

[32] Washington, The Quest for Equality, 136.

[33] Trenton Council of Social Agencies, Study of Northeast Trenton: Population, Housing, Economic, Social and Physical Aspects of the Area, Folder: Study of Northeast Trenton, Box 1: African American Experience, Trentoniana Collection, Trenton Public Library, 1958, 53-54.

[34] Starr, Ethnic and Racial Groups in Trenton, 15.

[35] Negro Housing in Trenton: The Housing Committee of the Self Survey, Trentoniana Collection, Trenton Public Library, ca. 1950s, 63.

[36] Negro Housing, Housing Committee, 67.

[37] Cumbler, A Social History, 156.

[38] Jennifer B. Leynes, “Three Centuries of African-American History in Trenton,” Trentoniana Collection, Trenton Historical Society. 2011, 3-4.


Forgotten Trails: Unmasking the Legacy of Native American Removal and its Contemporary Implications

Once, in the vast and untamed lands of what is now known as the United States, there thrived a multitude of Native American communities. These diverse and vibrant nations had cultivated rich cultures, deep-rooted traditions, and an intricate understanding of their surroundings. As the 19th century unfolded, however, a dark cloud loomed over these indigenous peoples. In the late 19th century, following a series of conflicts and broken treaties, Native American communities faced complete forced removal from their ancestral lands. Government policies aimed at assimilation and expansion systematically uprooted these communities, displacing them from their homes and severing their ties to their traditions, and 1890 marked a turning point in this history of forced removal. This displacement was not an isolated event but rather part of a broader pattern of marginalization that had persisted for centuries and continues to persist. Yet, despite its undeniable significance, this chapter of American history has largely been forgotten or intentionally overlooked.

The historical marginalization and lack of mainstream attention to the forced removal of Native American communities in U.S. history after 1890 has had profound effects on their social, economic, and political development in contemporary society. This study aims to explore how this neglect and amnesia surrounding the forced removals have contributed to ongoing disparities, underrepresentation, and challenges faced by Native Americans. By relegating this significant chapter of American history to obscurity, society unintentionally perpetuates the cycle of neglect and underrepresentation experienced by Native Americans. The absence of acknowledgment and understanding of the removal policies and their consequences has hindered the recognition of indigenous rights, cultural contributions, and the unique challenges faced by these communities. This research seeks to shed light on this historical oversight and highlight its implications for present-day disparities within Native American communities. By recognizing the impact of historical marginalization, it becomes possible to address current challenges effectively and foster development within these marginalized communities. Through an exploration of relevant literature, primary sources, and historiography, this research will provide a comprehensive understanding of how historical amnesia has shaped the experiences of Native Americans today. By uncovering the underlying causes of ongoing disparities, underrepresentation, and challenges they face, this study aims to contribute to broader efforts towards achieving equity and justice for Native American populations.

The study of the removal of Native Americans after 1890 has long been approached from various perspectives, often reflecting prevailing societal attitudes and biases. Traditional approaches to this topic have tended to focus on a few main ideas, namely the notion that Native Americans desired urbanization and the belief that non-Native Americans were providing assistance in their transition. One common argument put forth by traditional studies is that Native Americans willingly sought relocation to urban areas. Proponents of this perspective suggest that indigenous communities recognized the benefits of modernity and sought opportunities for economic advancement through urbanization.
Another idea frequently emphasized in traditional approaches is the assumption that Native Americans were uneducated or culturally deficient compared to non-Native Americans. This perspective suggests that native cultures were inherently inferior and needed intervention from more advanced societies to progress. Consequently, it portrays non-Native American efforts as benevolent attempts to elevate indigenous populations through education, religious conversion, and exposure to Western technologies. In these traditional interpretations, non-Native American involvement was often depicted as an act of assistance rather than forced displacement. Advocates argue that government policies such as the Dawes Act of 1887, which aimed at breaking up tribal landholdings into individual allotments, were well-intentioned steps toward promoting private property ownership among Native Americans. Similarly, boarding schools designed to eradicate indigenous languages and cultural practices were presented as educational endeavors meant to “civilize” Native American children.

Lastly, a final common approach in the broader study of Native Americans is the assumption that Native American history stopped after 1890. Traditional approaches have often treated Native American history as if it came to a standstill after the infamous Wounded Knee Massacre of 1890, perpetuating a skewed and incomplete narrative. This historical tunnel vision neglects the rich and complex tapestry of Native American experiences and contributions beyond that point. It wrongly reinforces the notion that Native Americans exist solely in a historical context, overlooking their vibrant and evolving cultures, traditions, and communities. This approach inadvertently marginalizes contemporary Native voices and their ongoing struggles, creating an inaccurate portrayal of their identity and relevance in modern America.

Overall, the approaches described above are problematic because they contribute to historical amnesia surrounding the removal of Native Americans by perpetuating a narrative that downplays the systemic injustices and challenges faced by Native communities during the process of urbanization and relocation. These traditional approaches tend to obscure the agency and resistance of Native Americans, portraying them as passive actors who willingly embraced modernity and external intervention. By emphasizing the supposed benefits of urbanization and the alleged cultural deficiencies of Native cultures, these narratives silence the historical reality of forced displacement, loss of land, and the violation of treaties. They fail to acknowledge the broader context of Native American history, including Native Americans’ resilience and their efforts to preserve their cultures in the face of relocation and its effects.

With all of these ideas in mind, imagine having your land taken away, your culture suppressed, and your way of life disrupted. This is the harsh reality that Native Americans faced following the tumultuous period of removal and relocation, particularly after 1890. As the dust settled on a nation rapidly expanding westward, it became increasingly clear that indigenous communities were bearing the brunt of this progress. Following a period marked by forced removals and relocations, these indigenous peoples found themselves grappling with an array of disparities that persisted long after their displacement. To begin this study, we will delve into the disparities experienced by Native Americans as a consequence of the forced removal and relocation policies implemented during the late 19th century.

The late 19th century marked a pivotal period in the history of Native Americans in the United States, a time when government policies and actions began to create enduring disparities within indigenous communities. At the forefront of these policies was the Dawes Act of 1887, legislation with far-reaching consequences. Passed with the aim of assimilating Native Americans into American society, the Dawes Act symbolizes the significant inequities and unjust policies imposed on them. Through its allotment system, the legislation had devastating consequences for indigenous groups, stripping away tribal lands essential for survival, cultural practices, and economic stability.[1] Left with inadequate land resources, many families suffered economic hardship that led to loss or dispossession over time. Traditional languages and customs were disrupted through mandatory enrollment in boarding schools designed to wipe out native identities entirely. The act also crippled governance structures within tribes, complicating efforts by native communities to advocate for their rights. These harms manifest today as disparities experienced daily by Native Americans, including persistently high poverty levels, lack of access to quality healthcare and education, and political underrepresentation, all of which are core legacies of the Dawes Act.

The Dawes Act’s repercussions would later shape the relocation policies of the 1950s. One of its main outcomes was the relinquishment of valuable tribal lands, frequently transferred to non-Native settlers, which limited Native Americans’ access to their traditional domains. This deprivation contributed significantly to the economic challenges faced by numerous indigenous communities for years afterward. Decreased land ownership left tribal economies struggling and made Native Americans vulnerable to hardship. The Dawes Act and the relocation policies of the 1950s both had assimilation as a central concept: the former sought it through land ownership while the latter aimed for urbanization, but their underlying goal was the same, making Native Americans conform to the ideals of mainstream American society. This reflected how federal authorities wanted to alter identities and lifestyles within Indigenous communities at that time. In effect, the Dawes Act set the stage for the economic fragility and property deprivation that led some policymakers to see urban relocation as desirable in the 1950s. The act’s removal of land from Native Americans and its interference with their customary ways of life established a foundation of inequality and hardship that made these communities easier targets for later initiatives promoting urbanization and relocation.

In the 1950s, Native Americans were coerced into relocating to urban areas in the name of economic self-sufficiency and assimilation into mainstream society. Commissioned by Bureau of Indian Affairs commissioner Dillon S. Myer, the relocation program was launched with the aim of moving reservation-based Native Americans to urban environments, with promises of educational and occupational opportunities, transportation services, housing provisions, and everyday necessities. Although this lured over thirty thousand participants, inadequate funding led to poor execution, which left many relocatees facing inferior living conditions and gender-segregated, low-level jobs that eventually forced them back home.[2] Despite these shortcomings, it cannot be ignored that some relocated Native Americans thrived in cities, securing upward socioeconomic mobility by proactively organizing and establishing themselves. Some of these Native Americans were then able to advocate for better livelihoods on reservations, but this was not common. Ultimately, the relocation policies of the 1950s failed to fulfill their objectives: many individuals lacked the necessary skills for city life because recruitment emphasized quantity over quality. Consequently, relocatees experienced racial discrimination and limited job opportunities while residing in low-income neighborhoods, and the meager benefits of relocation favored those with prior job expertise.[3] This historic instance highlights the disparities created by government policies that lacked adequate support, which eventually led Native Americans to develop pan-Indian social institutions amid harsh living conditions. These occurrences are consistent with the broader historical experience of Native Americans in urban environments, illustrating complexities that have long been overlooked in the development of these regions.

After the failure of the relocation program and the mounting problems caused by its attempt to force urbanization onto Native Americans, the disparities they faced only increased. Following the relocation policies of the 1950s, the 1960s brought new hope to Native Americans with the emergence of the Civil Rights movement. Despite the promises of social and political change during the Civil Rights era, however, Native American communities continued to face significant challenges. The termination policy, which aimed to assimilate Native Americans into mainstream society, led to the loss of tribal sovereignty and the dispossession of lands. This policy resulted in economic instability and the erosion of traditional cultural practices. Additionally, the forced relocation of many Native American families from reservations to urban areas disrupted their social fabric and often led to poverty and social marginalization. These challenges and disparities did not end even as changes came about for other minority groups, as is evident in the “Longest Walk” protest. In the 1970s, Native American activists staged a protest in Washington, D.C. called the “Longest Walk,” which brought to light the longstanding disparities faced by their communities. These inequalities were largely shaped by governmental policies and proposed legislation that threatened fundamental rights such as land ownership and access to water and fishing resources, and that contemplated the alteration of treaties or the elimination of reservation systems. The protesters understood that these legal provisions were not mere abstractions but were intricately woven into the cultural identity and economic sustenance essential to indigenous peoples’ survival.[4] Even though it was a peaceful demonstration, the protest highlighted many unaddressed issues rooted in historical wrongdoing toward Indigenous peoples. This event serves as evidence of an ongoing struggle against oppression in which multifaceted disparities continue to exist, not only in educational attainment gaps but also in unequal healthcare opportunities, due in large part to race-based discrimination that persists even today. Moreover, the fact that such legislative proposals were being considered as late as the 1970s emphasizes that even in modern times Native Americans grapple with legislative threats that have the potential to perpetuate their marginalization, illustrating that these disparities remain relevant and pressing issues in the present day.

The disparities outlined in this section strongly demonstrate how the neglect and historical amnesia surrounding the forced removals have played a pivotal role in perpetuating the ongoing challenges faced by Native Americans. The Dawes Act of 1887 and the 1950s relocation policies, both driven by the goal of assimilation into mainstream American society, inflicted lasting damage on indigenous communities. These policies resulted in the loss of tribal lands, economic instability, cultural erosion, and social marginalization, creating a foundation of inequality that continues to shape Native American experiences. The subsequent civil rights era did not bring significant relief, as termination policies persisted, further undermining tribal sovereignty and land ownership. The “Longest Walk” protest of the 1970s highlighted the enduring disparities related to land, resources, and cultural identity that continue to plague Native communities. These historical injustices, neglected for so long, have left a lasting imprint, contributing to the disparities in education, healthcare, and political representation still experienced by Native Americans today and underscoring the argument that acknowledging this historical legacy is crucial to addressing these ongoing challenges.

In the previous section, we delved into the significant disparities that Native Americans experience across various domains, including healthcare, education, and socioeconomic status. However, it is important to recognize that these disparities are not isolated incidents but rather part of a larger pattern of underrepresentation faced by Native Americans in contemporary society. This section aims to shed light on this critical issue and explore how Native Americans continue to be marginalized and overlooked within the systems that shape their lives. By examining various aspects of underrepresentation, such as inadequate political representation, limited media visibility, and exclusion from decision-making processes, we can gain a comprehensive understanding of the multifaceted challenges faced by Native American communities today. Through an analysis of these underrepresented perspectives, we can contribute to ongoing efforts toward achieving greater equity and inclusivity for all individuals in our diverse society, including Native Americans.

To start, the history of Native Americans has suffered from a consistent pattern of marginalization and misrepresentation in dominant societal narratives. This has produced widespread misunderstandings, misconceptions, and knowledge gaps concerning the rich cultural heritage that defines each tribe’s unique traditions and experiences. This systemic underrepresentation is not limited to historical matters; it also encompasses the immediate threat to endangered indigenous languages, rituals, and customary practices. There is thus a critical risk of losing the traditional elements integral to Native identity in modern times, making preservation efforts necessary to combat cultural erasure and to safeguard the customs that continue to define those who maintain them today. It is also important to acknowledge that the underrepresentation of Native Americans in modern discourse and media coverage pertains not only to historical injustices but also to the contemporary challenges faced by Indigenous communities. These adversities encompass issues such as poverty, healthcare disparities, and political obstacles, which are often disregarded or downplayed in public discussions. The failure to adequately report on these matters impedes progress toward effective policy changes and support systems for Native American peoples, who continue to suffer from systemic marginalization.

An emblematic example of this broader issue of Native American underrepresentation in the United States is the state of Pennsylvania’s lack of acknowledgment of Native American communities within its borders. The historical denial of the existence of Native Americans in Pennsylvania serves as a noteworthy example of underrepresentation perpetuated by public institutions. This denial means that Native American communities receive no official recognition or acknowledgment, rendering them largely invisible within the state’s records and narratives. The absence of official status places these groups at a disadvantage, lacking the legal rights, resources, and opportunities that come with full acknowledgment, and further contributes to their underrepresentation. Denying their cultural contributions creates an even greater disconnection from history, amplifying this invisibility in public awareness of Pennsylvania’s past.[5] It is also distressing to recount how societal pressure forced many members of Native American tribes in Pennsylvania to conceal their ancestry, leading to the erasure of cultural identity and creating yet another form of ongoing underrepresentation.

Moving forward, we will focus primarily on the political underrepresentation of Native Americans. It is important to understand, however, that the underrepresentation of Native Americans is a multifaceted issue that transcends politics and extends deep into various aspects of American society. While the lack of political representation is a significant concern, it is just one facet of a broader pattern of systemic inequity and marginalization that Native American communities grapple with. Recognizing that underrepresentation is not confined solely to the political arena, it is crucial to adopt a thorough approach that addresses these interconnected issues.

The late 19th and early 20th centuries were a time of significant political transformation for the United States. As the nation grappled with industrialization, urbanization, and the expansion of democratic ideals, various marginalized groups strove to gain representation within the political arena. One group that often remains overlooked in this narrative, however, is Native Americans. Despite their rich cultural heritage, deeply intertwined with the American landscape, Native Americans found themselves systematically excluded from meaningful participation in the political process. In this section we will examine how Native Americans experienced political underrepresentation during this crucial period. By shedding light on this lesser-known aspect of American history, we can better understand the complexities surrounding democracy’s development and confront enduring issues related to Indigenous rights and representation.

To truly understand the development of the intense political underrepresentation of Native Americans, we have to take a step back in time, specifically to 1878, when the Washington Constitutional Convention convened. The Washington Constitutional Convention of 1878 stands as a pivotal moment in American history, particularly concerning the political underrepresentation of Native Americans. Held during a time when the nation was grappling with issues of equality and inclusion, this convention shed light on the deep-rooted injustices faced by indigenous communities. The proceedings not only highlighted the systemic marginalization of Native Americans but also sparked conversations that would shape future legislation and advocacy efforts aimed at rectifying these longstanding disparities. During this era, Native Americans across the United States were consistently denied their basic rights to political participation. Discriminatory policies and practices had effectively silenced their voices and hindered their ability to influence decisions that directly impacted their lives. This disenfranchisement was acutely felt in Washington, where tribal nations faced numerous challenges in asserting their political power.

At the convention, the framers of Washington’s constitution made significant choices that directly impacted Native American political involvement and representation. One such choice was excluding non-citizens from voting, which affected many Natives because their tribal affiliations rendered them ineligible for citizenship. This exclusion prevented a large portion of Native Americans from participating until the Indian Citizenship Act was passed in 1924. Furthermore, although Indigenous representatives were present at the convention, they had no power to vote, so native perspectives were inadequately considered during the drafting of the constitution, contributing to underrepresentation within political processes across the state. In addition, the 1878 constitution confirmed the limited sovereignty of Washington’s indigenous tribes by placing them under strict jurisdiction that could undermine self-governance. Voting restrictions also imposed disproportionate property requirements on natives, impeding fair opportunities for meaningful participation or political representation.[6] Notably, these decisions continue to shape policy and governance today, with these communities still facing challenges in asserting their rightful political rights and maintaining sufficient influence over local affairs.

After the convention of 1878, a significant period of political underrepresentation began in the United States. This era was characterized by a combination of legal, cultural, and socio-political factors that marginalized Native American voices in the national political landscape. As mentioned previously, after the passage of the Dawes Act in 1887, Native American lands were dramatically reduced through allotment, often leading to the loss of tribal communal ownership and self-governance. The imposition of citizenship and land ownership requirements for voting further disenfranchised Native Americans, as many were deemed ineligible to vote because of their tribal affiliations or lack of individual property. For example, various state constitutions, such as North Dakota’s in 1889, introduced clauses demanding that Native Americans sever tribal ties to be eligible to vote. This effectively disconnected them from their tribal communities and cultural identities. Not only did this impact their involvement in tribal governance, it also hindered their political representation in state and national politics. Additionally, the federal government’s policies of forced assimilation and the establishment of Indian boarding schools, which aimed to eradicate Native cultures and languages, dealt a serious blow to the political representation of Native Americans. This cultural assault hindered Native Americans’ political participation by disconnecting them from their traditional forms of governance and communal decision-making. Native Americans were also not afforded equal opportunities for education and employment, which further limited their political influence.[7] The Indian Reorganization Act of 1934 represented a partial shift in federal policy, allowing tribes to reconstitute their governments and regain some measure of self-determination. This brief positive turn after 1934, however, would not be long lived; following the broader trend, it was undermined by the policy shift that occurred in the 1940s.

The 1940s marked a critical turning point in Native American policy in the United States, heralding a shift that significantly deepened the political underrepresentation of Indigenous peoples. This era was characterized by a series of policy changes and legislative actions that not only neglected the voices and interests of Native American communities but actively marginalized them. During the 1940s, there was a significant transformation in government policies toward Native Americans. These changes led to a reduction of tribal sovereignty and autonomy as the government began considering terminating its responsibilities to these communities.[8] Influential members of Congress advocated for assimilating Native Americans into mainstream society while seeking to shift decision-making authority away from them. Simultaneously, states were pressuring federal authorities to withdraw their obligations toward indigenous populations. The overarching objective was economic and social rehabilitation; however, such policies often disregarded the unique cultural and political needs of these communities.[9] The pivotal shift in Native American policy during this era had long-lasting consequences for their political representation and self-determination.

This shift continued through the 1950s with the relocation policies discussed above. As the 1960s began, however, another shift occurred with the emergence of the African American Civil Rights movement. The Civil Rights movement brought about a significant change in Native American political representation. Though initially aimed at addressing the rights of African Americans, its principles resonated with other marginalized groups, including Native Americans, who also sought equal treatment and non-discrimination. The passage of two legislative acts, the Civil Rights Act of 1964 and the Voting Rights Act of 1965, facilitated greater access to voting for minorities by eliminating discriminatory practices like voter literacy tests and poll taxes that had long plagued native communities. Inspired by these changes, activists emerged from within local tribes seeking self-determination, which ultimately led to increased participation in politics and greater engagement at the local, state, and federal levels.

Although the Voting Rights Act of 1965 aimed to eradicate racial discrimination in voting and grant Native Americans full participation in elections, their communities still faced political underrepresentation due to various challenges. These obstacles included gerrymandering, voter identification requirements, and limited access to polling places on reservations and in rural areas, all of which adversely impacted Native American voters’ capacity to exercise their democratic rights effectively.[10] Furthermore, a lack of representation in both state legislatures and at the federal level persisted through subsequent elections, underscoring an ongoing struggle for inclusive politics that continues today. Even with the advancements made through the Voting Rights Act, these barriers demonstrate how deep-seated inequities continue to deny fair political representation to Indigenous peoples across America. A prime example of the continuing political underrepresentation that followed the Voting Rights Act is the campaigns of the 1972 election. These campaigns underscore the persistent lack of political representation for Native Americans, even in the aftermath of the 1965 Voting Rights Act. Native American concerns remained marginalized as both presidential candidates in the 1972 election, George McGovern and Richard Nixon, focused primarily on broader national issues like foreign policy and economic reform, neglecting specific Native American issues. The campaigns also highlight a historical pattern of unfulfilled promises and pledges of support, further indicating that Native American voices were not adequately heard or represented in the political discourse. Nixon’s decision to reduce the Bureau of Indian Affairs (BIA) budget by nearly $50 million exemplifies a lack of commitment to addressing the unique challenges and needs of Native American communities. Additionally, the campaigns brought attention to the historical trust-based relationship between the United States and Native Americans, which has often gone unfulfilled and been marked by neglected promises.[11] Overall, despite the enactment of the Voting Rights Act of 1965, the political underrepresentation of Native Americans persisted, as demonstrated by the government’s ongoing failure to address their specific concerns and needs, evidenced by the 1972 election campaigns.

Before bringing this study to an end, and in order to provide a more comprehensive picture, it is important to acknowledge the rise of the movement for Native American rights that began to develop toward the end of the time frame discussed here. Serving as a culmination of the enduring disparities and underrepresentation faced by Native Americans for centuries, the Red Power Movement developed in the late 1960s and 1970s. Emerging as a response to these long-standing injustices, the movement sought to address issues such as tribal sovereignty, land rights, cultural preservation, and political activism. The Red Power Movement played a crucial role in raising awareness and advocating for the rights of Native Americans in contemporary American society. While it paved the way for significant progress, Indigenous communities continue to grapple with ongoing challenges, including poverty, healthcare disparities, and political obstacles.[12] These disparities persist, emphasizing the need for continued advocacy and change. However, a comprehensive examination of the Red Power Movement and its contemporary implications lies beyond the scope of this study, which focuses primarily on the historical context and challenges faced by Native Americans during earlier periods.

Ultimately, the underrepresentation, both political and general, detailed in this section shines a light on how the neglect and amnesia surrounding the forced removals of Native Americans have played a significant role in perpetuating the disparities and challenges faced by these communities. The historical narrative reveals how Native Americans have consistently been excluded from meaningful participation in various aspects of American society, including politics, despite their rich cultural heritage and contributions to the nation. This exclusion extends to the denial of basic rights, voting restrictions, and the erosion of tribal sovereignty. Even after legislative efforts like the Voting Rights Act of 1965, which aimed to ensure equal political participation, barriers such as gerrymandering and limited polling access persisted, demonstrating ongoing obstacles to representation. The 1972 election campaigns serve as a poignant example of how Native American concerns have been marginalized in national politics. This pattern culminated in the emergence of the Red Power Movement in the late 1960s and 1970s, which pressed the need for advocacy and change in response to deep-rooted disparities. This history of underrepresentation and discrimination reinforces the idea that acknowledging and addressing these past injustices is a crucial step toward rectifying the ongoing challenges faced by Native American communities and achieving greater equity and inclusivity.

In conclusion, the involuntary displacement of Native American communities from their traditional lands during the late 19th century, and the subsequent ignorance about this period in U.S. history, have had significant repercussions that still impact Indigenous people today. The marginalization and lack of acknowledgment these events received have contributed to ongoing inequalities, limited representation, and hardships faced by Native Americans, and neglecting past injustices has continued a pattern of disregard for indigenous peoples’ rights that perpetuates further neglect and subordination.

Furthermore, the research has highlighted that government policies, such as the Dawes Act and the relocation policies of the 1950s, had profound and lasting effects on Native American communities. These policies, aimed at assimilation and urbanization, disrupted traditional ways of life, eroded tribal sovereignty, and contributed to economic instability. Their consequences, including high poverty levels, limited access to quality healthcare and education, and political underrepresentation, affect Native Americans even today. This study has also shed light on the political underrepresentation Native Americans have faced throughout history. From the exclusionary policies adopted at the Washington Constitutional Convention in 1878 to the harmful transformations in federal policy during the 1940s, Native Americans have been systematically excluded from political processes. Even after the introduction of the Voting Rights Act in 1965, hindrances such as gerrymandering and voter ID requirements continue to limit their political influence.

Additionally, the research has highlighted how the underrepresentation of Native Americans extends beyond politics and encompasses various aspects of American society, including education, healthcare, employment, and media representation. Recognizing the impact of historical amnesia and underrepresentation, it becomes clear that addressing current challenges and fostering development within Native American communities is essential. By shedding light on these historical oversights and their implications for present-day disparities, this research aims to contribute to broader efforts toward achieving equity and justice for Native American populations. Acknowledging their rich cultural heritage, enduring resilience, and ongoing struggle for rights and representation is a crucial step toward rectifying past injustices while building a more inclusive and equitable society for everyone, including Native Americans.

Burt, Larry W. “Roots of the Native American Urban Experience: Relocation Policy in the 1950s.” American Indian Quarterly 10, no. 2 (1986): 85–99. https://doi.org/10.2307/1183982.  

“Dawes Act of 1887.” National Archives Catalog , 2016. https://catalog.archives.gov/id/5641587.  

Jacobs, Michelle R. Indigenous Memory, Urban Reality: Stories of American Indian Relocation and Reclamation. New York: New York University Press, 2023.

Legislative Review 1, no. 12 (1972). https://jstor-org.rider.idm.oclc.org/stable/community.28145368.   

Minderhout, David, and Andrea Frantz. “Invisible Indians: Native Americans in Pennsylvania.” Human Organization 67, no. 1 (2008): 61–67. http://www.jstor.org.rider.idm.oclc.org/stable/44127040.

“Resolution Regarding Native Americans Adopted at the Washington Territory Constitutional Convention, July 17, 1878.” University of Washington Libraries, Special Collections Division, Washington Territory Records, Accession No. 4284-001, Box 3. Accessed September 26, 2023. https://search.ebscohost.com/login.aspx?direct=true&db=edsbas&AN=edsbas.73849FE4&site=eds-live&scope=site.

Treuer, David. The Heartbeat of Wounded Knee: Native America from 1890 to the Present. 2020. https://www.amazon.com/Heartbeat-Wounded-Knee-America-Present/dp/0399573194.

Tyler, S. Lyman. A History of Indian Policy. Washington, D.C.: United States Department of the Interior, Bureau of Indian Affairs, 1973.

Wolfley, Jeanette. “Jim Crow, Indian Style: The Disenfranchisement of Native Americans.” American Indian Law Review 16, no. 1 (1991): 167–202. https://doi.org/10.2307/20068694.

“‘Longest Walk,’ Protest March to Oppose Abrogation of All Native American Treaties and the Genocide of Indian People.” Accessed September 26, 2023. https://jstor.org/stable/community.34557616.


[1]  “Dawes Act of 1887,” National Archives Catalog , 2016, https://catalog.archives.gov/id/5641587.

[2] Larry W. Burt, “Roots of the Native American Urban Experience: Relocation Policy in the 1950s,” American Indian Quarterly 10, no. 2 (1986): 85–99, https://doi.org/10.2307/1183982.

[3] Michelle R. Jacobs, Indigenous Memory, Urban Reality: Stories of American Indian Relocation and Reclamation (New York: New York University Press, 2023).

[4] “‘Longest Walk,’ Protest March to Oppose Abrogation of All Native American Treaties and the Genocide of Indian People,” accessed September 26, 2023, https://jstor.org/stable/community.34557616 .

[5] David Minderhout and Andrea Frantz, “Invisible Indians: Native Americans in Pennsylvania,” Human Organization 67, no. 1 (2008): 61–67, http://www.jstor.org.rider.idm.oclc.org/stable/44127040.

[6] “Resolution Regarding Native Americans Adopted at the Washington Territory Constitutional Convention, July 17, 1878,” University of Washington Libraries, Special Collections Division, Washington Territory Records, Accession No. 4284-001, Box 3, accessed September 26, 2023.

[7] S. Lyman Tyler, A History of Indian Policy (Washington D.C.: United States Department of the Interior, Bureau of Indian Affairs, 1973).

[8] David Treuer, “The Heartbeat of Wounded Knee: Native America from 1890 to the Present,” Amazon, 2020, https://www.amazon.com/Heartbeat-Wounded-Knee-America-Present/dp/0399573194.

[9] S. Lyman Tyler, A History of Indian Policy (Washington D.C.: United States Department of the Interior, Bureau of Indian Affairs, 1973).

[10] Jeanette Wolfley, “Jim Crow, Indian Style: The Disenfranchisement of Native Americans,” American Indian Law Review 16, no. 1 (1991): 167–202, https://doi.org/10.2307/20068694.

[11] Legislative Review 1, no. 12 (1972), https://jstor-org.rider.idm.oclc.org/stable/community.28145368.

[12] David Treuer, “The Heartbeat of Wounded Knee: Native America from 1890 to the Present,” Amazon, 2020, https://www.amazon.com/Heartbeat-Wounded-Knee-America-Present/dp/0399573194.


 

 

IBM and Auschwitz: New Evidence

Edwin Black

Reprinted with permission from https://historynewsnetwork.org/article/1035

Edwin Black is author of IBM and the Holocaust: The Strategic Alliance between Nazi Germany and America’s Most Powerful Corporation (Crown Publishers 2001 and Three Rivers Press 2002). This article is drawn from Mr. Black’s just-released and updated German paperback edition. The new edition includes the discovery of hard evidence linking IBM to Auschwitz. The evidence, detailed here, will be appended to his English-language editions at the next reprinting in the near future.

The infamous Auschwitz tattoo began as an IBM number. In August 1943, a timber merchant from Bendzin, Poland, arrived at Auschwitz. He was among a group of 400 inmates, mostly Jews. First, a doctor examined him briefly to determine his fitness for work. His physical information was noted on a medical record. Second, his full prisoner registration was completed with all personal details. Third, his name was checked against the indices of the Political Section to see if he would be subjected to special punishment. Finally, he was registered in the Labor Assignment Office and assigned a characteristic five-digit IBM Hollerith number, 44673.

The five-digit Hollerith number was part of a custom punch card system devised by IBM to track prisoners in Nazi concentration camps, including the slave labor at Auschwitz.

The Polish timber merchant’s punch card number would follow him from labor assignment to labor assignment as Hollerith systems tracked him and his availability for work, and reported the data to the central inmate file eventually kept at Department DII. Department DII of the SS Economics Administration in Oranienburg oversaw all camp slave labor assignments, utilizing elaborate IBM systems.

Later in the summer of 1943, the Polish timber merchant’s same five-digit Hollerith number, 44673, was tattooed on his forearm. Eventually, during the summer of 1943, all non-Germans at Auschwitz were similarly tattooed. Tattoos, however, quickly evolved at Auschwitz. Soon, they bore no further relation to Hollerith compatibility for one reason: the Hollerith number was designed to track a working inmate—not a dead one. Once the daily death rate at Auschwitz climbed, Hollerith-based numbering simply became outmoded. Soon, ad hoc numbering systems were inaugurated at Auschwitz. Various number ranges, often with letters attached, were assigned to prisoners in ascending sequence. Dr. Josef Mengele, who performed cruel experiments, tattooed his own distinct number series on “patients.” Tattoo numbering schemes ultimately took on a chaotic incongruity all their own as an internal Auschwitz-specific identification system.

However, Hollerith numbers remained the chief method Berlin employed to centrally identify and track prisoners at Auschwitz. For example, in late 1943, some 6,500 healthy, working Jews were ordered to the gas chamber by the SS. But their murder was delayed for two days as the Political Section meticulously checked each of their numbers against the Section’s own card index. The Section was under orders to temporarily reprieve any Jews with traces of Aryan parentage.

Sigismund Gajda was another Auschwitz inmate processed by the Hollerith system. Born in Kielce, Poland, Gajda was about 40 years of age when on May 18, 1943, he arrived at Auschwitz. A plain paper form, labeled "Personal Inmate Card," listed all of Gajda's personal information. He professed Roman Catholicism, had two children, and his work skill was marked "mechanic." The reverse side of his Personal Inmate Card listed nine previous work assignments. Once Gajda's card was processed by IBM equipment, a large indicia in typical Nazi Gothic script was rubber-stamped at the bottom: "Hollerith erfasst," or "Hollerith registered." Indeed, that designation was stamped in large letters on hundreds of thousands of processed Personal Inmate Cards at camps all across Europe. The Extermination by Labor campaign itself depended upon specially designed IBM systems that matched worker skills and locations with labor needs across Nazi-dominated Europe. Once the prisoner was too exhausted to work, he was murdered by gas or bullet. Exterminated prisoners were coded "six" in the IBM system.

The Polish timber merchant's Hollerith tattoo, Sigismund Gajda's inmate form, and the victimization of millions more at Auschwitz live on as dark icons of IBM's conscious 12-year business alliance with Nazi Germany. IBM's custom-designed prisoner-tracking Hollerith punch card equipment allowed the Nazis to efficiently manage the hundreds of concentration camps and sub-camps throughout Europe, as well as the millions who passed through them. Auschwitz' camp code in the IBM tabulation system was 001.

Nearly every Nazi concentration camp operated a Hollerith Department known as the Hollerith Abteilung. The three-part Hollerith system of paper forms, punch cards and processing machines varied from camp to camp and from year to year, depending upon conditions. In some camps, such as Dachau and Storkow, as many as two dozen IBM sorters, tabulators, and printers were installed. Other facilities operated punchers only and submitted their cards to central locations such as Mauthausen or Berlin. In some camps, such as Stuthoff, the plain paper forms were coded and processed elsewhere. Hollerith activity, whether paper, punching or processing, was frequently, but not always, located within the camp itself, consigned to a special bureau called the Labor Assignment Office, known in German as the Arbeitseinsatz. The Arbeitseinsatz issued the all-important life-sustaining daily work assignments, and processed all inmate cards and labor transfer rosters.

IBM did not sell any of its punch card machines to Nazi Germany. The equipment was leased by the month. Each month, often more frequently, authorized repairmen, working directly for or trained by IBM, serviced the machines on-site–whether in the middle of Berlin or at a concentration camp. In addition, all spare parts were supplied by IBM factories located throughout Europe. Of course, the billions of punch cards continually devoured by the machines, available exclusively from IBM, were extra.

IBM’s extensive technological support for Hitler’s conquest of Europe and genocide against the Jews was extensively documented in my book, IBM and the Holocaust, published in February 2001 and updated in a paperback edition. In March of this year, The Village Voice broke exclusive new details of a special IBM wartime subsidiary set up in Poland by IBM’s New York headquarters shortly after Hitler’s 1939 invasion. In 1939, America had not entered the war, and it was still legal to trade with Nazi Germany. IBM’s new Polish subsidiary, Watson Business Machines, helped Germany automate the rape of Poland. The subsidiary was named for its president Thomas J. Watson.

Central to the Nazi effort was a massive 500-man Hollerith Gruppe, installed in a looming brown building at 24 Murnerstrasse in Krakow. The Hollerith Gruppe of the Nazi Statistical Office crunched all the numbers of plunder and genocide that allowed the Nazis to systematically starve the Jews, meter them out of the ghettos and then transport them to either work camps or death camps. The trains running to Auschwitz were tracked by a special guarded IBM customer site facility at 22 Pawia in Krakow. The millions of punch cards the Nazis in Poland required were obtained exclusively from IBM, including one company print shop at 6 Rymarska Street across the street from the Warsaw Ghetto. The entire Polish subsidiary was overseen by an IBM administrative facility at 24 Kreuz in Warsaw.

The exact address and equipment arrays of the key IBM offices and customer sites in Nazi-occupied Poland have been discovered. But no one has ever been able to locate an IBM facility at, or even near, Auschwitz. Until now. Auschwitz chief archivist Piotr Setkiewicz finally pinpointed the first such IBM customer site. The newly unearthed IBM customer site was a huge Hollerith Büro. It was situated in the I.G. Farben factory complex, housed in Barracks 18, next to German Civil Worker Camp 7, about two kilometers from Auschwitz III, also known as Monowitz Concentration Camp. Auschwitz’ Setkiewicz explains, “The Hollerith office at IG Farben in Monowitz used the IBM machines as a system of computerization of civil and slave labor resources. This gave Farben the opportunity to identify people with certain skills, primarily skills needed for the construction of certain buildings in Monowitz.”

By way of background, what most people call “Auschwitz” was actually a sprawling hell comprised of three concentration camps, surrounded by some 40 subcamps, numerous factories and a collection of farms in a surrounding captive commercial zone. The original Auschwitz became known simply as Auschwitz I, and functioned as a diversified camp for transit, labor and detention. Auschwitz II, also called Birkenau, became the infamous extermination center, operating gas chambers and ovens. Nearby Auschwitz III, known as Monowitz, existed primarily as a slave labor camp. Monowitz is where IBM’s bustling customer site functioned.

Many of the long-known paper prisoner forms stamped Hollerith Erfasst, or "registered by Hollerith," indicated the prisoners were from Auschwitz III, that is, Monowitz. Now Auschwitz archivist Setkiewicz has also discovered about 100 Hollerith machine summary printouts of Monowitz prisoner assignments and details generated by the I.G. Farben customer site. For example, Alexander Kuciel, born August 12, 1889, was in 1944 deployed as a slave carpenter, skill coded 0149, and his Hollerith printout is marked "Sch/P," the Reich abbreviation for Schutzhäftling/Pole. Schutzhäftling/Pole means "Polish political prisoner." The giant Farben facilities, also known as "I.G. Werk Auschwitz," maintained two Hollerith Büro staff contacts, Herr Hirsch and Herr Husch. One key man running the card index systems was Eduard Müller. Müller was a fat, aging, ill-kempt man, with brown hair and brown eyes. Some said, "He stank like a polecat." A rabid Nazi, Müller took special delight in harming inmates from his all-important position in camp administration.

Comparison of the new printouts to other typical camp cards shows the Monowitz systems were customized for the specific coding Farben needed to process the thousands of slave workers who labored and died there. The machines were probably also used to manage and develop manufacturing processes and ordinary business applications. The machines almost certainly did not maintain extermination totals, which were calculated as “evacuations” by the Hollerith Gruppe in Krakow. At press time, the diverse Farben codes and range of machine uses are still being studied. It is not known how many additional IBM customer sites researchers will discover in the cold ashes of the expansive commercial Auschwitz zone.

A Hollerith Büro, such as the one at Auschwitz III, was larger than a typical mechanized concentration camp Hollerith Department. A Büro was generally comprised of more than a dozen punching machines, a sorter and one tabulator. Leon Krzemieniecki was a compulsory worker who operated a tabulator at the IBM customer site at the Polish railways office in Krakow that kept track of trains going to and from Auschwitz. He recalls, “I know that trains were constantly going from Krakow to Auschwitz–not only passenger trains, but cargo trains as well.” Krzemieniecki, who worked for two years with IBM punchers, card sorters and tabulators, estimates that a punch card operation for so large a manufacturing complex as Farben “would probably require at least two high-speed tabulators, four sorters, and perhaps 20 punchers.” He added, “The whole thing would probably require 30-40 persons, plus their German supervisors.”

The new revelation of IBM technology in the Auschwitz area constitutes the final link in the chain of documentation surrounding Big Blue's vast enterprise in Nazi-occupied Poland, supervised at first directly from its New York headquarters, and later through its Geneva office. Jewish leaders and human rights activists were again outraged. "This latest disclosure removes any pretext of deniability and completes the puzzle that has been emerging about IBM in Poland," declared Malcolm Hoenlein, vice president of the New York-based Conference of Presidents of Major Jewish Organizations. "When put together, the picture that emerges is most disturbing," added Hoenlein. "IBM must confront this matter honestly if there is to be any closure."

Marek Orski, state historian of the museum at Poland's Stuthoff Concentration Camp, has distinguished himself as that country's leading expert on the use of IBM technology at Polish concentration camps. "This latest information," asserts Orski, "proves once more that IBM's Hollerith machines in occupied Poland were functioning in the area of yet another concentration camp, in this case Auschwitz-Monowitz, something completely unknown until now. It is yet another significant revelation in what has become the undoubted fact of IBM's involvement in Poland. Now we need to compile more documents identifying the exact activity of this Hollerith Büro in Auschwitz Monowitz."

Krzemieniecki is convinced obtaining such documents would be difficult. “It would be great to have access to those documents,” he said, “but where are they?” He added, “Please remember, I witnessed in 1944, when the war front came closer to Poland, that all the IBM machines in Krakow were removed. I’m sure the Farben machines were being moved at the same time. Plus, the Germans were busy destroying all the records. Even still,” he continues, “what has been revealed thus far is a great achievement.”

Auschwitz historians were originally convinced that there were no machines at Auschwitz, that all the prisoner documents were processed at a remote location, primarily because they could find no trace of the equipment in the area. They even speculated that the stamped forms from Auschwitz III were actually punched at the massive Hollerith service at Mauthausen concentration camp. Indeed, even the Farben Hollerith documents had been identified some time ago at Auschwitz, but were not understood as IBM printouts. That is, not until the Hollerith Büro itself was discovered. Archivists only found the Büro because it was listed in the I.G. Werk Auschwitz phone book on page 50. The phone extension was 4496. "I was looking for something else," recalls Auschwitz' Setkiewicz, "and there it was." Once the printouts were reexamined in the light of IBM punch card revelations, the connection became clear.

Setkiewicz says, “We still need to find more similar identification cards and printouts, and try to find just how extensive was the usage in the whole I.G. Farben administration and employment of workers. But no one among historians has had success in finding these documents.”

In the current climate of intense public scrutiny of corporate subsidiaries, IBM’s evasive response has aroused a renewed demand for accountability. “In the day of Enron and Tyco,” says Robert Urekew, a University of Louisville professor of business ethics, “we now know these are not impersonal entities. They are directed by people with names and faces.” Prof. Urekew, who has studied IBM’s Hitler-era activities, continued, “The news that IBM machines were at Auschwitz is just the latest smoking gun. For IBM to continue to stonewall and hinder access to its New York archives flies in the face of the focus on accountability in business ethics today. Since the United States was not technically at war with Nazi Germany in 1939, it may have been legal for IBM to do business with the Third Reich and its camps in Poland. But was it moral?”

Even some IBM employees are frustrated by IBM’s silence. Michael Zamczyk, for example, is a long-time IBM employee in San Jose, California, working on business controls. A loyal IBMer, Zamczyk has worked for the company for some 28 years. He is also probably the only IBM employee who survived the Krakow ghetto in 1941 and 1942. Since revelations about IBM’s ties to Hitler exploded into public view in February 2001, Zamczyk has been demanding answers—and an apology–from IBM senior management.

"Originally," says Zamczyk, "I was just trying to determine if it was IBM equipment that helped select my father to be shipped to Auschwitz, and if the machines were used to schedule the trains to Auschwitz."

Zamczyk started writing letters and emails, but to no avail. He could not get any concrete response about IBM's activities during the Hitler era. "I contacted senior management, all the way up to the president, trying to get an answer," states Zamczyk. "Since then, I have read the facts about IBM in Poland, about the railroad department at 22 Pawia Street in Krakow, and I read about the eyewitnesses. Now I feel that IBM owes me, as an IBM employee, an apology. And that is all I am looking for."

Zamczyk was met by stony silence from IBM executives. "The only response I got," he relates, "was basically telling me there would be no public or private apology. But I am still waiting for that apology and debating what to do next."

Repeated attempts to obtain IBM reaction to the newest disclosure were rebuffed by IBM spokesman Carol Makovich. I phoned her more than a dozen times, but she did not respond, or grant me permission to examine Polish, Brazilian and French subsidiary documents at the company’s Somers, New York archives. Nor has the company been forthcoming to numerous Jewish leaders, consumers and members of the media who have demanded answers.

At one point, Makovich quipped to a Reuters correspondent, “We are a technology company, we are not historians.”


 

Local History: The American Revolution in the Finger Lakes

Reprinted from New York Almanack based on an essay from the National Park Service’s Finger Lakes National Heritage Area Feasibility Study. https://www.newyorkalmanack.com/2023/09/american-revolution-finger-lakes/#more-98398

Initially, the Haudenosaunee Confederacy (Iroquois) claimed neutrality during the conflict between Britain and the colonists, seeing the disagreement as a civil war and valuing loyalty to their families and to their lands above all else. When the political discontent erupted into the American Revolutionary War, the member nations of the Haudenosaunee Confederacy split their support between the British and newly formed American forces. The majority of nations and individual members supported the British under the belief that those nations would be more likely to keep their relative independence and land under continued British rule, while the Oneida and Tuscarora backed the American Colonists.

As with many American families, allegiance was not clear-cut, and in some cases it was split on a person-by-person basis, which destabilized the clan-based society. What had started as a European civil war on North American soil soon turned the Confederacy against itself, undermining the social unity and political stability that the Six Nations had enjoyed for centuries. In 1778, Loyalists and members of the British-backed nations participated in destructive raids that crippled Continental forces and destroyed frontier settlements in New York and Pennsylvania. Fearing that the New York frontier would be pushed east to the Hudson River if decisive action was not taken, General George Washington ordered General John Sullivan to lead four brigades of men — a sizable portion of the Continental Army — on a scorched-earth campaign that would limit the Haudenosaunee's ability to attack in the future.

Washington tasked Sullivan with launching a terror campaign to destroy the food supply of the Cayuga and Seneca Nations in the heart of the Finger Lakes and to reduce the Cayuga and Seneca's forces. Smaller expeditions were tasked with destroying Seneca settlements in western Pennsylvania and Onondaga settlements in Central New York. General Sullivan and his second-in-command, General James Clinton, met in Tioga near the Pennsylvania-New York border and began their campaign by destroying the Munsee Delaware settlement of Chemung in present-day Chemung County. Instead of deploying the guerrilla tactics that had long served the Haudenosaunee well, Confederacy war chiefs and the meager British forces available decided to retaliate with a standing battle.

The Battle of Newtown on August 29, 1779, ended in a British and Indian retreat and destroyed morale among the British-allied Confederacy nations, who now chose to flee proactively to other nearby settlements. For the next two weeks, Sullivan's forces moved from Seneca Lake to Canandaigua Lake to Chenussio — a Seneca stronghold near present-day Leicester in Livingston County that included 128 multi-family longhouses. By the end of the campaign, Sullivan's men had destroyed more than 40 Haudenosaunee villages, at least 160,000 bushels of corn, and countless pounds of stored vegetables and fruit, while suffering only 40 casualties.

While the American forces did not take Haudenosaunee prisoners, the Sullivan Campaign destroyed the nations' capacity to wage war. By the end of September 1779, more than 5,000 nation members had arrived at the British Fort Niagara expecting food, clothing, and shelter in the face of their catastrophic losses at the hands of the Americans. Instead of lessening the threat to frontier settlements, the Sullivan Campaign increased the animosity of Natives and British alike, laying the groundwork for fierce fighting on the New York frontier as British-backed Indian raids continued through the 1780s.

Local History: The Great Depression in New York City

Reprinted from New York Almanack based on an article from the Blackwell’s Almanac, a publication of the Roosevelt Island Historical Society. https://www.newyorkalmanack.com/2023/09/great-depression-in-new-york-city/

As the 1920s advanced, the economy soared. But with that dramatic expansion came irrational exuberance and unchecked speculation: stock prices reached levels that had no basis in reality; margin purchases were rampant; banks handed out loans lavishly and imprudently; and giddy product production resulted in a vast oversupply of goods. On Tuesday, October 29, 1929, it all came crashing down. This is the story of the Great Depression in New York City.

After an erratic week in which stocks, including blue chip stocks, mostly declined, waves of panicked investors sold off their shares, driving the market ever downward. On that one day, now known as Black Tuesday, the market lost $14 billion in value; over the ensuing week, it erased another $30 billion — eventually suffering a staggering loss of 89.2% from its peak in early September.

Bank failures and business bankruptcies followed, presaging a decade of unprecedented economic hardship. New York City came to be viewed as “the symbolic capital of the Depression, the financial capital where it had started, and the place where its effects were most keenly felt.” Many residents lost their savings, their jobs and their homes. By 1932, half the city’s factories were closed, almost one-third of New Yorkers were unemployed (vs. one-quarter of the rest of the country and over one-half in Harlem), and some 1.6 million residents were on relief. Those who remained employed and therefore ineligible for the dole were often forced to take severe pay cuts.

At the time of the crash, under Mayor Jimmy Walker, there were few centralized municipal services that could be tapped for jobs or rescue: there was no central traffic, highway, or public works department; street-cleaning was a function of individual boroughs; there were five separate parks departments; unemployment insurance was non-existent and, in the beginning, the Department of Public Welfare had no funds available. New York City, like most cities, was dependent on charitable institutions and almshouses to succor the poor, the homeless and the hungry. Yet these organizations publicly admitted their inability to meet the heavy demands being made of them.

In March 1930, 35,000 out-of-work protesters marched toward City Hall as part of International Unemployment Day organized by the Communist Party. They were met with violent attack by the New York Police Department. Several years later, it was the Black and Latino population’s turn. In addition to being jobless, they had to deal with blatant discrimination, including exclusion from more than 24 of the city’s trade unions and rejection at public work sites. With tempers boiling, a furious Harlem mob vandalized white-owned stores. Some 4,000 individuals took part, inflicting over $2 million in damages, resulting in 30 hospitalizations and several deaths. While an investigation into discriminatory practices was launched, little came of it and the situation continued unchanged.

Riots in New York flared and petered out. What didn’t peter out was the sheer fight to survive – for the hungry, the need to eat, and for the homeless, the need to find shelter. Breadlines and soup kitchens were one aspect of the fight. People lined up daily in long, snaking queues outside bakeries or pantries to score a ration of day-old bread or thin soup. To hide their humiliation from neighbors, many would leave their homes dressed up as if they were going to work. Once on the line, they just stared straight ahead, refusing to interact with their downtrodden peers — in fact, refusing to admit to themselves where they were.

Thousands evicted from their homes took to living in shacks in parks or backstreets. As more and more homeless joined these camps, they grew into little shantytowns nicknamed “Hoovervilles” in condemnation of the inactivity of President Herbert Hoover to remedy the situation. The largest such settlement was located next to the Reservoir in Central Park. Ironically, many of the Hooverville men were construction tradesmen — bricklayers, stone masons, carpenters — who had helped build the luxury buildings surrounding the park and who now set to building their own shanties out of scavenged materials. Despite the skill and artistry with which these abodes were constructed, they were illegal; so both local and federal authorities regularly raided the settlements, destroying the shelters and scattering their inhabitants.

Conditions were dire and pleading letters from city officials and residents alike piled up in the Mayor’s office. Finally, in October 1930, Jimmy Walker created the Mayor’s Official Committee for Relief of the Unemployed and Needy, and things started to happen. By November there was:

  • a City Employment Bureau, which obviated the problem of job-seekers having to pay private employment firms;
  • a stop to the eviction of poor families for rent arrears;
  • a large-scale investigation by the police to determine needs in all 77 precincts;
  • a windfall of contributions to unemployment relief from police and other city employees;
  • an expansion of city lodging facilities; and
  • a special Cabinet Committee to deal with questions of food, clothing and rent.

In the first eight months of its existence, the Committee raised some $1.6 million. Direct relief funds were paid to 11,000 families, while 18,000 tons of food, including Kosher food, was given out to almost a million families. (Night patrolmen spent a good part of their shifts packing and wrapping these food parcels.) The money also paid for coal, shoes and clothing. Another city agency, the Welfare Council, disbursed over $12 million for relief and emergency work wages. These funds too came from voluntary donations. Private citizens contributed; sports teams organized exhibition matches (for example Notre Dame football vs. the New York Giants); and Broadway staged special benefit performances.

For a while spirits rose and hopes of normalcy returned. But by April 1931, it was clear that private welfare measures and one-off City actions could not keep up with the growing distress. Help was needed and it came from a now-familiar individual — Franklin Delano Roosevelt, not as president, but as Governor of New York State. Despairing of any constructive efforts by the Federal government, Roosevelt, unique among governors in accepting responsibility for his constituents, declared: "upon the State falls the duty of protecting and sustaining those of its citizens who, through no fault of their own, find themselves… unable to maintain life." By August 1931, foreshadowing elements of the future New Deal, a robust public works program was in effect to reduce unemployment. State income tax was increased by 50% and the Comptroller authorized the issuance of revenue bonds at both the state and local level. Some would say that New York City was in better shape than many other cities. Yet it was still on the critical list.

It wasn't until 1932, when Walker resigned amid an investigation for graft and Herbert Hoover was voted out of office, that the way was paved for major innovations. Newly elected President FDR embodied the optimism of his catchy campaign song, "Happy Days Are Here Again." Within a couple of years, he promulgated the historic, blockbuster New Deal, and working in close partnership with newly elected Mayor Fiorello LaGuardia, transformed both the country and the City. The "New Deal" New York — the most populous American city with almost seven million residents — was the single greatest beneficiary of the New Deal's Works Progress Administration (WPA) in the entire U.S.

Under the WPA, more than a dozen federal agencies paid for the labor and materials to support hundreds of projects designed to put New Yorkers back to work. The New Deal built housing, schools, courthouses, roads, hospitals and health clinics, libraries, post offices, bridges, and highways. It was the impetus and money behind the Triborough Bridge, LaGuardia Airport, the Lincoln Tunnel, and the East River (FDR) Drive. It also gave the city an extensive system of recreational facilities, including swimming pools, playgrounds, ball fields, hiking trails, and parks.

But construction wasn't the New Deal's only beneficiary. FDR, Eleanor Roosevelt and Harry Hopkins (head of the WPA) recognized that funding culture and practitioners of culture was just as important. ("Hell, they've got to eat just like other people," Hopkins is reported to have said). So, jobless artists, designers, craftsmen and photographers were hired to embellish public spaces with murals and sculptures, while posters publicized other WPA programs, and illustrations, photos and crafts found their way into newly opened galleries and respected museums. Playwrights, writers, actors and singers were paid to create theatrical shows — even Yiddish and German theater. And out-of-work musicians and composers of all stripes (classical, folk, jazz, light opera) were employed to give concerts indoors and out. At the same time, New Deal legislation began strengthening workers' rights by allowing them to organize, earn a minimum wage and, as discussed below, obtain unemployment compensation and sign up for Social Security.

When Frances Perkins, a fierce advocate of social justice and economic security, was tapped as Secretary of Labor, she brought a list of proposals for FDR’s approval. Among them were unemployment insurance and what she called “old age” insurance. Both of them knew that the development of such programs would encounter many obstacles, not the least of which would be challenges to their constitutionality.

Be that as it may, in 1935, the enabling legislation passed overwhelmingly and FDR authorized the establishment of unemployment insurance and Social Security. And in 1937, the Supreme Court affirmed the constitutionality of levying taxes to fund both programs. IBM won the bid to create the largest and most complicated data processing system ever built. It even designed novel equipment for the unprecedented task of enrolling some 30 million employers and workers, and registering their contributions into the Social Security system for later retirement payouts. According to Perkins, “Nothing [other than the Great Depression] would have bumped the American people into a social security system except something so shocking, so terrifying, as that depression.”

Above and beyond the homeless, 30% of the City's housed population lived in deteriorating, squalid tenements. There were other slums deemed "unfit for human habitation." The National Industrial Recovery Act of 1933 authorized the clearance of slums, repair of salvageable structures and construction of low cost housing. And the country's very first "public housing" — a previously unheard-of concept — was built in New York under the newly formed New York City Housing Authority (NYCHA). The first three public projects were: First Houses, between First Avenue and Avenue A, from Second to Third Streets in the East Village; Williamsburg Houses, Scholes to Maujer Streets, Leonard Street to Bushwick Avenue, in Williamsburg, Brooklyn; and Harlem River Houses, Seventh Avenue to Macombs Place, Harlem River Drive, and 151st to 153rd Streets in Harlem. Their public ownership represented a radical step that both created jobs and sheltered people in up-to-date homes. By 1941, nine such projects had been developed in New York City, providing 11,570 units. They are all still with us and the first three have been designated New York City landmarks.

The sheer range of educational programs implemented by the New Deal was remarkable. From kindergarten to college (for example, Hunter College, Brooklyn College, the Merchant Marine Academy in the Bronx), new buildings expanded the student population. Thousands of teachers were hired, and adjunctive programs such as preschool, work-study programs for young people, and vocational classes for adults were instituted. Community education classes were held in libraries, settlement houses, local facilities, trade union halls, park buildings, and even on the radio. There was no end to what a willing individual could learn, including driving, English, home arts, visual arts and new vocational skills. Much of the funds secured for New York City can be directly attributed to LaGuardia’s force of personality. According to Roosevelt, he would show up in Washington “and tell me a sad story. The tears run down my cheeks and tears run down his cheeks and the first thing I know he has wrangled another $50,000,000.”

For many City residents, lack of work had devolved into declining health, malnutrition, and increasing rates of infant mortality. New Deal funding produced new hospitals and neighborhood health clinics. The latter were often located in or near public housing developments and provided free medical and dental care, including immunizations, for all ages. The clinic doctors and nurses also visited homes and schools, and gave classes in healthy living. The clinics even sent housekeepers to help out where parents were ill. Access to regular health care was a first for many New Yorkers and its effects were incontestable: decreased infant mortality, a drop in serious illness and a decline in the suicides that so darkened the Depression years. It took entry into the Second World War to completely obliterate the Great Depression. Tens of thousands of men went off to battle, while the rest of the country was galvanized into full employment by the war effort. Still, the New Deal, with its plethora of alphabet soup agencies, was nothing short of miraculous. It carried the country and New York City through one of the most challenging eras in our history. It transformed the relationship of government to its citizens — embodying a dynamism that has strengthened New York through the years and continues to empower it to this day.

The Trumpist Supreme Court: Off the Rails of Democracy

Norman Markowitz

Rage and confusion over the recent Supreme Court decisions are sweeping the nation. The Roe v. Wade decision (1973) establishing women's reproductive rights has been overturned. A New York State law prohibiting the carrying of concealed guns, passed in response to escalating shootings and deaths, has been declared unconstitutional. The court has sharply reduced the regulatory powers of the Environmental Protection Agency, established in 1970. This comes after decades of scientific research showing the dangers of climate change and global warming.

What is the logic behind this? There is a standard used in philosophy which should be applied to the Court’s recent decisions. Statements, or assertions, should be judged by their “validity and reliability.” Are they true statements in terms of logic, reason, and consistency (validity)? Is the evidence (facts, data) used to support the statement true (reliability)? I will use this standard to look at the Court’s rulings.

The Constitution was a political compromise among merchant capitalists, landlords, slaveholders, creditors, and debtors on a variety of issues — slavery, the payment of debts, and the regulation of trade. It cannot be interpreted like the Jewish Torah, the Christian Gospels, or the Muslim Koran — sacred, unchanging texts. And the Supreme Court has no right to interpret legislation passed by Congress or the directives of the president, since the Constitution did not give the Court the power of judicial review.

However, that power was in effect taken by the Court in 1803 in a brilliant maneuver by Chief Justice John Marshall in Marbury v. Madison. The court has maintained the power of judicial review for over two centuries, often adjusting its interpretations to major changes in society.

The representatives who drafted and approved the Constitution, as well as the former colonies/states which ratified it, all rejected the principle of universal suffrage. The leaders of the revolution associated the term "democracy" with mob rule. Property qualifications for voting in federal elections were the established rule. If one took original intent seriously, the Court would have the power to establish property qualifications for voting, since there is no constitutional amendment abolishing property qualifications for voting, as there are amendments abolishing slavery and giving women the right to vote.

When the Constitution was drafted and enacted, English common law defined life as existing when a fetus could be felt moving or kicking in the mother's womb, called "quickening." If the mother claimed that the fetus had been aborted before this "quickening," she was held harmless. Laws banning abortion and contraception, and barring pamphlets and manuals about both from the mails, were enacted at the state and federal levels in the late 19th century as part of a movement led by the Reverend Anthony Comstock, organizer of the Society for the Suppression of Vice. These laws were part of a backlash against the growing movement for women's civil rights, equality under the law, and the right to vote. The women's rights/women's liberation movement of the 1960s, following in the path of the civil rights/Black liberation movement, led the successful campaign to repeal these laws, which finally resulted in Roe v. Wade, a century after they began to be enacted.

The Court’s decision invalidating a New York state law prohibiting the carrying of concealed handguns is also unreliable. Here the evidence is direct and incontrovertible. The Second Amendment to the Constitution states, “A well-regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.” But in English law and in colonial theory and practice, as Joshua Zeitz in an excellent analysis argues, the amendment never meant that all citizens had the right to bear arms. This right “was inextricably connected to the citizen’s obligation to serve in a militia and to protect the community from enemies domestic and foreign.” And “well-regulated militias” meant militias constituted by legitimate authorities, not private groups like the later KKK, Nazi storm troopers, or self-proclaimed state militias.

Zeitz makes the important point that James Madison, a major author of the Constitution and the Bill of Rights, had earlier drafted legislation in the Virginia legislature barring individuals from openly carrying and displaying guns, like the present New York State law that the Court has declared unconstitutional. The purpose of the amendment was clearly to prevent a government from doing what Britain did in the aftermath of the Boston Tea Party: disperse the colonial legislature and its militia and in effect declare martial law. Also, the guns in question fired single “balls,” not bullets, and had very limited range and accuracy. Today’s AR-15 rifles, for example, used in recent mass shootings, have greater fire power and accuracy than the assault rifles used during World War II and the Korean War.

The Supreme Court's other decisions, on the regulatory powers of the Environmental Protection Agency and the right of a school employee to engage in religious action, are neither valid in their relationship to the Constitution nor reliable in regard to their factual assertions. They repudiate more than a century of law and policy on the federal regulation of industry, as well as the post–Civil War 14th Amendment, which defends the civil rights and liberties of citizens from infringement or denial by the states.

The Supreme Court and the judiciary have been the most conservative section of the federal government throughout most of U.S. history. The fact that the justices are not elected and leave the bench only through impeachment, resignation, or death explains this.

The courts have in the past and once more in recent decades used the Commerce Clause of the Constitution to declare unconstitutional legislation that regulates business and promotes social welfare. Beginning in the 1880s, they declared corporations “persons” to give them 14th Amendment protections from regulation and taxation by the states, and have over and over again used the 10th Amendment to support states’ rights.

The political nature of the Supreme Court from its very inception is indisputable. The Court, for example, represented the interests of the slaveholder class from the administration of George Washington (himself a slaveholder) up to the Civil War. But as the nation changed, industrial capitalism grew, and the anti-slavery movement became broader, the demands of the slaveholders and the actions of their Supreme Court became more extreme. The Dred Scott decision (1857), which in effect repealed the earlier restrictions on the expansion of slavery in the Western territories, supporting legislation advanced by pro-slavery congresses and presidents, reflected this development. As an afterthought, the slaveholder-dominated Supreme Court claimed that the authors of the Constitution had not intended any Black person, slave or free, to have the rights of an American citizen, an expression of “original intent” which both enraged and strengthened the increasingly militant anti-slavery national coalition.

With the defeat of the Confederacy, slavery was abolished through constitutional amendment in all the states, and the former Confederate states now under Union army occupation had to ratify the amendment to regain admission to the Union. With the support of President Andrew Johnson, a pro-Union former senator from Tennessee (and himself a former slaveholder), they did so while enacting labor codes that in effect declared the former slaves to be unemployed vagrants and returned them to the “custodial care” of their former owners.

In response to these acts, Thaddeus Stevens, Charles Sumner, and other militant anti-slavery leaders of the Republican Party proposed a second constitutional amendment to establish national citizenship and protect the civil rights and civil liberties of the nearly 4 million former slaves. They did this for two reasons. They feared that President Johnson would veto the civil rights legislation they were advancing in Congress. And even if they were able to override his veto, they feared that the Supreme Court, where the now former slaveholders remained a powerful force, would declare such legislation unconstitutional.

The 14th Amendment establishing national citizenship was passed, followed by the 15th, which extended the right to vote. However, the war was a victory for the industrial capitalists and their banker allies, who within a generation betrayed both the former slaves and the workers and farmers who saw Civil War policies like the Homestead Act and the creation of land grant colleges as advancing their class interests.

The Supreme Court and the federal judiciary in the aftermath of the Civil War fiercely defended the interests of "big business" against organized farmers, workers, state governments, and the federal government. In the 1880s, the Supreme Court in a series of decisions invalidated the civil rights acts of the Reconstruction era and gutted the 14th Amendment's protection of citizenship rights against state government policies. States were permitted to ignore the Civil Rights Act of 1875, which banned exclusion and discrimination in public accommodations. That protection would only be restored by the Civil Rights Act of 1964, after nearly a century of de jure segregation.

In 1896, the Plessy v. Ferguson decision gave states the right to establish segregation by law, using as a cover the principle of “separate but equal” under such laws, although it was clear to everyone that the systematic exclusion of African Americans from public schools, public employment, public transportation, and commercial establishments was crudely unequal. The courts also endorsed state laws which denied the overwhelming majority of Black people the right to vote; the convict lease system, a form of slave labor for prisoners; and state “poll taxes,” which primarily discriminated against poor whites (in most places African Americans had been already disenfranchised).

At the same time, the Court in the 1880s took the 14th Amendment's defense of the rights of "persons" and applied it to business and corporations, declaring state laws regulating business to be unconstitutional. At the time the 14th Amendment was proposed and enacted, everyone understood that the "persons" referred to were the 4 million former slaves, no longer enslaved under law, but not yet citizens.

But this was just the beginning. An early, modest federal income tax (a surcharge on high incomes) was declared unconstitutional in the Pollock case. The Court negated the Sherman Anti-Trust Act (1890) by declaring that the federal government and the states could only regulate commerce — not manufacture — under the Constitution. In an industrial society, regulation became a farce.

Decades later, a constitutional amendment gave the federal government the right to levy income taxes, and Congress passed legislation that, to a limited extent, regulated trade and restructured the banking system. However, the Court routinely declared unconstitutional state laws protecting the right of workers to organize unions, providing for the health and safety regulation of workplaces, minimum wages, and the 1916 federal law outlawing child labor.

It was not until the Great Depression of the 1930s, which saw the great upsurge of labor with the Communist Party playing a central role, that the New Deal government enacted the most important labor and social welfare legislation since the abolition of slavery and battled to compel the judiciary to accept these major reforms in the interests of the working class and the whole people.

The struggle for major judicial reform went back to the late 19th century. It sought to de-emphasize precedent, the "dead hand" of previous decisions, and make the law respond to social changes and realities, connecting the "facts" as they existed in the present with past decisions under the law. Law professor Roscoe Pound and attorney Louis Brandeis were the champions of this approach to law, called "legal realism." Brandeis especially popularized the doctrine in leading campaigns against corporate monopolistic price fixing and business corruption of public officials, which earned him the name "the People's Attorney."

He also developed a legal brief which incorporated social research (the Brandeis brief) in arguing cases. His fame in the early 20th-century Progressive movement led Woodrow Wilson to appoint him to the Supreme Court, where he joined with Justice Oliver Wendell Holmes to represent a minority that supported the regulation of industry, social legislation, and the defense of First Amendment civil liberties. Regarding civil liberties, the minority supported freedom of speech, assembly, and association unless, in Holmes’s language, there was a “clear and present danger” to society, and not just a “dangerous tendency” that certain acts might lead to others, which was the conservative position.

In the 1936 elections, Roosevelt campaigned against the old-guard Court and the "economic royalists" whom they represented, reviving the language of the American revolution in his and the New Deal's sweeping victory. Roosevelt then sought to add a new justice for every sitting justice over the age of 70, which would have increased the Court's size to 15 justices.

Conservatives fought back, wrapping the Court in the Constitution, attacking his court reorganization plan as “court packing.” In the Court fight, conservative Southern Democrats, including many who had worked behind the scenes against the New Deal like senators Tom Connally of Texas and Walter George of Georgia, along with the vice president, John Nance Garner, turned against Roosevelt. The weakened GOP let the Democrats carry the ball, but it was from this court fight that the informal conservative coalition of Southern Democrats and Republicans began to take shape.

Faced with the attack, the Court, which had four Coolidge/Hoover “Business of America is Business” conservatives, three urban liberals, and two moderate conservatives, shifted. In 1936 the Court had voted 6-3 against the New York minimum wage law. But in 1937 the Court upheld by a vote of 5 to 4 a similar Washington State minimum wage law, ruled in favor of the Wagner Act in the Jones and Laughlin Steel case, and upheld the Social Security Act and unemployment insurance. In all these rulings, Owen Roberts and Chief Justice Charles Evans Hughes changed their votes to side with Roosevelt.

By the end of 1937, as the old-guard conservatives began to retire, Roosevelt, defeated in the reorganization fight, began to replace them with New Dealers and by the time of the Pearl Harbor attack had forged a New Deal majority. The new Court moved away from the old doctrines of constitutional original intent associated with the corporate-dominated courts of the post–Civil War era toward a view that the Court must change with changing economic and social conditions. Most of all, the Court retreated from its support for business and its defense of the absolute right of freedom of contract. Instead, a law was to be "presumed constitutional" on questions concerning economic power and government regulation — constitutional regulation came to be seen, as one decision put it, as regulation for the "public good." Economic freedom was no longer the preferred freedom of the court, and economic activity was no longer treated as purely local and thus beyond regulation.

The Court also upheld minimum wages for all workers in the Fair Labor Standards Act, whereas earlier it had vetoed state minimum wage legislation even for women. It refused to apply the anti-trust laws to unions, and it outlawed the sit-down strike in 1939 (NLRB v. Fansteel Metallurgical Corp.), though in a decision that defended and established peaceful picketing.

At the same time, the Court under New Deal leadership began to develop a new doctrine of preferred freedoms, a doctrine that stressed the need to protect the rights of political dissenters and minorities. In late 1937, the Court declared unconstitutional state laws barring speech and assembly that had been used to convict and imprison Communist Party activists like Angelo Herndon in Georgia; it later explicitly defended religious freedom in the case of Jehovah's Witnesses' refusal to swear allegiance to the flag, and it revived the clear and present danger criterion to protect free speech and assembly. In 1938 the Court, for the first time since the end of Reconstruction, enforced some civil rights claims when it contended that the state of Missouri, by not supplying legal education for Black students, had violated the separate but equal doctrine of Plessy (Missouri had offered to pay part of their tuition). While the decision didn't challenge segregation, it pressured Southern states to increase educational programs under segregation for African Americans.

In the Hague case, the Court declared unconstitutional a local Jersey City ordinance against picketing and demonstrations which had been used for mass arrests — subsequently, this was defined to mean peaceful picketing. In U.S. v. Carolene Products (1938), the majority ruled that the court would no longer apply “heightened scrutiny” to economic legislation; however, in a footnote, Harlan Fiske Stone added that the Court was obligated to apply a “more exacting judicial scrutiny” in cases where laws or regulations contradicted the Bill of Rights or adversely affected minorities. The famous “footnote 4” had important implications for Bill of Rights freedoms for dissenters and minorities.

Following the recession of 1937 and the business-conservative counterattack and backlash of 1938, the New Deal was politically stalemated in Congress and without a clear program. However, by this time, the labor social welfare program was consolidated, at least for the short term. Further, the great fortress of conservative power protected from the electoral process — the Supreme Court — was overthrown.

Democratic President Harry Truman’s appointees set back the Court’s support for civil liberties, especially in the 1950–51 Eugene Dennis case, where the Court upheld the convictions and imprisonment of the leadership of the CPUSA under the 1940 Smith Act. The appointments of Earl Warren as Chief Justice and William Brennan by Republican President Dwight Eisenhower, however, greatly strengthened the Court’s progressive majority at a time when Cold War policies moved Congress and the president to the right.

In the Brown decision (1954), the Court declared school segregation unconstitutional. The Supreme Court also in the Yates and other decisions made illegal some of the worst aspects of state and federal anti-Communist policies, leading the FBI to establish its secret Cointelpro program. In the later Miranda and Gideon decisions the Court limited police power to interrogate and hold suspects without formally charging them and reading them their rights, including their right to legal representation or a court-appointed attorney to represent them. The Court also rejected early challenges to the Civil Rights Acts of 1964 and 1965. Although Richard Nixon’s election to the presidency and his appointments moved the Court in a more conservative direction over time, Court decisions in the early 1970s effectively abolished the death penalty in the U.S. and, in Roe v. Wade, legalized abortion.

Even before Ronald Reagan gained the presidency, the Nixon-influenced Court began to move to the right. In 1976, the court gave states the right to reestablish the death penalty (subsequently the death penalty would be established at the federal level in a more extensive way than at the state level). In 1980, the Supreme Court upheld a 1976 amendment to Medicaid funding which barred the use of Medicaid funds for abortions, a cruel blow to the rights of low-income and poor women.

Over the following four decades, a series of decisions chipped away at civil rights and civil liberties; weakened the regulation of commerce, industry, and finance; and removed restrictions on the use of money in elections. The Court’s conservative majority became more militantly reactionary, destroying earlier compromise decisions brokered by conservatives. Donald Trump, who gained the presidency in large part because of the deeply undemocratic nature of U.S. politics, failed to implement his far-right domestic policies, which both large numbers of Americans and people throughout the world saw as “neofascism.” However, his “success” in appointing three Supreme Court judges is now his “legacy,” in that they are doing what he failed to accomplish.

First, we must understand that a large majority of the people oppose these decisions, just as in 1857 and 1936 a large majority of the people opposed the Supreme Court’s pro-slavery Dred Scott decision and its decisions declaring New Deal regulatory and social legislation unconstitutional. The Republican Party mobilized opposition to the Dred Scott decision to win the 1858 congressional elections. More than 70 years later, the Democratic Party mobilized opposition to the conservative Court’s decisions to propel Roosevelt to an overwhelming victory in the 1936 national elections. The same kind of united opposition must be organized now. We must point out that the present Court has set the nation back and may continue to block progress regarding immediate issues such as inflation, health care, or the cost of energy and transportation. Were the government to attempt, for example, to establish price controls, create a national public health system, and expand public transportation, the Court would not be on the people’s side.

The trade union movement, all civil rights and women’s rights organizations, and all environmental organizations must mobilize supporters and communities throughout the nation to vote against the Republican senators and congresspeople who over decades have created this judiciary. Such an electoral victory is necessary but not in itself sufficient. Many today are calling for an expansion of the Court. Congress and the president have the power to do that, since the number 9 is not in the Constitution. We should begin to think about a larger expansion of the federal judiciary itself. Since the 1980s, the conservative Federalist Society has advanced the doctrine of original intent as a cover to restore Court rulings opposing federal regulation of business and social welfare legislation. A government committed to restoring what the Court had represented in the New Deal–Great Society era should actively appoint attorneys who support those positions.

Finally, the question of judicial review itself could be formally ended by Congress and the president. As was contended earlier, it is not a part of the Constitution, and there is no evidence that the Constitutional Convention intended it to be established. The Court has acted to strike down and take away from the people major social protections and rights. As such its power of judicial review can and should be taken away from it.

The Evolution of Disability Rights Movements: Great Britain and the United States

"Know your limits, but never stop trying to break them," said Kyle Maynard, a man born with congenital amputation, meaning that his arms stopped forming at his elbows and his legs stopped forming at his knees. This, however, did not stop him from becoming a motivational speaker, best-selling author, entrepreneur, award-winning extreme athlete, and the first man to crawl to the summit of Mount Kilimanjaro.[1] The rights of people with disabilities have evolved rapidly over the past hundred years, especially in the last thirty, both within the United States and in Great Britain. The two movements share similarities, but they appear to have developed independently of one another.

It is important to understand two things when discussing the history of disability rights and the discourse as a whole. The first is the two different models of disability: the medical model and the social model. The medical model contends that individuals with disabilities are broken and need to be fixed or cured, whereas the social model contends that individuals with disabilities are not broken but are instead handicapped by their environment. An example of the medical model would be certain very dangerous medical procedures such as chelation.[2] Since there is no actual "cure" for autism, such procedures can cause a great deal of damage to the individual receiving them. An example of the social model would be a building that lacks a ramp or elevator for an individual who uses a wheelchair. These two models have been used to define "disability" throughout U.S. disability history and separate the later stages of the movement.

The second is that the language used throughout this history is very offensive and outdated. While these words appear in this paper, they are not used in a harmful or demeaning way, but instead to give the reader greater context for the way individuals with disabilities have been treated throughout history.

Historically, disability history has focused on learning about the different laws and acts and how these ideas progressed. Not much time, however, has been spent looking more deeply into the larger discourse of disability rights, how that discourse progressed, and how it led to effective change. This fostered interest in whether there might be any connections or similarities between the way the disability rights movement progressed in the United States and how it developed overseas, specifically in a country relatively similar to the United States: Great Britain. More specifically, how did the disability rights movement within the United States evolve? How did the disability rights movement in Great Britain evolve over time? And were there any connections between these two movements, and if so, what are they?

There is not a large variety of literature about the history of the disability rights movements in the United States and Great Britain, as they are relatively recent movements, having developed only in the last fifty or so years. There is, however, some literature regarding theories of disability and the normalcy of disability, from writers such as Elizabeth Barnes,[3] who writes about the social model of disability, and Lennard J. Davis,[4] who writes about how the normalization of individuals with disabilities within society has brought about positive changes. These books, along with speeches by famous disability rights activists such as Judith Heumann and the Netflix documentary Crip Camp: A Disability Revolution, have all aided the research process of defining the evolution of these movements and how they compare to one another.[5]

The disability rights movement in the United States evolved much more rapidly than the corresponding movement in Great Britain, with landmark legislation such as Section 504 of the Rehabilitation Act of 1973, the Developmentally Disabled Assistance and Bill of Rights Act (DD Act) of 1975, and the Americans with Disabilities Act (ADA), among others, coming before the major landmark decisions in the United Kingdom. Both the United States and Great Britain saw their disability rights movements begin with a heavy focus on physically disabled veterans after World War II. However, the exposure of the Willowbrook institution ushered in a new wave of disability rights within the United States, one focused on individuals with neurological disabilities, before the corresponding evolution inside Great Britain. The U.S. remained ahead of Great Britain in its evolution of disability rights for the remainder of the 20th century and into the 21st century.

This paper will first dive deeper into the progression of the disability rights movement within the United States and then look toward Great Britain and its evolution. It will discuss the different waves of the American movement and how the discourse in the United States evolved during the 20th century leading up to the passage of the ADA. Next, the paper will look at the way the disability rights movement within Great Britain developed, from mainly focusing on ex-servicemen with physical disabilities to a sudden shift toward individuals with neurological and mental disabilities.

United States history is filled with many different rights movements that seemingly overlap with one another. The disability rights movement is one of these, finding its beginning in the middle part of the 20th century with a heavy focus on the veterans who had been disabled in World War II. Laws and organizations formed to help these individuals reintegrate into society after the war and began the normalization of individuals with physical disabilities, which leads to the story of the Willowbrook institution. That story shone a light on individuals with neurological disabilities, such as autism, Down syndrome, and many others. From the outcry of these individuals, and more specifically their families, the attention turned to how they could be "cured" or "fixed" and brought back into society. It was not until these individuals began to speak up on their own that there was a real shift not only in policy but in attitude and mindset within society. This continued with the discussion around independent living, which began the third and final era of disability rights within the United States, the self-advocacy era. It all culminated in the passing of the ADA in 1990, which marked a new beginning for individuals with disabilities in speaking up for themselves and in how they could be integrated into society as a whole.

One of the key areas of overlap between the United States' disability rights movement and Great Britain's is the early focus on veterans and ex-servicemen who had become physically disabled following World War II. This is where we see some of the first rhetoric of the medical model of disability. In an article published in the New York Times in 1946, Howard A. Rusk, M.D., writes, "Today public attention is focused on the young men of America who are returning from war disabled and handicapped. They number in the thousands."[6] This highlights just how much of a focus this was at the time; it was such a large part of the ongoing discussion that it merited a full-page article in the New York Times. Rusk makes it known how important it is to have systems in place that can help physically disabled veterans, who were the main group being advocated for during this time.

Some of the help that was put into place took the form of the Purple Heart Unit, also known as the Military Order of the Purple Heart, a national veterans organization. In an article about the Purple Heart Unit, the New York Times writes:

"The Military Order of the Purple Heart, nation-wide veterans' organization, will embark on a peacetime program to speed veterans housing and to provide additional benefits for disabled servicemen, it was announced yesterday by the order's new national commander Ray Dorris of Portland Ore…The housing program will take precedence over all other programs, Mr. Dorris said, adding that he would confer this week with Housing Expediter Wilson W. Wyatt and officials of the War Assets Administration in Washington on the granting of priorities on surplus equipment needed to complete the partially constructed housing projects."[7]

This was one of the key organizations at the time putting a focus on helping disabled veterans. The undertaking was so big and important that the organization worked with the U.S. government to try to get the necessary funding and supplies as quickly as possible. The housing program was its paramount objective, surpassing all others, showing just how important the rights of disabled veterans were during this time. After this, a steady stream of legislation was passed and signed to help disabled veterans, but the next evolution in the disability rights movement began in the mid-1960s.

In 1965 Senator Robert F. Kennedy made an unannounced visit to one of the biggest institutions for individuals with neurological disabilities at the time, Willowbrook, located on Staten Island in New York. While there, Senator Kennedy observed some of the most inhumane and deplorable living conditions imaginable. After his visit, Senator Kennedy testified at a committee hearing, which resulted in an investigation into the state institution. Speaking of the institution, Kennedy had this to say: "We hear a great deal these days about civil rights and civil liberties and equality of opportunity and justice … But there are no civil rights for young retarded adults when they are denied the protection of the State Education Law, which commands that all other children must receive an education."[8] This was the first time that a major public figure spoke up in regard to disability rights in the United States. It would not be the last time that Willowbrook made public news: around five years later a reporter named Geraldo Rivera aired a documentary that showed the deplorable conditions within Willowbrook. This sparked major outrage and led to a new nationwide conversation about the effectiveness of, and moral justification for, these large state institutions.

While it would take another five years, and the interview of Bernard Carabello, for state institutions to begin to change their unimaginable conditions, the 1970s marked the next evolution of the disability rights movement. The United States entered the second wave of its disability rights movement, the parent wave, in which the parents of individuals with disabilities became the advocates for their children and the emphasis was placed on helping and "fixing" individuals with disabilities. While this was a step in the right direction, it still created numerous problems and harmful stereotypes. The medical model of disability, which paints individuals with disabilities as "broken" and in need of being "fixed," was still very much prevalent during this time. This period was short, lasting only a few years, but it was a crucial step in the evolution of the disability rights movement within the United States, as it moved the spotlight closer to the individuals with disabilities themselves, which ultimately marks the final evolution of this movement.

While the independent living model was the first time that ideas such as civil rights were discussed in regard to individuals with disabilities, a few years before this there was discourse around hiring individuals with disabilities into the workforce. In an article about how the employment market for individuals with disabilities was starting to become more normalized, Howard Rusk writes, "Throughout the country, community programs for the mentally retarded have been slowly demonstrating the truth of the slogan of the National Association for Retarded Children – 'the retarded can be helped.'"[9] While Robert Kennedy's visit to Willowbrook was mainly about the individuals living there and the inhumane conditions of the state institutions, this discourse is clearly more geared toward helping adults, people who are out of these state institutions, and how they can start to become included within society at large.

This in turn created a space for many people, such as self-advocates and the families of individuals with disabilities, to begin talking about similar issues in regard to disabilities. In their article, Romel Mackelprang and Richard Salsgiver discuss the shift that occurred at the beginning of the 1970s: "The disability movement matures with the development of the independent living concept in the early 1970s. Initially led by people such as Lex Frieden, Judy Heumann, and Ed Roberts, independent living applied the minority model as the foundation of the political process of gaining the civil rights of peoples with disabilities."[10] These are just some of the prominent figures who made names for themselves as disability rights activists during the early parts of the movement. The minority model, also known as the social model of disability, is very important to modern disability rights activists, as it states that people with disabilities are not "disabled" by their bodies but by the "able-bodied" society they live in, and it provides a pivotal framework for discussing changes surrounding disability. This was the first large step into a new evolution of the disability rights movement, in which individuals with disabilities themselves are the ones advocating for change and for what is best for them.

One of the major issues seen in these institutions was forced sterilization, rooted in the public's view that individuals with disabilities were not able to contribute to society. An interesting case of this is seen outside of institutions, where the parents of three children, all with disabilities, fought to have them sterilized; however, no hospital or medical facility near them would perform the procedure, because of the massive gains that had been made in protecting people with disabilities from being forcefully sterilized. The parents took their argument to court, but, as the New York Times reported, "thwarting them, either directly or indirectly, have been the tremendous gains made by the champions of individual freedoms and rights who have won many successes in trying to protect the mentally retarded who are capable of functioning independently in society."[11] This is a clear example of how the parent wave of disability rights and the medical model of disability, which arose from the professional wave, had very negative impacts on individuals with disabilities. It also reflects the massive shifts happening during this time, as forced sterilizations, especially for those under the age of 21, were being rejected and more control and rights were being given to those with disabilities.

In the area of education, there were also discussions during the 1970s about making sure that students with disabilities were more included. In 1974 the New Jersey legislature began debating legislation that would improve the conditions of students with disabilities in schools. In an article published by the New York Times, an unnamed author writes, "But proponents of improved education for the retarded children contend that the special session also provides the lawmakers with an ideal chance to require local school districts to install programs for youngsters with severe mental handicaps."[12] Here it becomes clear just how inclusive the conversation had become. In just a few short years the narrative had switched from individuals with disabilities needing to be kept aside and isolated to a push for them to be integrated into schools, and, as this article states, the push included those with neurological disabilities, not just physical disabilities. This highlights just how hard people had to fight for more protections for students with disabilities, specifically within schools.

We continue to see the push for protections for individuals with disabilities at the state level, specifically in New Jersey. In his article titled "Disabled Children Get New State Aid," Martin Waldron writes, "This new policy is only one of several that reflect the quiet revolution underway in the state to protect the rights of New Jersey's mentally ill, handicapped and 'developmentally impaired' residents. Some of these policies reflect an almost complete change in attitudes."[13] This sudden shift in attitude came from the work of many activists, the most prominent of them being Judith Heumann, who became famous for her work in New York City, where her self-advocacy made her the first person in a wheelchair to obtain a teaching license in New York, something she had to fight very hard to get.

There were also efforts at the national level to give rights to individuals with disabilities, especially the right to education. In her article for the New York Times, Judy Glass writes about the changes arising out of the conversations about individuals with more neurological disabilities, such as learning disabilities. She writes, "Ten or 15 years ago, the term 'learning disabled' as a handicap was largely unheard of… five years ago, the learning disabled children were defined more by exclusion than by objective criteria."[14] This is another example of the rapid change that occurred in this time frame; as the movement evolved, more and more individuals became involved in it. The term disability had now crossed over into the realm of education, adding another step in ensuring the rights of those with disabilities.

There were also setbacks that accompanied the disability rights movement. One of the biggest came in April of 1981, when the Supreme Court ruled:

“that a Federal “bill of rights” for the mentally retarded enacted six years ago, did not oblige states to provide any particular level of care or training for retarded people in state institutions… In the case involving the retarded, the appeals court had ruled that the 1200 residents of Pennhurst, a state institution, were being deprived of their right to treatment.”[15]

This was a huge deal at the time, as it further restricted the rights of individuals with disabilities who were still living within these institutions. These institutions were already mistreating their residents, and a ruling that states did not have to provide any particular level of care or training would only further that mistreatment. It was an unfortunate step backwards, but it contributed to the continued push to get these institutions shut down and to get individuals with disabilities out of them and living on their own.

The representation of individuals with disabilities in public spaces went a very long way in helping the disability rights movement within the United States. It gave those with disabilities someone they could see themselves in and made them feel like they were part of society as a whole. This representation really started to take shape heading into the late 1980s and early 1990s. One prominent figure at this time was Bob Dole, the Senate Republican leader, who made it known that he was an individual with a disability, something that had rarely been discussed openly before. In an article from 1986, Dole is quoted as saying:

“I can’t do buttons like you do, just feel and push them in there… I’ve got to be able to see the hole and sort of push the button in. The trouble is these buttons on this shirt are just about a fraction too high, so it’s very hard to do that. So every day you get a little test; you’re tested.”[16]

Dole at the time dealt with several physical disabilities, the main one being the damage he had suffered to his right arm. This was one of the first times that someone this prominent and well known within the United States government advocated for himself as an individual with a disability. This is where we begin to enter the final stage of the evolution of the disability rights movement, in which these issues are now being discussed within the federal government and changes are being implemented on a national scale, whereas before changes were often made on a smaller scale, at the state level or even more locally.

As the disability rights movement enters its final stages, there is an even bigger push to normalize individuals with disabilities and integrate them into society. The Americans with Disabilities Act (ADA) was passed by Congress and signed into law in 1990. The New York Times wrote this when discussing the new act: "The act was considered by its supporters to be one of the most sweeping pieces of civil rights legislation in decades. It extended throughout private industry a prohibition against discrimination toward the disabled by government agencies and companies that receive government contracts."[17] The ADA was a monumental piece of legislation for the disability rights movement in the United States, resulting in federal mandates that made every aspect of society more accessible to those with a disability. It came after years of hard work by many people and paved the road for the legislation that would be passed in the years to come.

The Americans with Disabilities Act (ADA) marked a true turning point in the American disability rights movement, as it was one of the first acts passed that focused on helping individuals with disabilities feel part of the larger society. Steven A. Holmes, a writer for the New York Times, described the ADA in the following way:

“The public accommodations provisions of the law, the Americans with Disabilities Act, mean more than merely providing adequate parking spaces or ramps for the handicapped. Restaurants may have to provide Braille or large-type menus for the blind or visually impaired people,… space for customers with wheelchairs and ensure that their friends and family may sit with them.”[18]

These are some of the major changes that came about as a result of the ADA, and they highlight just how little people with disabilities had been seen in society, and how powerful the ADA was in shining a light on them. We see things today such as ramps, handicap parking spaces, and other inclusive infrastructure and think of them as common, as something that has always been there, but for many people it has not. This ties back into the social model of disability, showing how an individual can be disabled and handicapped because there is no ramp to help them access a building.

There were numerous people who played a large role in getting the act passed, many of whom were famous disability rights activists within the United States at the time, including a man named Justin Dart Jr. "Mr. Dart was best known as one of the primary forces behind the Americans with Disabilities Act, which was passed by Congress and signed into law by President George Bush, with Mr. Dart at his side, in 1990."[19] Justin Dart Jr. was one of the most influential disability rights activists within the United States, constantly arguing and advocating for the passage of this act. He became very well known among those serving in government during this time simply for how often he was present, speaking with people about how important this act was.

The disability rights movement within Great Britain is rooted in highlighting the conditions of the physically disabled, specifically wounded veterans. Much of the early discussion that took place within Great Britain deals with this select group of people with disabilities. It was not until more recently that the conversation shifted to be more inclusive of people with neurological disabilities, along with people with more visible physical disabilities. The movement then turned its focus to group homes, which were similar to the institutions within the United States, before leading to Great Britain's own version of the Americans with Disabilities Act, the Disability Discrimination Act of 2005.

An early example of Great Britain's focus on veterans can be seen in Parliament's discussion of a new finance bill intended to raise money in 1951, when Lieutenant-Commander Braithwaite said that "it would also adversely affect disabled ex-service men."[20] His argument was that the petrol tax included in the new finance bill would increase the cost of road transport and force more people onto public transport, which was used heavily by disabled individuals at the time. When discussing this bill, Sir Ian Fraser also said that disabled individuals should be excused from the extra petrol duty. This shows how much people were thinking about physically disabled veterans as part of society as a whole, similar to the conversation in the United States.

Disabled veterans dominated much of the early discussion of disability rights within Great Britain, as they were the most visible individuals with disabilities who were actively trying to be included in society. When discussing the post-war economic situation, Mr. King of Southampton urged "an increase in the basic rate of disability pensions for disabled ex-service men."[21] This was a good start for the conversation about disability rights in Great Britain and provided a solid foundation for the future.

The question of work and labor also came up in discussions of how best to include disabled people in the labor force. One parliamentary discussion of defense workers looked specifically at summer resorts during the winter, where a group of workers was being used to advance the "national effort." The argument that these workers could be employed to help in other aspects of the country was taken before the Ministry of Labour, and the position of disabled men specifically was raised by Mr. E. Evans, who stated that it can be difficult for them to find work at times.[22] Mr. Evans was one of the first people in Great Britain to speak out for those with physical disabilities beyond disabled veterans, advancing the disability rights movement further.

The infantilization of individuals with disabilities, the idea that they need to be helped and cared for by others even when they may be perfectly able to take care of themselves, is also present in this period. For example, the London Times published an item entitled "Debate on employment of disabled and elder persons."[23] Having people with disabilities grouped with the elderly shows how they were seen by the larger society. People with disabilities, and specifically in this case those with physical disabilities, were seen as weak and in need of someone to help them at all times, even though there were things they could still do by themselves. In this case many of them were former veterans, specifically men, so they would most likely still be in good physical condition, only needing help in the area of their handicap. This is a very early argument describing the social model of disability, in which the individuals themselves are perfectly fine and it is society and their environment that is handicapping them.

At the start of the 1960s the conversation in Great Britain began to evolve to include individuals with neurological disabilities alongside those with physical disabilities. Similar to the pattern in the United States at this time, it appears that Great Britain thought its best course of action was to place these individuals in group homes. In March of 1961 there was discussion about the construction of a new home in Bognor Regis, near the southern British coast. The issue raised was whether the project had been abandoned, because two local private schools had apparently rejected the idea, feeling that the location was too close to their own and that the home would not be able to be the proper size. When speaking on the issue, Mr. Kenneth Robinson, a member of Parliament at the time, had this to say: "Projects of this kind are constantly being frustrated by local difficulties being raised about siting. Is there nothing the Minister can do, perhaps in conjunction with the Minister of Housing and Local Government, in trying to influence local authorities to be a little more sympathetic towards this type of development."[24] This shows some of the issues that the disability rights movement in Great Britain faced in its early stages, with local opposition casting doubt on whether these group homes should be built at all.

As in the United States, individuals with neurological disabilities in Great Britain were separated from the larger part of society by being placed into these group homes, and the homes, like the American institutions, appear to have mistreated them. Unfortunately, at this time individuals with neurological disabilities were seen as "rejects" and "outcasts." This view led to further mistreatment, as they were seen as needing to be removed from society, and many horrible things were done to them, including euthanasia.

One of the most prominent figures in this discourse, in both England and the United States, was C. Killick Millard, a well-known and respected doctor who was most active between 1930 and 1955. Ian Dowbiggin writes about Millard, describing him as having "dedicated much of his life to legalizing the right to die, he was likewise motivated by the conviction that an educated, rational and mentally competent person would consent to mercy-killing if suffering from a painful, terminal illness or disability."[25] This gives insight into how individuals with disabilities were viewed as unable to be educated the same way as their non-disabled counterparts, and how having a disability was seen as making a person of no use to the larger society. This is another parallel between the disability rights movements of Great Britain and the United States: both had a period in which they looked to medical professionals for the answers.

One of the first major pieces of legislation to come from Great Britain in the realm of disability rights was the Chronically Sick and Disabled Persons Act of 1970. This act established welfare provisions for those who were disabled or suffered from chronic illness. It is an important part of the disability rights movement in Great Britain because, while there may have been support for the act, a speaker who themselves identified as disabled thought it needed to be stronger. An article in the London Times noted: "Under the Chronically Sick and Disabled Persons Act, 1970, builders had to provide access facilities where it was 'in the circumstances both practicable and reasonable.' There have been instances in the past 11 years when such facilities were not provided, mainly because nobody has enforced the law."[26] Delayed enforcement is far too common in disability law in particular, with vague and imprecise wording allowing companies and others to get away with not fully giving people with disabilities the accommodations they need. As the London Times article mentions, it took almost eleven years for something to be done about this, and it is one of the reasons why disability rights are still an active fight.

Delayed enforcement of laws and regulations can be tied back to an issue raised almost twenty years before this incident, which involved giving disabled drivers a badge that would help identify them. In March of 1961, in response to the increasing parking problems of disabled drivers, a member of Parliament named Mr. Dobbs proposed "to provide a badge to be displayed by disabled drivers to help them and to assist police in using their discretion in dealing with traffic problems."[27] It would take almost ten more years for this idea to become mainstream and be implemented in the Chronically Sick and Disabled Persons Act of 1970. The idea of these badges being used to assist police is very interesting because it speaks to a debate happening right now in the United States about how best to aid individuals with disabilities, specifically those with more "invisible disabilities," in situations such as traffic stops and their interactions with police officers. For individuals with disabilities, especially those who may have neurological disabilities, understanding social cues and following directions can often be a tough task, and unfortunately in the United States police officers can at times give conflicting directions. This can lead to individuals with disabilities being treated harmfully by police officers without fully understanding why.

We begin to see a mindset shift in Great Britain by the 1990s. One example comes from a man named Lee Duffin who, although he spent most of his life working in sales and marketing, joined a charity that helped young adults with disabilities become more self-reliant and independent. Although his main job was fundraising, he said, "I had no experiences in fund-raising or the mentally handicapped, but I was so impressed by the charity's philosophy of helping the young adults to lead a fairly independent and fulfilled life that I wanted to help."[28] This is a massive shift from just thirty years prior, when individuals with disabilities, specifically those with neurological disabilities, were seen as needing to be kept away from society in group homes. It comes in the years following the introduction of the idea of independent living by disability rights activists in the United States.

As we enter the 21st century, we see the last of the group homes, or "long-stay care homes," which were prominent in the late 1960s and early 1970s and became less common through the late 1980s and 1990s. The last of these homes in Great Britain shut down in 2004, part of the effort to help people with disabilities become fuller members of society. John Hutton, the public health minister, had this to say: "people in Britain with learning disabilities were among the most socially excluded in the country. Only one of them has a friend outside the immediate circle of their family or paid-for carers."[29] This is one of the biggest shifts and evolutions in the direction of fostering independence for those with disabilities. In 2001, there were an estimated 1.4 million people living with disabilities in Britain. Around this time, schools in Britain also received investments in communication aids for students who needed them. This is similar to the United States, where schools that receive federal funding have to provide students with the accommodations they need.

In the United States, the Americans with Disabilities Act (ADA) of 1990 provided people with disabilities the right to access society and required changes to be made to help them; it was not until the early 2000s that Great Britain enacted something similar. In 2004, new provisions of Britain's Disability Discrimination Act came into force, functioning much like the ADA. The media described them as follows:

"The most significant aspect of the new provisions is the duty of service providers to make reasonable adjustments to any physical features that are a barrier to the enjoyment of goods and services by disabled people … includes widening a doorway; providing a permanent ramp for a wheelchair user; relocating light switches, for someone who has difficulty reaching; … and providing tactile buttons in lifts."[30]

This directly connects to what the ADA did for Americans with disabilities and relates back to the social model, which contends that in order for individuals with disabilities to be included within society, changes needed to be made to the environment as well.

Another piece of legislation similar to the ADA is the Disability Discrimination Act (DDA) of 2005. This act made it illegal to discriminate against individuals with disabilities in the workplace and required the necessary accommodations to be made so that these individuals could succeed at work. As one Times article noted, "Employing disabled people can attract disabled customers."[31] This is a useful way to think about the inclusion of individuals with disabilities, not only in the workplace but in society as a whole. Seeing people who represent who you are and how you view yourself is very important in helping people feel safe in society.

In conclusion, the disability rights movements of the United States and Great Britain have some connections with one another, but it was mainly the United States setting the precedent and leading the way. Both movements have their foundations in the way society began to see and treat veterans with disabilities following World War II. The care and thought given to these veterans opened the door for disability rights activists in each country to further the conversation on disability rights. While the United States saw parents turn their focus to institutions, Great Britain began to look at group homes. In 1990 the United States passed the Americans with Disabilities Act, which provided comprehensive changes that would grant individuals with disabilities a chance to participate in society. Fifteen years later Great Britain would pass the Disability Discrimination Act, which acts similarly to the ADA. Ultimately, this shows how, even though the two movements evolved similarly over time, it is the United States that underwent each evolution before Great Britain.

The significance of this capstone paper is that it allows for the start of a discussion on the history of disability rights not only in the United States but in Great Britain as well. It is important not to study only one nation's evolution, as doing so can close you off to ideas and changes made in other nations that could be adopted in one's own country. Individuals with disabilities have been mistreated throughout history in many different parts of the world, and it is important to begin to understand how this happens and how different nations have been able to move forward and away from this awful mindset and treatment of individuals with disabilities.

This capstone paper is significant for education because it allows students to learn about a history and a movement that has rarely been discussed before. Much of the activism of the disability rights movement occurred during the late 1970s and 1980s, a period that is only now being discussed more and more in schools, particularly in secondary education. Individuals with disabilities have been treated inhumanely and as outsiders, but if we allow their story to become part of our taught history, we can work toward people accepting them for who they are. The disability rights movement also has connections to other historical events, including how disability rights activists used the tactics of other civil rights groups to help fight for their cause. There is also great opportunity to connect this topic to current events, as the movement continues today, with many disability rights activists fighting to have individuals with disabilities seen by the rest of society.

Alambritis, Stephen. “The Business View.” Times, March 4, 2008, 4[S3]. The Times Digital Archive (accessed November 21, 2022). https://link.gale.com/apps/doc/IF0503625387/TTDA?u=tconj_ca&sid=bookmark-TTDA&xid=b85a4ad4.

"An equal workforce, not forced." Times, October 17, 1985, 12. The Times Digital Archive (accessed September 25, 2022). https://link-gale-com.ezproxy.tcnj.edu/apps/doc/CS201822545/TTDA?u=tconj_ca&sid=bookmark-TTDA&xid=f91d7aa7.

Barnes, Elizabeth. The Minority Body: A Theory of Disability. Oxford: Oxford University Press, 2018.

“Benefits defeats in Lords.” Times, May 22, 1990, 7. The Times Digital Archive (accessed October 17, 2022). https://link-gale-com.ezproxy.tcnj.edu/apps/doc/IF0501825665/TTDA?u=tconj_ca&sid=bookmark-TTDA&xid=c3331e85.

Burch, Susan, and Ian Sutherland. “Who’s Not Yet Here? American Disability History.” Radical History Review 2006, no. 94 (2006): 127–47. https://doi.org/10.1215/01636545-2006-94-127.

Cragg, Stephen. “Legislation Update.” Times, September 7, 2004, 7[S1]. The Times Digital Archive (accessed November 21, 2022). https://link.gale.com/apps/doc/IF0502698209/TTDA?u=tconj_ca&sid=bookmark-TTDA&xid=ec9de613.

Cooper, Jeremy. Law, Rights, and Disability. London: Jessica Kingsley Publishers, 2003.

Davis, Lennard J. Enforcing Normalcy: Disability, Deafness, and the Body. London: Verso, 1995.

Dearlove, Desmond. “A fight for the right to work.” Times, September 10, 1992, 19[S]. The Times Digital Archive (accessed September 25, 2022). https://link-gale-com.ezproxy.tcnj.edu/apps/doc/IF0503341113/TTDA?u=tconj_ca&sid=bookmark-TTDA&xid=36a91163.

Diane Henry, Special to The New York Times. 1977. “Parents of 3 Retarded Girls Fight Hospital Refusal to Sterilize them: Parents Press Bid to Sterilize Retarded Girls.” New York Times (1923-), Oct 02, 1. https://ezproxy.tcnj.edu/login?url=https://www.proquest.com/historical-newspapers/parents-3-retarded-girls-fight-hospital-refusal/docview/123174722/se-2.

“Diary Of Next Week’s Events.” Times, July 8, 1961, 11. The Times Digital Archive (accessed November 21, 2022). https://link.gale.com/apps/doc/CS184901864/TTDA?u=tconj_ca&sid=bookmark-TTDA&xid=5718e179.

Dowbiggin, Ian. “‘A Prey on Normal People’: C. Killick Millard and the Euthanasia Movement in Great Britain, 1930-55.” Journal of Contemporary History 36, no. 1 (2001): 59–85. http://www.jstor.org/stable/261131.

Evans, Richard. “Law will ensure access for disabled in new buildings.” Times, June 2, 1981, 3. The Times Digital Archive (accessed October 17, 2022). https://link-gale-com.ezproxy.tcnj.edu/apps/doc/CS50694338/TTDA?u=tconj_ca&sid=bookmark-TTDA&xid=7a585a9f.

Frean, Alexandra. “Care homes for the mentally disabled to shut.” Times, March 21, 2001, 4. The Times Digital Archive (accessed November 21, 2022). https://link.gale.com/apps/doc/IF0502655359/TTDA?u=tconj_ca&sid=bookmark-TTDA&xid=caf8b513.

From Our Correspondent. “Assisting The Disabled.” Times, December 4, 1951, 5. The Times Digital Archive (accessed November 21, 2022). https://link.gale.com/apps/doc/CS85675396/TTDA?u=tconj_ca&sid=bookmark-TTDA&xid=8da0813e.

Heumann, Judith. Being Heumann. S.l.: W. H. Allen, 2021.

Hobson, Rodney. “Working at a different pace.” Times, July 31, 1990, 17. The Times Digital Archive (accessed November 21, 2022). https://link.gale.com/apps/doc/IF0503253499/TTDA?u=tconj_ca&sid=bookmark-TTDA&xid=476c1f07.

"House Of Commons." Times, June 6, 1951, 4. The Times Digital Archive (accessed November 21, 2022). https://link.gale.com/apps/doc/CS67456198/TTDA?u=tconj_ca&sid=bookmark-TTDA&xid=f90b307e.

"House Of Commons." Times, November 7, 1951, 7. The Times Digital Archive (accessed November 21, 2022). https://link.gale.com/apps/doc/CS117788007/TTDA?u=tconj_ca&sid=bookmark-TTDA&xid=20fbf67.

Howard A. Rusk, M.D., Formerly Chief, Convalescent Services Division, Office of the Air Surgeon. 1946. "Hope for our Disabled Millions: They can be Rehabilitated, Says a Physician, if we Apply Methods used in Restoring Handicapped Veterans." New York Times (1923-), Jan 27, 1946. https://ezproxy.tcnj.edu/login?url=https://www.proquest.com/historical-newspapers/hope-our-disabled-millions/docview/107574818/se-2.

Howard A. Rusk, M.D. 1964. "Hiring the Retarded: '63 Marked Employment Turning Point for Mentally Handicapped in the U.S." New York Times (1923-), Jan 06, 121. https://ezproxy.tcnj.edu/login?url=https://www.proquest.com/historical-newspapers/hiring-retarded/docview/115529793/se-2.

John Sibley. 1965. "Kennedy Charges Neglect in State Care of Retarded." New York Times (1923-), Sep 10, 1. https://ezproxy.tcnj.edu/login?url=https://www.proquest.com/historical-newspapers/kennedy-charges-neglect-state-care-retarded/docview/116840893/se-2.

Jonathan Fuerbringer, Special to The New York Times. 1986. "To Dole, it was an Education to Get Past Disability." New York Times (1923-), Jun 16, 1. https://ezproxy.tcnj.edu/login?url=https://www.proquest.com/historical-newspapers/dole-was-education-get-past-disability/docview/110931546/se-2.

Judy Glass. 1980. "New Efforts to Assist 'Learning Disabled' Debated Across L.I." New York Times (1923-), Nov 23, 4. https://ezproxy.tcnj.edu/login?url=https://www.proquest.com/historical-newspapers/new-efforts-assist-learning-disabled-debated/docview/121268082/se-2.

"Legal Appointments." Times, May 7, 1985, 29. The Times Digital Archive (accessed September 25, 2022). https://link-gale-com.ezproxy.tcnj.edu/apps/doc/CS486772903/TTDA?u=tconj_ca&sid=bookmark-TTDA&xid=28b1fa67.

Linda Greenhouse, Special to The New York Times. 1981. "Justices Restrict A 'Bill of Rights' for the Retarded: High Court Calls U.S. Law Only Advisory for States." New York Times (1923-), Apr 21, 2. https://ezproxy.tcnj.edu/login?url=https://www.proquest.com/historical-newspapers/justices-restrict-bill-rights-retarded/docview/121615394/se-2.

Mackelprang, Romel W., and Richard O. Salsgiver. "People with Disabilities and Social Work: Historical and Contemporary Issues." Social Work 41, no. 1 (1996): 7-14. https://ezproxy.tcnj.edu/login?url=https://www.proquest.com/scholarly-journals/people-with-disabilities-social-work-historical/docview/215272364/se-2.

Martin Waldron. 1978. "Disabled Children Get New State Aid: Disabled Children are Getting New Help from the State." New York Times (1923-), Mar 05, 3. https://ezproxy.tcnj.edu/login?url=https://www.proquest.com/historical-newspapers/disabled-children-get-new-state-aid/docview/123790595/se-2.

Noyes, Hugh. "Disabled peers put aid plea." Times, April 10, 1970, 1. The Times Digital Archive (accessed October 17, 2022). https://link-gale-com.ezproxy.tcnj.edu/apps/doc/CS17134218/TTDA?u=tconj_ca&sid=bookmark-TTDA&xid=8a9252f9.

"Parliament." Times, August 3, 1951, 3. The Times Digital Archive (accessed November 21, 2022). https://link.gale.com/apps/doc/CS50547971/TTDA?u=tconj_ca&sid=bookmark-TTDA&xid=92465b0b.

"Purple Heart Unit To Act On Housing: Order Back Speed-Up Of U.S. Efforts To Aid Veterans–Also To Help Disabled Men." 1946. New York Times (1923-), Sep 08, 40. https://ezproxy.tcnj.edu/login?url=https://www.proquest.com/historical-newspapers/purple-heart-unit-act-on-housing/docview/107403078/se-2.

Special to The New York Times. 1974. "Improved Education Urged for Retarded: Disparities seen Resulting." New York Times (1923-), Jun 23, 78. https://ezproxy.tcnj.edu/login?url=https://www.proquest.com/historical-newspapers/improved-education-urged-retarded/docview/120059739/se-2.

Steven A. Holmes. "Sweeping U.S. Law To Help Disabled Goes Into Effect: Gains Seen For Millions." New York Times (1923-), Jan 27, 1992. https://ezproxy.tcnj.edu/login?url=https://www.proquest.com/historical-newspapers/sweeping-u-s-law-help-disabled-goes-into-effect/docview/109037130/se-2.

Stevenson, Richard W. "Justin Dart Jr., 71, Advocate for Rights of Disabled People." New York Times (1923-), Jun 24, 2002. https://ezproxy.tcnj.edu/login?url=https://www.proquest.com/historical-newspapers/justin-dart-jr-71-advocate-rights-disabled-people/docview/92295369/se-2.

"Subsidy Rate In Airport Charges." Times, March 7, 1961, 4. The Times Digital Archive (accessed November 21, 2022). https://link.gale.com/apps/doc/CS67723367/TTDA?u=tconj_ca&sid=bookmark-TTDA&xid=7df135cc.

Walker, Alan, and Peter Townsend. Disability in Britain: A Manifesto of Rights. Oxford: Martin Robertson, 1981.


[1] GDA Podcasts, April 26, 2017.

[2] NHS, "Treatments That Are Not Recommended for Autism," NHS Choices (NHS, December 16, 2022).

[3] Elizabeth Barnes, The Minority Body: A Theory of Disability (Oxford, United Kingdom: Oxford University Press, 2018).

[4] Lennard J. Davis, Enforcing Normalcy: Disability, Deafness, and the Body (London: Verso, 1995).

[5] Crip Camp: A Disability Revolution, Netflix, 2020.

[6] Howard A Rusk, “Hope for our Disabled Millions”, New York Times, January 27th, 1946.

[7] “Purple Heart Unit To Act On Housing: Order Back Speed-Up Of U.S. Efforts To Aid Veterans–Also To Help Disabled Men.”, 1946, New York Times.

[8] John Sibley, "Kennedy Charges Neglect in State Care of Retarded", September 10, 1965, New York Times.

[9] Howard A. Rusk, "Hiring the Retarded", January 6, 1964, New York Times.

[10] Romel W. Mackelprang and Richard O. Salsgiver, "People with Disabilities and Social Work: Historical and Contemporary Issues", 1996, Social Work.

[11] Diane Henry, “Parents of 3 Retarded Girls Fight Hospital Refusal to Sterilize Them”, October 2, 1977, New York Times.

[12] "Improved Education Urged for Retarded", June 23, 1974, New York Times.

[13] Martin Waldron, “Disabled Children Get New State Aid”, March 5, 1978, New York Times.

[14] Judy Glass, “New Efforts to Assist ‘Learning Disabled’ Debated Across L.I.” November 23, 1980, New York Times.

[15] Linda Greenhouse, "Justices Restrict A 'Bill of Rights' For the Retarded", April 21, 1981, New York Times.

[16] Jonathan Fuerbringer, “To Dole, It Was An Education to Get Past Disability”, June 16, 1986, New York Times.

[17] Richard W. Stevenson, "Justin Dart Jr., 71, Advocate for Rights of Disabled People", New York Times, June 24, 2002.

[18] Steven A. Holmes, "Sweeping U.S. Law To Help Disabled Goes Into Effect", January 27, 1992, New York Times.

[19] Richard W. Stevenson, "Justin Dart Jr., 71, Advocate for Rights of Disabled People", New York Times, June 24, 2002.

[20] "House of Commons", Times, June 6, 1951, The Times Digital Archive.

[21] "House of Commons", Times, November 7, 1951, The Times Digital Archive.

[22] “Parliament”, Times, August 3, 1951, The Times Digital Archive.

[23] “Diary Of Next Week’s Events”, Times, July 8, 1961, The Times Digital Archive.

[24] "Subsidy Rate In Airport Charges", Times, March 7, 1961, The Times Digital Archive.

[25] Ian Dowbiggin, "A Prey on Normal People", Journal of Contemporary History (2001), 65.

[26] Richard Evans, “Law will ensure access for disabled in new buildings”, Times, June 2, 1981, The Times Digital Archive.

[27] “Launchers For Research in Space”, Times, March 14, 1961, The Times Digital Archive.

[28] Rodney Hobson, “Working at a different pace”, Times, July 31, 1990, The Times Digital Archive.

[29] Alexandra Frean, “Care homes for the mentally disabled to shut”, Times, March 21, 2001, The Times Digital Archive.

[30] Stephen Cragg, "Legislation Update", Times, September 7, 2004, The Times Digital Archive.

[31] Stephen Alambritis, “The Business View”, Times, March 4, 2008, The Times Digital Archive.

NCSS Response to the AP African American Course Controversy

Official statement of the National Council for the Social Studies:

NCSS recognizes that states and districts have the right to approve or not approve individual courses and, in so doing, have a responsibility to use a transparent evaluation process that includes educators and other experts in the field. When courses, especially those that were created and supported by some of the United States’ most esteemed scholars and organizations, appear to have been rejected without a transparent process, all educators and community members should be concerned and have the right to request more information on the process used.

Of equal concern to NCSS is that the current political climate might negatively impact the great work that is being done throughout the United States to diversify curricula, use culturally responsive resources, and build content and pedagogical knowledge so that educators might better create lessons and other opportunities to address a longstanding marginalization of Black histories in the American education system. The NCSS previously addressed concerns about “divisive concepts” laws that seek “to ban the teaching of such concepts as race, racism, white supremacy, equity, justice, and social-emotional learning, as well as to limit the teaching of content such as slavery, Black history, women’s suffrage, and civil rights.”

NCSS supports the teaching of Black histories in a manner that engages students in learning about the achievements, joy, perseverance, agency, and resilience of Black Americans. An attempt to block courses that fully portray the Black experience, such as the AP African American Studies course, places professional judgment boundaries on teachers’ freedom to teach  and denies students the right to learn rich, complex histories that allow for multiple perspectives and deep exploration of the successes and struggles in our collective history across cultures. Every student has the right to learn about Black histories and the Black experience, and every teacher has the right to teach Black histories and the Black experience without the fear of intimidation and retaliation.

NCSS continues to advocate for the inclusion of Black histories and contemporary issues across K-12 curricula and calls on all education officials to provide students with the right to learn about, and from, the experiences of Black Americans. NCSS strongly believes in the educational value of offering diverse learning experiences in schools. We believe all students deserve the opportunity to learn African American studies and should have access to courses that support their pursuit of higher education and the study of African American history and culture in all education settings and throughout life.