Ask HISTORY: What was the first capital of the United States?



Our History: When New York was the U.S. capital

Unusual as it may seem, New York City's role as our nation's first capital is rarely more than touched upon by historians today. New York, however, served as our nation's capital for five years from 1785 to 1790, and played a vital role in the formation of our new government after the British were defeated and the Treaty of Paris ended the war.

New York City served as the seat of our nation's legislature and as de facto “capital” from the time our government was operating under the Articles of Confederation in 1784. Late that year the Continental Congress, in which authority was vested at the time, voted to designate New York City as its meeting place until a “federal district” on the banks of the Delaware River near Philadelphia could be completed. Congress designated a commission to plan the district and then moved to New York City. Old City Hall, which later became Federal Hall, was redesigned by Pierre Charles L'Enfant to make it suitable to house the Congress. Congress held its first session there on Jan. 11, 1785.

Soon after the Constitution was considered ratified (that is, once nine states had done so), during the tenure of Cyrus Griffin of Virginia as president of Congress, New York City was officially chosen as the temporary seat of the new government that was soon to be formed. In January 1789, Congress issued directives detailing the choice of presidential electors, who would cast their votes for president on Feb. 4, 1789.

The first session of the new Congress was set for March 4, 1789, and convened in the city that day, but with only eight senators and 13 representatives present (many delegates were still en route), a quorum could not be achieved. Business began when 30 of the 59 members were present. Pennsylvania's Frederick Augustus Muhlenberg was chosen speaker. He was the great uncle of William Augustus Muhlenberg, who played so prominent a role in the religious and educational life of early Flushing and College Point.

George Washington arrived in the city on April 30, 1789, and was inaugurated as our first president on the following day. The first act of Congress in New York City was to mandate the procedure for administering oaths of public office. By July, Congress began to organize the executive departments. The first was the Department of Foreign Affairs, soon renamed “The State Department.” Thomas Jefferson was appointed secretary of state, but since he had not yet returned from France, John Jay handled the affairs of state until his return.

In August, the War Department, under Henry Knox, was established and early in September Alexander Hamilton became secretary of the treasury. Within a few months, the attorney-general, chief justice of the Supreme Court, and the postmaster general were named. In September 1789, Congress established a 1,000-man army composed of one regiment of eight infantry companies, and one battalion of four artillery companies.

George Washington drew many famous people to the city: statesmen, heroes, and celebrities, including Attorney-General Edmund Randolph, Congressman James Madison and Senator John Hancock.

It was Alexander Hamilton who, it appears, can be credited with working out the compromise that would eventually establish our federal capital on the Potomac instead of in Philadelphia. On July 10, 1790, the House of Representatives voted to locate the planned national capital on a 10-mile-square site along the Potomac River, with the proviso that the exact place be chosen by President Washington. Philadelphia was designated the temporary capital, and Congress moved there in December of 1790.

President Washington personally selected the site for the proposed federal district and the Capitol on the Potomac. The construction of the White House began in 1792 and work on the Capitol in 1793.

In the very beginning, there had been talk of turning lower Manhattan into a federal district, and plans were made to build the presidential mansion on Governor's Island. But the selection of a permanent site for our national capital had always been a topic that evoked controversy. Correspondence of the day suggests a contest between the commercially oriented city of New York and the far more conservative rural and agricultural population elsewhere. There was also an apparent fear that New York's populace, especially its so-called “society,” had aristocratic leanings. English fashions and luxuries were in favor, as were more “courtly” styles of entertaining. The Boston Gazette described the city as a “vortex of folly and dissipation,” and even President Washington was criticized for his elegant cream-colored coach and lavish entertaining, as well as the “exorbitant” rent ($2,500 per annum) for the executive mansion.

Added to this was the notoriety of the city as a “cauldron of financial speculation.” In the end, this notoriety appears to have contributed to the compromise that moved Congress out of our city.

Though it was the Federalists' aim to keep New York City as the nation's capital, in the end it was Hamilton's financial program that prevailed. Hamilton, speaking to Senator Rufus King about his financial plan for the country's debt, said that it “was the primary object, all subordinate points which oppose it must be sacrificed.” Rufus King listened, and in July Congress voted to build its permanent capital on the banks of the Potomac.

Congress met for the last time in Federal Hall in New York City on Aug. 12, 1790. Eighteen days later, President George Washington stepped onto a barge moored at Macomb's wharf and departed for the temporary capital in Philadelphia.

However, New York City had a destiny to fulfill. No longer a capital city, it became through the ensuing years the city of capital with the promise of becoming, indeed, the first city of the United States.


Port of Honolulu, September 19, 1820.

John Coffin Jones, Jr. (also known in Hawaiian documents as John Aluli) was appointed Agent for Commerce and Seamen on September 19, 1820. He began to serve in October of 1820 at the port of Honolulu. The post of commercial agent was raised to consul effective July 5, 1844, and held by Peter A. Brinsmade, who had already been appointed commercial agent on April 13, 1838.

Earliest Date: August 31, 1852. Latest Date: August 12, 1898 (date of annexation).

Honolulu.

Earliest Date: October 1820. Latest Date: June 19, 1863 (seems to indicate the elevation of the Consulate to Legation; consular functions continued, with the last extant consular officer date of June 1, 1897).

Kahului.

Earliest Date: August 20, 1880. Latest Date: August 12, 1898. Kahului is located on the island of Maui.

Lahaina (Rahaina).

Earliest Date: April 22, 1850 (confirmation of appointment of consul). Latest Date: April 9, 1869 (confirmation of appointment of consul). Before 1845, Lahaina was the capital of Hawaii. Lahaina is located on the island of Maui.

Mahukona.

Earliest Date: September 15, 1882. Latest Date: August 12, 1898 (date of annexation). Mahukona is located on the island of Hawaii.


The Gilded Age

Mark Twain dubbed the decades after the Civil War the "Gilded Age." It was a period dominated by political scandal and the "Robber Barons," the growth of railroads, the commercialization of oil and electricity, and the development of America's first giant corporations, national and even international in scope.

Corporations took off in the United States during this time in part because they were simple to form: most states allowed free incorporation and required only a simple registration.

In the 21st century, there are fees associated with forming a corporation, unlike during the Gilded Age.

Some rich corporations soon became rent-seekers, reinforcing Henry Clay's idea of state-assisted industrialization. Historian Charles A. Beard wrote that government gifts tended to go to the largest investments. Ironically, the two biggest names in American corporate history, John Rockefeller and Andrew Carnegie, were noteworthy for fighting against government favors and subsidized competitors.

Americans' opinions of corporations sank after the Stock Market Crash of 1929. In the public mind, Big Business, especially the financial sector, seemed to be to blame for the onset of the Great Depression. Reinforcing this sentiment was "The Modern Corporation and Private Property," published in 1932, in which Adolf Berle and Gardiner Means argued that those who legally own public companies (that is, the shareholders) had become separated from control of them, leaving management and the directors free to manipulate the resources of companies to their own advantage without effective scrutiny.



People have lived in the land of Ohio for thousands of years. Early cultures were the Mound Building Cultures such as the Hopewell and the Adena peoples. These peoples disappeared around 1000 AD and were replaced by new cultures including the Fort Ancient people and the Whittlesey.


Ohio Welcome Sign by ErgoSum88

In the 1600s the Iroquois moved into the area to hunt beaver for furs, and many of the existing tribes were pushed out of the region. Diseases brought by Europeans, however, wiped out many of the Iroquois. They were later replaced by tribes from the east such as the Delaware, the Shawnee, and the Miami.

The first European to arrive in Ohio was French explorer Robert de La Salle in 1669. He claimed the land for the French. Soon the French had established trading posts in order to capitalize on the valuable fur trade in the region. They built several forts including Fort Miami in 1680 and Fort Sandusky in 1750.

In the early 1700s, British colonists from the east coast began to move into the area. They were looking for new land to settle and wanted a part of the fur trade. Soon the British and the French were competing for the fur trade, which eventually led to war.

French and Indian War

The war between the French and the British lasted from 1754 to 1763 and is called the French and Indian War. Different Native American tribes allied with different sides of the war. The Ohio region was the site of many battles and much bloodshed. George Washington fought on the side of the British in the contest for the Ohio Country, including at the Battle of Fort Necessity (in present-day southwestern Pennsylvania). The British eventually won the war and took over the Ohio region in 1763.

When the Revolutionary War ended in 1783, Ohio became part of the United States. A few years later, in 1787, the United States created the Northwest Territory. This territory was a large area of frontier land that included such future states as Ohio, Illinois, Indiana, Michigan, Wisconsin, and part of Minnesota.

In 1788, General Rufus Putnam led a number of settlers into Ohio and established Marietta as the first permanent settlement. Soon, many more settlers from the United States moved into the land. The population grew until, in 1803, Ohio was admitted into the Union as the 17th state. The first capital was in Chillicothe. In 1816, Columbus became the permanent capital.


Ohio Farm by tpsdave

Much of the early 1800s was marked by battles and wars in Ohio. First, there was a rebellion among the Native Americans led by Shawnee chief Tecumseh. He believed that the land had been taken unfairly from his people. Soon after Tecumseh's forces were defeated, Ohio became the battleground for some of the fighting with the British in the War of 1812.

Ohio fought on the side of the Union during the Civil War. It was a "free state" that had outlawed slavery. Many slaves had escaped to Ohio through the Underground Railroad prior to the start of the war. Although few battles occurred in the state, many Ohio men fought for the Union army during the war. Some of the Union's most senior military leaders, such as Generals Ulysses S. Grant and William Tecumseh Sherman, were from Ohio.

Over the years, seven presidents of the United States have been born in Ohio. This is second only to Virginia. The presidents born in Ohio include James Garfield, Ulysses S. Grant, Warren G. Harding, Benjamin Harrison, Rutherford B. Hayes, William McKinley, and William Howard Taft.


Cleveland, Ohio by Lovleet


History of the Death Penalty

The death penalty has existed in the United States since colonial times. Its history is intertwined with slavery, segregation, and social reform movements.

There are excellent sources available for those interested in the history of capital punishment. The following pages contain a brief summary of that history, with an emphasis on developments in the United States.

This chart chronicles the United States' use of the death penalty over the past four centuries. It highlights the gradual rise in the use of capital punishment in the seventeenth, eighteenth, and nineteenth centuries, a peak of executions in the early 20th century, a moratorium, and then the resumption of executions after the moratorium.

The use of the death penalty has declined sharply in the United States over the past 25 years. New death sentences have fallen more than 85% since peaking at more than 300 death sentences per year in the mid 1990s. Executions have declined by 75% since peaking at 98 in 1999.

For a timeline of significant events in the history of the death penalty in the United States, see DPIC’s Death Penalty Timeline. For dynamic visualizations and more information on executions and new death sentences in the modern era of capital punishment, see DPIC’s Executions and Sentencing Data pages.



Investors have been acquiring businesses and making minority investments in privately held companies since the dawn of the industrial revolution. Merchant bankers in London and Paris financed industrial concerns in the 1850s, most notably Crédit Mobilier, founded in 1854 by Jacob and Isaac Pereire, who, together with New York-based Jay Cooke, financed the United States Transcontinental Railroad.

Later, J. Pierpont Morgan's J.P. Morgan & Co. would finance railroads and other industrial companies throughout the United States. In certain respects, J. Pierpont Morgan's 1901 acquisition of Carnegie Steel Company from Andrew Carnegie and Henry Phipps for $480 million represents the first true major buyout as they are thought of today.

Due to structural restrictions imposed on American banks under the Glass–Steagall Act and other regulations in the 1930s, there was no private merchant banking industry in the United States, a situation that was quite exceptional in developed nations. As late as the 1980s, Lester Thurow, a noted economist, decried the inability of the financial regulation framework in the United States to support merchant banks. US investment banks were confined primarily to advisory businesses, handling mergers and acquisitions transactions and placements of equity and debt securities. Investment banks would later enter the space, however long after independent firms had become well established.

With few exceptions, private equity in the first half of the 20th century was the domain of wealthy individuals and families. The Vanderbilts, Whitneys, Rockefellers and Warburgs were notable investors in private companies in the first half of the century. In 1938, Laurance S. Rockefeller helped finance the creation of both Eastern Air Lines and Douglas Aircraft and the Rockefeller family had vast holdings in a variety of companies. Eric M. Warburg founded E.M. Warburg & Co. in 1938, which would ultimately become Warburg Pincus, with investments in both leveraged buyouts and venture capital.

It was not until after World War II that what are considered today to be true private equity investments began to emerge, marked by the founding of the first two venture capital firms in 1946: American Research and Development Corporation (ARDC) and J.H. Whitney & Company. [1]

ARDC was founded by Georges Doriot, the "father of venture capitalism" [2] (founder of INSEAD and former dean of Harvard Business School), with Ralph Flanders and Karl Compton (former president of MIT), to encourage private sector investments in businesses run by soldiers who were returning from World War II. ARDC's significance was primarily that it was the first institutional private equity investment firm that raised capital from sources other than wealthy families although it had several notable investment successes as well. [3] ARDC is credited with the first major venture capital success story when its 1957 investment of $70,000 in Digital Equipment Corporation (DEC) would be valued at over $35.5 million after the company's initial public offering in 1968 (representing a return of over 500 times on its investment and an annualized rate of return of 101%). [4] Former employees of ARDC went on to found several prominent venture capital firms including Greylock Partners (founded in 1965 by Charlie Waite and Bill Elfers) and Morgan, Holland Ventures, the predecessor of Flagship Ventures (founded in 1982 by James Morgan). [5] ARDC continued investing until 1971 with the retirement of Doriot. In 1972, Doriot merged ARDC with Textron after having invested in over 150 companies.

J.H. Whitney & Company was founded by John Hay Whitney and his partner Benno Schmidt. Whitney had been investing since the 1930s, founding Pioneer Pictures in 1933 and acquiring a 15% interest in Technicolor Corporation with his cousin Cornelius Vanderbilt Whitney. By far, Whitney's most famous investment was in Florida Foods Corporation. The company, having developed an innovative method for delivering nutrition to American soldiers, later came to be known as Minute Maid orange juice and was sold to The Coca-Cola Company in 1960. J.H. Whitney & Company continues to make investments in leveraged buyout transactions and raised $750 million for its sixth institutional private equity fund in 2005.

Before World War II, venture capital investments (originally known as "development capital") were primarily the domain of wealthy individuals and families. One of the first steps toward a professionally managed venture capital industry was the passage of the Small Business Investment Act of 1958. The 1958 Act officially allowed the U.S. Small Business Administration (SBA) to license private "Small Business Investment Companies" (SBICs) to help finance and manage small entrepreneurial businesses in the United States. Passage of the Act addressed concerns raised in a Federal Reserve Board report to Congress that concluded that a major gap existed in the capital markets for long-term funding for growth-oriented small businesses. It was thought that fostering entrepreneurial companies would spur technological advances to compete against the Soviet Union. Facilitating the flow of capital through the economy to pioneering small concerns in order to stimulate the U.S. economy was, and still is, the main goal of the SBIC program. [6] The 1958 Act provided venture capital firms structured either as SBICs or Minority Enterprise Small Business Investment Companies (MESBICs) access to federal funds, which could be leveraged at a ratio of up to 4:1 against privately raised investment funds. The success of the Small Business Administration's efforts is viewed primarily in terms of the pool of professional private equity investors that the program developed, since the rigid regulatory limitations imposed by the program minimized the role of the SBICs themselves. In 2005, the SBA significantly reduced its SBIC program, though SBICs continue to make private equity investments.
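To make the 4:1 leverage ratio concrete, here is a minimal sketch; the dollar figures are hypothetical illustrations, not amounts drawn from the Act itself:

```python
# Hypothetical illustration of the SBIC leverage ratio described above.
# The 4:1 ratio means an SBIC could access up to four dollars of federally
# backed funds for every dollar of privately raised capital.

def sbic_capital(private_capital: float, leverage_ratio: float = 4.0) -> dict:
    """Return the maximum federal leverage and total investable capital."""
    federal_leverage = private_capital * leverage_ratio
    return {
        "private_capital": private_capital,
        "max_federal_leverage": federal_leverage,
        "total_investable": private_capital + federal_leverage,
    }

# Example: a fund that raises $5 million privately could, at the 4:1 cap,
# draw up to $20 million in federal funds for $25 million in total capital.
print(sbic_capital(5_000_000))
```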

The real growth in private equity surged in the 1984 to 1991 period, when institutional investors, e.g. pension plans, foundations and endowment funds such as the Shell Pension Plan, the Oregon State Pension Plan, the Ford Foundation and the Harvard Endowment Fund, started investing a small part of their trillion-dollar portfolios in private investments, particularly venture capital and leveraged buyout funds.

During the 1960s and 1970s, venture capital firms focused their investment activity primarily on starting and expanding companies. More often than not, these companies were exploiting breakthroughs in electronic, medical or data-processing technology. As a result, venture capital came to be almost synonymous with technology finance.

It is commonly noted that the first venture-backed startup was Fairchild Semiconductor (which produced the first commercially practicable integrated circuit), funded in late 1957 by a loan from Sherman Fairchild's Fairchild Camera with the help of Arthur Rock, an early venture capitalist with the firm of Hayden Stone in New York (which received 20% of the equity of the newly formed company). Another early VC firm was Venrock Associates. [7] Venrock was founded in 1969 by Laurance S. Rockefeller, the fourth of John D. Rockefeller's six children, as a way to allow other Rockefeller children to develop exposure to venture capital investments.

It was also in the 1960s that the common form of private equity fund, still in use today, emerged. Private equity firms organized limited partnerships to hold investments in which the investment professionals served as general partner and the investors, who were passive limited partners, put up the capital. The compensation structure, still in use today, also emerged with limited partners paying an annual management fee of 1–2% and a carried interest typically representing up to 20% of the profits of the partnership.
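As a rough illustration of the fee structure described above, the sketch below walks through the economics of a hypothetical fund; the fund size and proceeds are invented for illustration:

```python
# Hypothetical "management fee plus carried interest" economics for a
# private equity limited partnership, as described in the text above.

def partnership_economics(committed_capital, total_proceeds,
                          mgmt_fee_rate=0.02, carry_rate=0.20):
    """Split fund profits between the limited partners and the general partner."""
    annual_mgmt_fee = committed_capital * mgmt_fee_rate   # paid each year by LPs
    profit = max(total_proceeds - committed_capital, 0)   # gain above committed capital
    carried_interest = profit * carry_rate                # GP's share of the profit
    lp_profit = profit - carried_interest                 # LPs keep the remainder
    return annual_mgmt_fee, carried_interest, lp_profit

# Example: a $100 million fund that ultimately returns $200 million.
fee, carry, lp = partnership_economics(100e6, 200e6)
print(f"Annual management fee: ${fee:,.0f}")    # $2,000,000 per year at a 2% fee
print(f"Carried interest:      ${carry:,.0f}")  # $20,000,000 at a 20% carry
print(f"LP profit:             ${lp:,.0f}")     # $80,000,000
```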

An early West Coast venture capital company was Draper and Johnson Investment Company, formed in 1962 [8] by William Henry Draper III and Franklin P. Johnson, Jr. In 1964 Bill Draper and Paul Wythes founded Sutter Hill Ventures, and Pitch Johnson formed Asset Management Company.

The growth of the venture capital industry was fueled by the emergence of the independent investment firms on Sand Hill Road, beginning with Kleiner, Perkins, Caufield & Byers and Sequoia Capital in 1972. Located in Menlo Park, CA, Kleiner Perkins, Sequoia and later venture capital firms would have access to the burgeoning technology industries in the area. By the early 1970s, there were many semiconductor companies based in the Santa Clara Valley as well as early computer firms using their devices and programming and service companies. [9] Throughout the 1970s, a group of private equity firms, focused primarily on venture capital investments, would be founded that would become the model for later leveraged buyout and venture capital investment firms. In 1973, with the number of new venture capital firms increasing, leading venture capitalists formed the National Venture Capital Association (NVCA). The NVCA was to serve as the industry trade group for the venture capital industry. [10] Venture capital firms suffered a temporary downturn in 1974, when the stock market crashed and investors were naturally wary of this new kind of investment fund. It was not until 1978 that venture capital experienced its first major fundraising year, as the industry raised approximately $750 million. During this period, the number of venture firms also increased. Among the firms founded in this period, in addition to Kleiner Perkins and Sequoia, that continue to invest actively are AEA Investors, TA Associates, Mayfield Fund, Apax Partners, New Enterprise Associates, Oak Investment Partners and Sevin Rosen Funds.

Venture capital played an instrumental role in developing many of the major technology companies of the 1980s. Some of the most notable venture capital investments were made in firms that include: Tandem Computers, Genentech, Apple Inc., Electronic Arts, Compaq, Federal Express and LSI Corporation.

McLean Industries and public holding companies

Although not strictly private equity, and certainly not labeled so at the time, the first leveraged buyout may have been the purchase by Malcolm McLean's McLean Industries, Inc. of Pan-Atlantic Steamship Company in January 1955 and Waterman Steamship Corporation in May 1955. [11] Under the terms of the transactions, McLean borrowed $42 million and raised an additional $7 million through an issue of preferred stock. When the deal closed, $20 million of Waterman cash and assets were used to retire $20 million of the loan debt. The newly elected board of Waterman then voted to pay an immediate dividend of $25 million to McLean Industries. [12]
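A rough arithmetic sketch of the financing described above (simplified, and ignoring how the funds were split across the two shipping companies) shows how the target's own resources helped service the acquisition debt:

```python
# Simplified sketch of the McLean/Waterman financing described in the text.
# The dollar figures come from the paragraph above; the breakdown is illustrative only.

borrowed = 42_000_000            # loan raised by McLean
preferred_stock = 7_000_000      # additional preferred stock issue
total_raised = borrowed + preferred_stock

waterman_cash_used = 20_000_000  # target's own cash and assets used at closing
remaining_debt = borrowed - waterman_cash_used
dividend_to_mclean = 25_000_000  # dividend voted by Waterman's new board

print(f"Capital raised for the purchase: ${total_raised:,}")             # $49,000,000
print(f"Acquisition debt after using target cash: ${remaining_debt:,}")  # $22,000,000
print(f"Dividend paid back to McLean Industries: ${dividend_to_mclean:,}")
```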

Similar to the approach employed in the McLean transaction, the use of publicly traded holding companies as investment vehicles to acquire portfolios of investments in corporate assets became a new trend in the 1960s, popularized by the likes of Warren Buffett (Berkshire Hathaway) and Victor Posner (DWG Corporation) and later adopted by Nelson Peltz (Triarc), Saul Steinberg (Reliance Insurance) and Gerry Schwartz (Onex Corporation). These investment vehicles would utilize a number of the same tactics and target the same type of companies as more traditional leveraged buyouts, and in many ways could be considered a forerunner of the later private equity firms. In fact, it is Posner who is often credited with coining the term "leveraged buyout" or "LBO". [13]

Posner, who had made a fortune in real estate investments in the 1930s and 1940s, acquired a major stake in DWG Corporation in 1966. Having gained control of the company, he used it as an investment vehicle that could execute takeovers of other companies. Posner and DWG are perhaps best known for the hostile takeover of Sharon Steel Corporation in 1969, one of the earliest such takeovers in the United States. Posner's investments were typically motivated by attractive valuations, balance sheets and cash flow characteristics. Because of its high debt load, Posner's DWG would generate attractive but highly volatile returns and would ultimately land the company in financial difficulty. In 1987, Sharon Steel sought Chapter 11 bankruptcy protection.

Warren Buffett, who is typically described as a stock market investor rather than a private equity investor, employed many of the same techniques in the creation of his Berkshire Hathaway conglomerate as Posner's DWG Corporation did, and as more traditional private equity investors would in later years. In 1965, with the support of the company's board of directors, Buffett assumed control of Berkshire Hathaway. At the time of Buffett's investment, Berkshire Hathaway was a textile company; however, Buffett used it as an investment vehicle to make acquisitions and minority investments in dozens of industries, including insurance and reinsurance (GEICO), and in varied companies including American Express, The Buffalo News, the Coca-Cola Company, Fruit of the Loom, Nebraska Furniture Mart and See's Candies. Buffett's value investing approach and focus on earnings and cash flows are characteristic of later private equity investors. Buffett would distinguish himself relative to more traditional leveraged buyout practitioners through his reluctance to use leverage and hostile techniques in his investments.

KKR and the pioneers of private equity

Lewis Cullman's acquisition of Orkin Exterminating Company in 1963 is among the first significant leveraged buyout transactions. [14] [15] [16] However, the industry that is today described as private equity was conceived by a number of corporate financiers, most notably Jerome Kohlberg, Jr. and later his protégé, Henry Kravis. Working for Bear Stearns at the time, Kohlberg and Kravis, along with Kravis' cousin George Roberts, began a series of what they described as "bootstrap" investments. They targeted family-owned businesses, many of which had been founded in the years following World War II and by the 1960s and 1970s were facing succession issues. Many of these companies lacked a viable or attractive exit for their founders, as they were too small to be taken public and the founders were reluctant to sell out to competitors, making a sale to a financial buyer potentially attractive. In the following years, the three Bear Stearns bankers would complete a series of buyouts including Stern Metals (1965), Incom (a division of Rockwood International, 1971), Cobblers Industries (1971) and Boren Clay (1973), as well as Thompson Wire, Eagle Motors and Barrows through their investment in Stern Metals. Although they had a number of highly successful investments, the $27 million investment in Cobblers ended in bankruptcy. [17]

By 1976, tensions had built up between Bear Stearns and Kohlberg, Kravis and Roberts, leading to their departure and the formation of Kohlberg Kravis Roberts in that year. Most notably, Bear Stearns executive Cy Lewis had rejected repeated proposals to form a dedicated investment fund within Bear Stearns, and Lewis took exception to the amount of time spent on outside activities. [18] Early investors included the Hillman family. [19] By 1978, with the revision of the Employee Retirement Income Security Act regulations, the nascent KKR was successful in raising its first institutional fund with approximately $30 million of investor commitments. [20] That year, the firm signed a precedent-setting deal to buy the publicly traded Houdaille Industries, which made industrial pipes, for $380 million. It was by far the largest take-private at the time. [21]

In 1974, Thomas H. Lee founded a new investment firm to focus on acquiring companies through leveraged buyout transactions, one of the earliest independent private equity firms to focus on leveraged buyouts of more mature companies rather than venture capital investments in growth companies. Lee's firm, Thomas H. Lee Partners, while initially generating less fanfare than other entrants in the 1980s, would emerge as one of the largest private equity firms globally by the end of the 1990s.

The second half of the 1970s and the first years of the 1980s saw the emergence of several private equity firms that would survive the various cycles both in leveraged buyouts and venture capital. Among the firms founded during these years were: Cinven, Forstmann Little & Company, Welsh, Carson, Anderson & Stowe, Candover, and GTCR.

Management buyouts also came into existence in the late 1970s and early 1980s. One of the most notable early management buyout transactions was the acquisition of Harley-Davidson. A group of managers at Harley-Davidson, the motorcycle manufacturer, bought the company from AMF in a leveraged buyout in 1981, but racked up big losses the following year and had to ask for tariff protection from Japanese competitors.

Regulatory and tax changes impact the boom

The advent of the boom in leveraged buyouts in the 1980s was supported by three major legal and regulatory events:

  • Failure of the Carter tax plan of 1977 – In his first year in office, Jimmy Carter put forth a revision to the corporate tax system that would have, among other results, reduced the disparity in treatment of interest paid to bondholders and dividends paid to stockholders. Carter's proposals did not achieve support from the business community or Congress and were not enacted. Because of the different tax treatment, the use of leverage to reduce taxes was popular among private equity investors and would become increasingly popular with the reduction of the capital gains tax rate. [22]
  • Employee Retirement Income Security Act of 1974 (ERISA) – With the passage of ERISA in 1974, corporate pension funds were prohibited from holding certain risky investments including many investments in privately held companies. In 1975, fundraising for private equity investments cratered, according to the Venture Capital Institute, totaling only $10 million during the course of the year. In 1978, the US Labor Department relaxed certain parts of the ERISA restrictions, under the "prudent man rule," [23] thus allowing corporate pension funds to invest in private equity resulting in a major source of capital available to invest in venture capital and other private equity. Time reported in 1978 that fund raising had increased from $39 million in 1977 to $570 million just one year later. [24] Many of these same corporate pension investors would become active buyers of the high yield bonds (or junk bonds) that were necessary to complete leveraged buyout transactions.
  • Economic Recovery Tax Act of 1981 (ERTA) – On August 15, 1981, Ronald Reagan signed the Kemp-Roth bill, officially known as the Economic Recovery Tax Act of 1981, into law, lowering the top capital gains tax rate from 28 percent to 20 percent and making high-risk investments even more attractive.

In the years that would follow these events, private equity would experience its first major boom, acquiring some of the famed brands and major industrial powers of American business.

The decade of the 1980s is perhaps more closely associated with the leveraged buyout than any decade before or since. For the first time, the public became aware of the ability of private equity to affect mainstream companies, and "corporate raiders" and "hostile takeovers" entered the public consciousness. The decade would see one of the largest booms in private equity, culminating in the 1989 leveraged buyout of RJR Nabisco, which would reign as the largest leveraged buyout transaction for nearly 17 years. In 1980, the private equity industry raised approximately $2.4 billion of annual investor commitments; by the end of the decade, in 1989, that figure stood at $21.9 billion, marking the tremendous growth experienced. [25]

Beginning of the LBO boom

The beginning of the first boom period in private equity would be marked by the well-publicized success of the Gibson Greetings acquisition in 1982 and would roar ahead through 1983 and 1984 with the soaring stock market driving profitable exits for private equity investors.

In January 1982, former US Secretary of the Treasury William E. Simon, Ray Chambers and a group of investors, which would later come to be known as Wesray Capital Corporation, acquired Gibson Greetings, a producer of greeting cards. The purchase price for Gibson was $80 million, of which only $1 million was rumored to have been contributed by the investors. By mid-1983, just sixteen months after the original deal, Gibson completed a $290 million IPO and Simon made approximately $66 million. [26] [27] Simon and Wesray would later complete the $71.6 million acquisition of Atlas Van Lines. The success of the Gibson Greetings investment attracted the attention of the wider media to the nascent boom in leveraged buyouts.

Between 1979 and 1989, it was estimated that there were over 2,000 leveraged buyouts valued in excess of $250 million. [28] Notable buyouts of this period (not described elsewhere in this article) include: Malone & Hyde (1984), Wometco Enterprises (1984), Beatrice Companies (1985), Sterling Jewelers (1985), Revco Drug Stores (1986), Safeway (1986), Southland Corporation (1987), Jim Walter Corp (later Walter Industries, Inc., 1987), BlackRock (1988), Federated Department Stores (1988), Marvel Entertainment (1988), Uniroyal Goodrich Tire Company (1988) and Hospital Corporation of America (1989).

Because of the high leverage on many of the transactions of the 1980s, failed deals occurred regularly; however, the promise of attractive returns on successful investments attracted more capital. With the increased leveraged buyout activity and investor interest, the mid-1980s saw a major proliferation of private equity firms. Among the major firms founded in this period were: Bain Capital, Chemical Venture Partners, Hellman & Friedman, Hicks & Haas (later Hicks Muse Tate & Furst), The Blackstone Group, Doughty Hanson, BC Partners, and The Carlyle Group.

As the market developed, new niches within the private equity industry began to emerge. In 1982, Venture Capital Fund of America, the first private equity firm focused on acquiring secondary-market interests in existing private equity funds, was founded; two years later, in 1984, First Reserve Corporation, the first private equity firm focused on the energy sector, was founded.

Venture capital in the 1980s

The public successes of the venture capital industry in the 1970s and early 1980s (e.g., DEC, Apple, Genentech) gave rise to a major proliferation of venture capital investment firms. From just a few dozen firms at the start of the decade, there were over 650 firms by the end of the 1980s, each searching for the next major "home run". The capital managed by these firms increased from $3 billion to $31 billion over the course of the decade. [29]

The growth of the industry was hampered by sharply declining returns, and certain venture firms began posting losses for the first time. In addition to the increased competition among firms, several other factors impacted returns. The market for initial public offerings cooled in the mid-1980s before collapsing after the stock market crash in 1987, and foreign corporations, particularly from Japan and Korea, flooded early-stage companies with capital. [29]

In response to the changing conditions, corporations that had sponsored in-house venture investment arms, including General Electric and Paine Webber, either sold off or closed these venture capital units. Venture capital units within Chemical Bank (today CCMP Capital) and Continental Illinois National Bank (today CIVC Partners), among others, began shifting their focus from funding early-stage companies toward investments in more mature companies. Even industry founders J.H. Whitney & Company and Warburg Pincus began to transition toward leveraged buyouts and growth capital investments. [29] [30] [31] Many of these venture capital firms attempted to stay close to their areas of expertise in the technology industry by acquiring companies in the industry that had reached certain levels of maturity. In 1989, Prime Computer was acquired in a $1.3 billion leveraged buyout by J.H. Whitney & Company in what would prove to be a disastrous transaction. Whitney's investment in Prime proved to be nearly a total loss, with the bulk of the proceeds from the company's liquidation paid to the company's creditors. [32]

Although lower profile than their buyout counterparts, new leading venture capital firms were also formed including Draper Fisher Jurvetson (originally Draper Associates) in 1985 and Canaan Partners in 1987 among others.

Corporate raiders, hostile takeovers and greenmail

Although buyout firms generally had different aims and methods, they were often lumped in with the "corporate raiders" who came on the scene in the 1980s. The raiders were best known for hostile bids: takeover attempts that were opposed by management. By contrast, private equity firms generally attempted to strike deals with boards and CEOs, though in many cases in the 1980s they allied with managements that were already under pressure from raiders. But both groups bought companies through leveraged buyouts; both relied heavily on junk bond financing; and under both types of owners, in many cases, major assets were sold, costs were slashed and employees were laid off. Hence, in the public mind, they were lumped together. [33]

Management of many large publicly traded corporations reacted negatively to the threat of potential hostile takeover or corporate raid and pursued drastic defensive measures including poison pills, golden parachutes and increasing debt levels on the company's balance sheet. The threat of the corporate raid would lead to the practice of "greenmail", where a corporate raider or other party would acquire a significant stake in the stock of a company and receive an incentive payment (effectively a bribe) from the company in order to avoid pursuing a hostile takeover of the company. Greenmail represented a transfer payment from a company's existing shareholders to a third party investor and provided no value to existing shareholders but did benefit existing managers. The practice of "greenmail" is not typically considered a tactic of private equity investors and is not condoned by market participants.

Among the most notable corporate raiders of the 1980s were Carl Icahn, Victor Posner, Nelson Peltz, Robert M. Bass, T. Boone Pickens, Harold Clark Simmons, Kirk Kerkorian, Sir James Goldsmith, Saul Steinberg and Asher Edelman. Carl Icahn developed a reputation as a ruthless corporate raider after his hostile takeover of TWA in 1985. [34] The result of that takeover was Icahn systematically selling TWA's assets to repay the debt he used to purchase the company, which was described as asset stripping. [35] In 1985, Pickens was profiled on the cover of Time magazine as "one of the most famous and controversial businessmen in the U.S." for his pursuit of Unocal, Gulf Oil and Cities Services. [36] In later years, many of the corporate raiders would be re-characterized as "Activist shareholders".

Many of the corporate raiders were onetime clients of Michael Milken, whose investment banking firm Drexel Burnham Lambert helped raise blind pools of capital with which corporate raiders could make a legitimate attempt to take over a company and provided high-yield debt financing of the buyouts.

Drexel Burnham raised a $100 million blind pool in 1984 for Nelson Peltz and his holding company Triangle Industries (later Triarc) to give credibility for takeovers, representing the first major blind pool raised for this purpose. Two years later, in 1986, Wickes Companies, a holding company run by Sanford Sigoloff, raised a $1.2 billion blind pool. [37]

In 1985, Milken raised $750 million for a similar blind pool for Ronald Perelman, which would ultimately prove instrumental in acquiring his biggest target: The Revlon Corporation. In 1980, Ronald Perelman, the son of a wealthy Philadelphia businessman and a future "corporate raider," having made several small but successful buyouts, acquired MacAndrews & Forbes, a distributor of licorice extract and chocolate that Perelman's father had tried and failed to acquire 10 years earlier. [38] Perelman would ultimately divest the company's core business and use MacAndrews & Forbes as a holding-company investment vehicle for subsequent leveraged buyouts including Technicolor, Inc., Pantry Pride and Revlon. Using the Pantry Pride subsidiary of his holding company, MacAndrews & Forbes Holdings, Perelman made overtures toward Revlon, which were rebuffed. Repeatedly rejected by the company's board and management, Perelman continued to press forward with a hostile takeover, raising his offer from an initial bid of $47.50 per share until it reached $53.00 per share. Even after Revlon received a higher offer from a white knight, the private equity firm Forstmann Little & Company, Perelman's Pantry Pride was finally able to make a successful bid for Revlon, valuing the company at $2.7 billion. [39] The buyout would prove troubling, burdened by a heavy debt load. [40] [41] [42] Under Perelman's control, Revlon sold four divisions: two were sold for $1 billion, its vision care division was sold for $574 million, and its National Health Laboratories division was spun out to the public market in 1988. Revlon also made acquisitions, including Max Factor in 1987 and Betrix in 1989, later selling them to Procter & Gamble in 1991. [43] Perelman exited the bulk of his holdings in Revlon through an IPO in 1996 and subsequent sales of stock. As of December 31, 2007, Perelman still retained a minority ownership interest in Revlon. The Revlon takeover, because of its well-known brand, was profiled widely by the media and brought new attention to the emerging boom in leveraged buyout activity.

In later years, Milken and Drexel would shy away from certain of the more "notorious" corporate raiders as Drexel and the private equity industry attempted to move upscale.

RJR Nabisco and the Barbarians at the Gate

Leveraged buyouts in the 1980s, including Perelman's takeover of Revlon, came to epitomize the "ruthless capitalism" and "greed" popularly seen to be pervading Wall Street at the time. One of the final major buyouts of the 1980s proved to be its most ambitious and marked both a high-water mark and a sign of the beginning of the end of the boom that had begun nearly a decade earlier. In 1989, Kohlberg Kravis Roberts (KKR) closed on a $31.1 billion takeover of RJR Nabisco. It was, at that time and for over 17 years, the largest leveraged buyout in history. The event was chronicled in the book Barbarians at the Gate: The Fall of RJR Nabisco, and later made into a television movie starring James Garner.

F. Ross Johnson was the president and CEO of RJR Nabisco at the time of the leveraged buyout, and Henry Kravis was a general partner at KKR. The leveraged buyout was in the amount of $25 billion (plus assumed debt), and the battle for control took place between October and November 1988. KKR would eventually prevail in acquiring RJR Nabisco at $109 per share, marking a dramatic increase from the original announcement that Shearson Lehman Hutton would take RJR Nabisco private at $75 per share. A fierce series of negotiations and horse-trading ensued, pitting KKR against Shearson Lehman Hutton and later Forstmann Little & Co. Many of the major banking players of the day, including Morgan Stanley, Goldman Sachs, Salomon Brothers, and Merrill Lynch, were actively involved in advising and financing the parties.

After Shearson Lehman's original bid, KKR quickly introduced a tender offer to obtain RJR Nabisco for $90 per share, a price that enabled it to proceed without the approval of RJR Nabisco's management. RJR's management team, working with Shearson Lehman and Salomon Brothers, submitted a bid of $112, a figure they felt certain would enable them to outflank any response by Kravis's team. KKR's final bid of $109, while a lower dollar figure, was ultimately accepted by the board of directors of RJR Nabisco. KKR's offer was guaranteed, whereas the management offer (backed by Shearson Lehman and Salomon) lacked a "reset", meaning that the final share price might have been lower than the stated $112 per share. Many on RJR's board of directors had grown concerned at recent disclosures of Ross Johnson's unprecedented golden parachute deal. TIME magazine featured Ross Johnson on the cover of its December 1988 issue along with the headline, "A Game of Greed: This man could pocket $100 million from the largest corporate takeover in history. Has the buyout craze gone too far?". [44] KKR's offer was welcomed by the board, and, to some observers, it appeared that their elevation of the reset issue as a deal-breaker in KKR's favor was little more than an excuse to reject Ross Johnson's higher bid of $112 per share. F. Ross Johnson received $53 million from the buyout.

At $31.1 billion of transaction value, RJR Nabisco was by far the largest leveraged buyout in history. In 2006 and 2007, a number of leveraged buyout transactions were completed that for the first time surpassed the RJR Nabisco leveraged buyout in terms of nominal purchase price. However, adjusted for inflation, none of the leveraged buyouts of the 2006–2007 period would surpass RJR Nabisco. Unfortunately for KKR, size would not equate with success, as the high purchase price and debt load would burden the performance of the investment. KKR had to pump additional equity into the company a year after the buyout closed, and years later, when it sold the last of its investment, it had chalked up a $700 million loss. [45]

Two years earlier, in 1987, Jerome Kohlberg, Jr. resigned from Kohlberg Kravis Roberts & Co. over differences in strategy. Kohlberg did not favor the larger buyouts (including Beatrice Companies in 1985 and Safeway in 1986, and later, most likely, the 1989 takeover of RJR Nabisco), the highly leveraged transactions, or the hostile takeovers being pursued increasingly by KKR. [46] The split would ultimately prove acrimonious as Kohlberg sued Kravis and Roberts for what he alleged were improper business tactics. The case was later settled out of court. [47] Instead, Kohlberg chose to return to his roots, acquiring smaller, middle-market companies, and in 1987 he founded a new private equity firm, Kohlberg & Company, along with his son James A. Kohlberg, at the time a KKR executive. Jerome Kohlberg would continue investing successfully for another seven years before retiring from Kohlberg & Company in 1994 and turning his firm over to his son. [48]

As the market reached its peak in 1988 and 1989, new private equity firms were founded which would emerge as major investors in the years to follow, including: ABRY Partners, Coller Capital, Landmark Partners, Leonard Green & Partners and Providence Equity Partners.

By the end of the 1980s the excesses of the buyout market were beginning to show, with the bankruptcy of several large buyouts including Robert Campeau's 1988 buyout of Federated Department Stores, the 1986 buyout of the Revco drug stores, Walter Industries, FEB Trucking and Eaton Leonard. The RJR Nabisco deal was showing signs of strain, leading to a recapitalization in 1990 that involved the contribution of $1.7 billion of new equity from KKR. [49] In response to the threat of unwelcome LBOs, certain companies adopted a number of techniques, such as the poison pill, to protect themselves against hostile takeovers by effectively self-destructing the company if it were to be taken over (these practices are increasingly discredited).

The collapse of Drexel Burnham Lambert

Drexel Burnham Lambert was the investment bank most responsible for the boom in private equity during the 1980s due to its leadership in the issuance of high-yield debt. The firm was first rocked by scandal on May 12, 1986, when Dennis Levine, a Drexel managing director and investment banker, was charged with insider trading. Levine pleaded guilty to four felonies, and implicated one of his recent partners, arbitrageur Ivan Boesky. Largely based on information Boesky promised to provide about his dealings with Milken, the Securities and Exchange Commission initiated an investigation of Drexel on November 17. Two days later, Rudy Giuliani, the United States Attorney for the Southern District of New York, launched his own investigation. [50]

For two years, Drexel steadfastly denied any wrongdoing, claiming that the criminal and SEC cases were based almost entirely on the statements of an admitted felon looking to reduce his sentence. However, it was not enough to keep the SEC from suing Drexel in September 1988 for insider trading, stock manipulation, defrauding its clients and stock parking (buying stocks for the benefit of another). All of the transactions involved Milken and his department. Giuliani began seriously considering indicting Drexel under the powerful Racketeer Influenced and Corrupt Organizations Act (RICO), under the doctrine that companies are responsible for an employee's crimes. [50]

The threat of a RICO indictment, which would have required the firm to put up a performance bond of as much as $1 billion in lieu of having its assets frozen, unnerved many at Drexel. Most of Drexel's capital was borrowed money, as is common with most investment banks, and it is difficult for firms under a RICO indictment to obtain credit. [50] Drexel's CEO, Fred Joseph, said that he had been told that if Drexel were indicted under RICO, it would survive a month at most. [51]

With literally minutes to go before being indicted, Drexel reached an agreement with the government in which it pleaded nolo contendere (no contest) to six felonies – three counts of stock parking and three counts of stock manipulation. [50] It also agreed to pay a fine of $650 million – at the time, the largest fine ever levied under securities laws. Milken left the firm after his own indictment in March 1989. [51] Effectively, Drexel was now a convicted felon.

In April 1989, Drexel settled with the SEC, agreeing to stricter safeguards on its oversight procedures. Later that month, the firm eliminated 5,000 jobs by shuttering three departments – including the retail brokerage operation.

The high-yield debt markets had begun to shut down in 1989, a slowdown that accelerated into 1990. On February 13, 1990, after being advised by United States Secretary of the Treasury Nicholas F. Brady, the U.S. Securities and Exchange Commission (SEC), the New York Stock Exchange (NYSE) and the Federal Reserve System, Drexel Burnham Lambert officially filed for Chapter 11 bankruptcy protection. [51]

S&L and the shutdown of the Junk Bond Market

In the 1980s, the boom in private equity transactions, specifically leveraged buyouts, was driven by the availability of financing, particularly high-yield debt, also known as "junk bonds". The collapse of the high yield market in 1989 and 1990 would signal the end of the LBO boom. At that time, many market observers were pronouncing the junk bond market “finished.” This collapse would be due largely to three factors:

  • The collapse of Drexel Burnham Lambert, the foremost underwriter of junk bonds (discussed above).
  • The dramatic increase in default rates among junk bond issuing companies. The historical default rate for high yield bonds from 1978 to 1988 was approximately 2.2% of total issuance. In 1989, defaults increased dramatically to 4.3% of the then $190 billion market, and an additional 2.6% of issuance defaulted in the first half of 1990. As a result of the higher perceived risk, the differential in yield of the junk bond market over U.S. treasuries (known as the "spread") had also increased by 700 basis points (7 percentage points). This made the cost of debt in the high yield market significantly more expensive than it had been previously; a rough arithmetic sketch of these figures follows this list. [52] [53] The market shut down altogether for lower rated issuers.
  • The mandated withdrawal of savings and loans from the high yield market. In August 1989, the U.S. Congress enacted the Financial Institutions Reform, Recovery and Enforcement Act of 1989 as a response to the savings and loan crisis of the 1980s. Under the law, savings and loans (S&Ls) could no longer invest in bonds that were rated below investment grade. S&Ls were mandated to sell their holdings by the end of 1993 creating a huge supply of low priced assets that helped freeze the new issuance market.
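The sketch below simply restates the default and spread figures from the list above in dollar terms; it assumes, for illustration, that the 1990 percentage applies to the same $190 billion base as the 1989 figure:

```python
# Rough arithmetic behind the junk bond market deterioration described above.
# All percentages and the market size come from the text; the base used for
# the first-half-1990 figure is an assumption made for illustration.

market_size = 190e9          # junk bond market size in 1989 (from the text)
default_rate_1989 = 0.043    # 4.3% of the market defaulted in 1989
default_rate_h1_1990 = 0.026 # an additional 2.6% defaulted in the first half of 1990

print(f"1989 defaults:    ${market_size * default_rate_1989 / 1e9:.1f} billion")
print(f"H1 1990 defaults: ${market_size * default_rate_h1_1990 / 1e9:.1f} billion")

# The yield spread over U.S. Treasuries widened by 700 basis points.
print(f"700 basis points = {700 / 100:.0f} percentage points of additional yield")
```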

Despite the adverse market conditions, several of the largest private equity firms were founded in this period including: Apollo Management, Madison Dearborn and TPG Capital.

Beginning roughly in 1992, three years after the RJR Nabisco buyout, and continuing through the end of the decade, the private equity industry once again experienced a tremendous boom, both in venture capital (as will be discussed below) and in leveraged buyouts, with the emergence of brand-name firms managing multibillion-dollar funds. After declining from 1990 through 1992, the private equity industry began to increase in size, raising approximately $20.8 billion of investor commitments in 1992 and reaching a high-water mark in 2000 of $305.7 billion, outpacing the growth of almost every other asset class. [25]

Resurgence of leveraged buyouts

Private equity in the 1980s was a controversial topic, commonly associated with corporate raids, hostile takeovers, asset stripping, layoffs, plant closings and outsized profits to investors. As private equity reemerged in the 1990s it began to earn a new degree of legitimacy and respectability. Although in the 1980s many of the acquisitions made were unsolicited and unwelcome, private equity firms in the 1990s focused on making buyouts attractive propositions for management and shareholders. According to The Economist, “[B]ig companies that would once have turned up their noses at an approach from a private-equity firm are now pleased to do business with them.” [3] Private equity investors became increasingly focused on the long-term development of the companies they acquired, using less leverage in the acquisition. In the 1980s, leverage would routinely represent 85% to 95% of the purchase price of a company, as compared to average debt levels between 20% and 40% in leveraged buyouts in the 1990s and the first decade of the 21st century. KKR's 1986 acquisition of Safeway, for example, was completed with 97% leverage and 3% equity contributed by KKR, whereas KKR's acquisition of TXU in 2007 was completed with approximately 19% equity contributed ($8.5 billion of equity out of a total purchase price of $45 billion). Private equity firms are also more likely to make investments in capital expenditures and to provide incentives for management to build long-term value.
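A quick back-of-the-envelope comparison of the two deals cited above, using only the figures given in the text, makes the shift in capital structures visible:

```python
# Comparison of the equity/debt split in the two buyouts cited above,
# using only the percentages and totals stated in the text.

def capital_structure(total_price: float, equity_pct: float):
    """Split a purchase price into equity and debt given an equity percentage."""
    equity = total_price * equity_pct
    debt = total_price - equity
    return equity, debt

# KKR / Safeway (1986): roughly 3% equity and 97% leverage. The text does not
# give a dollar purchase price, so only percentages are shown for Safeway.
safeway_equity_pct, safeway_debt_pct = 0.03, 0.97

# KKR / TXU (2007): $8.5 billion of equity out of a $45 billion purchase price.
txu_equity, txu_debt = capital_structure(45e9, 8.5e9 / 45e9)

print(f"Safeway (1986): {safeway_equity_pct:.0%} equity / {safeway_debt_pct:.0%} debt")
print(f"TXU (2007):     ${txu_equity / 1e9:.1f}B equity / ${txu_debt / 1e9:.1f}B debt "
      f"({8.5e9 / 45e9:.0%} equity)")
```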

The Thomas H. Lee Partners acquisition of Snapple Beverages, in 1992, is often described as the deal that marked the resurrection of the leveraged buyout after several dormant years. [54] Only eight months after buying the company, Lee took Snapple Beverages public and in 1994, only two years after the original acquisition, Lee sold the company to Quaker Oats for $1.7 billion. Lee was estimated to have made $900 million for himself and his investors from the sale. Quaker Oats would subsequently sell the company, which performed poorly under new management, three years later for only $300 million to Nelson Peltz's Triarc. As a result of the Snapple deal, Thomas H. Lee, who had begun investing in private equity in 1974, would find new prominence in the private equity industry and catapult his Boston-based Thomas H. Lee Partners to the ranks of the largest private equity firms.

It was also in this time that the capital markets would start to open up again for private equity transactions. During the 1990–1993 period, Chemical Bank established its position as a key lender to private equity firms under the auspices of pioneering investment banker James B. Lee, Jr. (known as Jimmy Lee, not related to Thomas H. Lee). By the mid-1990s, under Jimmy Lee, Chemical had established itself as the largest lender in the financing of leveraged buyouts. Lee built a syndicated leveraged finance business and related advisory businesses, including the first dedicated financial sponsor coverage group, which covered private equity firms in much the same way that investment banks had traditionally covered various industry sectors. [55] [56]

In 1993, David Bonderman and James Coulter, who had worked for Robert M. Bass during the 1980s, completed a buyout of Continental Airlines through their nascent Texas Pacific Group (today TPG Capital). TPG was virtually alone in its conviction that there was an investment opportunity with the airline. The plan included bringing in a new management team, improving aircraft utilization and focusing on lucrative routes. By 1998, TPG had generated an annual internal rate of return of 55% on its investment. Unlike Carl Icahn's hostile takeover of TWA in 1985, [34] Bonderman and Texas Pacific Group were widely hailed as saviors of the airline, marking the change in tone from the 1980s. The buyout of Continental Airlines would be one of the few airline successes for the private equity industry, which has suffered several major failures in the sector, including the 2008 bankruptcies of ATA Airlines, Aloha Airlines and Eos Airlines.

As the market for private equity matured, so too did its investor base. The Institutional Limited Partners Association was initially founded as an informal networking group for limited partner investors in private equity funds in the early 1990s. However, the organization would evolve into an advocacy organization for private equity investors with more than 200 member organizations from 10 countries. As of the end of 2007, ILPA members had total assets under management in excess of $5 trillion, with more than $850 billion of capital commitments to private equity investments.

The venture capital boom and the Internet Bubble (1995–2000)

In the 1980s, FedEx and Apple Inc. were able to grow because of private equity or venture funding, as were Cisco, Genentech, Microsoft and Avis. [57] However, by the end of the 1980s, venture capital returns were relatively low, particularly in comparison with their emerging leveraged buyout cousins, due in part to the competition for hot startups, an excess supply of IPOs and the inexperience of many venture capital fund managers. Unlike the leveraged buyout industry, the venture capital industry saw only limited growth after total capital raised increased to $3 billion in 1983; annual fundraising rose to just over $4 billion more than a decade later, in 1994.

After a shakeout of venture capital managers, the more successful firms retrenched, focusing increasingly on improving operations at their portfolio companies rather than continuously making new investments. Results began to improve markedly and would ultimately generate the venture capital boom of the 1990s. Former Wharton professor Andrew Metrick refers to these first 15 years of the modern venture capital industry, beginning in 1980, as the "pre-boom period" in anticipation of the boom that would begin in 1995 and last through the bursting of the Internet bubble in 2000. [58]

The late 1990s were a boom time for venture capital, as firms on Sand Hill Road in Menlo Park and elsewhere in Silicon Valley benefited from a huge surge of interest in the nascent Internet and other computer technologies. Initial public offerings of stock for technology and other growth companies were in abundance, and venture firms were reaping large windfalls. Among the highest-profile technology companies with venture capital backing were Amazon.com, America Online, eBay, Intuit, Macromedia, Netscape, Sun Microsystems and Yahoo!.

The Nasdaq crash and technology slump that started in March 2000 shook virtually the entire venture capital industry as valuations for startup technology companies collapsed. Over the next two years, many venture firms were forced to write off large proportions of their investments, and many funds were significantly "under water" (the values of the fund's investments were below the amount of capital invested). Venture capital investors sought to reduce the size of commitments they had made to venture capital funds, and in numerous instances investors sought to unload existing commitments for cents on the dollar in the secondary market. By mid-2003, the venture capital industry had shriveled to about half its 2001 capacity. Nevertheless, PricewaterhouseCoopers' MoneyTree Survey shows that total venture capital investments held steady at 2003 levels through the second quarter of 2005.

Although the post-boom years represent just a small fraction of the peak levels of venture investment reached in 2000, they still represent an increase over the levels of investment from 1980 through 1995. As a percentage of GDP, venture investment was 0.058% in 1994, peaked at 1.087% (nearly 19 times the 1994 level) in 2000 and ranged from 0.164% to 0.182% in 2003 and 2004. The revival of an Internet-driven environment (thanks to deals such as eBay's purchase of Skype, the News Corporation's purchase of MySpace.com, and the very successful Google.com and Salesforce.com IPOs) has helped to revive the venture capital environment. However, as a percentage of the overall private equity market, venture capital has still not reached its mid-1990s level, let alone its peak in 2000.
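
For readers who want to check the "nearly 19 times" figure above, a minimal sketch (hypothetical, not from the source) using the GDP percentages quoted in this paragraph:

```python
# Hypothetical check of the ratio cited above; values are the figures from the text.
vc_gdp_1994 = 0.058  # venture investment as % of GDP in 1994
vc_gdp_2000 = 1.087  # venture investment as % of GDP at the 2000 peak
print(f"2000 peak vs. 1994: {vc_gdp_2000 / vc_gdp_1994:.1f}x")  # prints ~18.7x, i.e. nearly 19x
```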

Stagnation in the LBO market

As the venture sector collapsed, activity in the leveraged buyout market also declined significantly. Leveraged buyout firms had invested heavily in the telecommunications sector from 1996 to 2000 and profited from the boom, which suddenly fizzled in 2001. In that year at least 27 major telecommunications companies (i.e., those with $100 million of liabilities or greater) filed for bankruptcy protection. Telecommunications, which made up a large portion of the overall high yield universe of issuers, dragged down the entire high yield market. Overall corporate default rates surged to levels unseen since the 1990 market collapse, rising to 6.3% of high yield issuance in 2000 and 8.9% of issuance in 2001. Default rates on junk bonds peaked at 10.7 percent in January 2002, according to Moody's. [59] [60] As a result, leveraged buyout activity ground to a halt. [61] [62] The major collapses of former high-fliers including WorldCom, Adelphia Communications, Global Crossing and Winstar Communications were among the most notable defaults in the market. In addition to the high rate of default, many investors lamented the low recovery rates achieved through restructuring or bankruptcy. [60]

Among the most affected by the bursting of the internet and telecom bubbles were two of the largest and most active private equity firms of the 1990s: Tom Hicks' Hicks Muse Tate & Furst and Ted Forstmann's Forstmann Little & Company. These firms were often cited as the highest-profile private equity casualties, having invested heavily in technology and telecommunications companies. [63] Hicks Muse's reputation and market position were both damaged by the loss of over $1 billion from minority investments in six telecommunications and 13 Internet companies at the peak of the 1990s stock market bubble. [64] [65] [66] Similarly, Forstmann suffered major losses from investments in McLeodUSA and XO Communications. [67] [68] Tom Hicks resigned from Hicks Muse at the end of 2004, and Forstmann Little was unable to raise a new fund. The treasurer of the State of Connecticut sued Forstmann Little to return the state's $96 million investment to that point and to cancel the commitment it had made to take its total investment to $200 million. [69] The humbling of these private equity titans could hardly have been predicted by their investors in the 1990s, and it forced fund investors to conduct due diligence on fund managers more carefully and to include greater controls on investments in partnership agreements.

Deals completed during this period tended to be smaller and financed with less high yield debt than in other periods. Private equity firms had to cobble together financing made up of bank loans and mezzanine debt, often with higher equity contributions than had previously been typical. Private equity firms benefited from the lower valuation multiples, and as a result, despite the relatively limited activity, the funds that invested during these adverse market conditions delivered attractive returns to investors. Meanwhile, in Europe LBO activity began to increase as the market continued to mature. In 2001, for the first time, European buyout activity exceeded US activity, with $44 billion of deals completed in Europe as compared with just $10.7 billion of deals completed in the US. This was largely a function of the fact that just six LBOs in excess of $500 million were completed in 2001, against 27 in 2000. [70]

As investors sought to reduce their exposure to the private equity asset class, an area of private equity that was increasingly active in these years was the nascent secondary market for private equity interests. Secondary transaction volume increased from historical levels of 2% or 3% of private equity commitments to 5% of the addressable market in the early years of the new decade. [71] [72] Many of the largest financial institutions (e.g., Deutsche Bank, Abbey National, UBS AG) sold portfolios of direct investments and “pay-to-play” funds portfolios that were typically used as a means to gain entry to lucrative leveraged finance and mergers and acquisitions assignments but had created hundreds of millions of dollars of losses. Some of the most notable financial institutions to complete publicly disclosed secondary transactions during this period include Chase Capital Partners (2000), National Westminster Bank (2000), UBS AG (2003), Deutsche Bank (MidOcean Partners) (2003), Abbey National (2004) and Bank One (2004).

As 2002 ended and 2003 began, the private equity sector had spent the previous two and a half years reeling from major losses in telecommunications and technology companies and had been severely constrained by tight credit markets. As 2003 got underway, private equity began a five-year resurgence that would ultimately result in the completion of 13 of the 15 largest leveraged buyout transactions in history, unprecedented levels of investment activity and investor commitments, and a major expansion and maturation of the leading private equity firms.

The combination of decreasing interest rates, loosening lending standards and regulatory changes for publicly traded companies would set the stage for the largest boom private equity had seen. The Sarbanes-Oxley legislation, officially the Public Company Accounting Reform and Investor Protection Act, passed in 2002 in the wake of corporate scandals at Enron, WorldCom, Tyco, Adelphia, Peregrine Systems and Global Crossing, among others, would create a new regime of rules and regulations for publicly traded corporations. On top of the public markets' existing focus on short-term earnings rather than long-term value creation, many public-company executives lamented the extra cost and bureaucracy associated with Sarbanes-Oxley compliance. For the first time, many large corporations saw private equity ownership as potentially more attractive than remaining public. Sarbanes-Oxley would have the opposite effect on the venture capital industry. The increased compliance costs made it nearly impossible for venture capitalists to bring young companies to the public markets and dramatically reduced the opportunities for exits via IPO. Instead, venture capitalists have been forced increasingly to rely on sales to strategic buyers to exit their investments. [73]

Interest rates, which began a major series of decreases in 2002, would reduce the cost of borrowing and increase the ability of private equity firms to finance large acquisitions. Lower interest rates would encourage investors to return to the relatively dormant high-yield debt and leveraged loan markets, making debt more readily available to finance buyouts. Alternative investments also became increasingly important as investors focused on yields despite increases in risk. This search for higher-yielding investments would fuel larger funds, allowing larger deals, never before thought possible, to become reality.

Certain buyouts were completed in 2001 and early 2002, particularly in Europe, where financing was more readily available. In 2001, for example, BT Group agreed to sell its international yellow pages directories business (Yell Group) to Apax Partners and Hicks, Muse, Tate & Furst for £2.14 billion (approximately $3.5 billion at the time), [74] making it then the largest non-corporate LBO in European history. Yell later bought US directories publisher McLeodUSA for about $600 million, and floated on London's FTSE in 2003.

Resurgence of the large buyout

Marked by the two-stage buyout of Dex Media at the end of 2002 and in 2003, large multibillion-dollar U.S. buyouts could once again obtain significant high yield debt financing, and larger transactions could be completed. The Carlyle Group, Welsh, Carson, Anderson & Stowe, along with other private investors, led a $7.5 billion buyout of QwestDex. The buyout was the third largest corporate buyout since 1989. QwestDex's purchase occurred in two stages: a $2.75 billion acquisition of assets known as Dex Media East in November 2002 and a $4.30 billion acquisition of assets known as Dex Media West in 2003. R. H. Donnelley Corporation acquired Dex Media in 2006. Shortly after Dex Media, other larger buyouts would be completed, signaling that a resurgence in private equity was underway. The acquisitions included Burger King (by Bain Capital), Jefferson Smurfit (by Madison Dearborn), Houghton Mifflin [75] [76] (by Bain Capital, the Blackstone Group and Thomas H. Lee Partners) and TRW Automotive (by the Blackstone Group).

In 2006 USA Today reported retrospectively on the revival of private equity: [77]

LBOs are back, only they've rebranded themselves private equity and vow a happier ending. The firms say this time it's completely different. Instead of buying companies and dismantling them, as was their rap in the '80s, private equity firms… squeeze more profit out of underperforming companies. But whether today's private equity firms are simply a regurgitation of their counterparts in the 1980s… or a kinder, gentler version, one thing remains clear: private equity is now enjoying a "Golden Age." And with returns that triple the S&P 500, it's no wonder they are challenging the public markets for supremacy.

By 2004 and 2005, major buyouts were once again becoming common and market observers were stunned by the leverage levels and financing terms obtained by financial sponsors in their buyouts. Some of the notable buyouts of this period include: Dollarama (2004), Toys "R" Us (2004), The Hertz Corporation (2005), Metro-Goldwyn-Mayer (2005) and SunGard (2005).

Age of the mega-buyout

As 2005 ended and 2006 began, new "largest buyout" records were set and surpassed several times, with nine of the top ten buyouts at the end of 2007 having been announced in an 18-month window from the beginning of 2006 through the middle of 2007. The buyout boom was not limited to the United States, as industrialized countries in Europe and the Asia-Pacific region also saw new records set. In 2006, private equity firms bought 654 U.S. companies for $375 billion, representing 18 times the level of transactions closed in 2003. [79] U.S.-based private equity firms raised $215.4 billion in investor commitments to 322 funds, surpassing the previous record set in 2000 by 22% and exceeding the 2005 fundraising total by 33%. [80] However, venture capital funds, which were responsible for much of the fundraising volume in 2000 (the height of the dot-com bubble), raised only $25.1 billion in 2006, a 2% decline from 2005 and a significant decline from their peak. [81] The following year, despite the onset of turmoil in the credit markets in the summer, saw yet another record year of fundraising with $302 billion of investor commitments to 415 funds. [82]

The largest buyouts of this period included: Georgia-Pacific Corp (2005), Albertson's (2006), EQ Office (2006), Freescale Semiconductor (2006), GMAC (now Ally Financial) (2006), HCA (2006), Kinder Morgan (2006), Harrah's Entertainment (2006), TDC A/S (2006), Sabre Holdings (2006), Travelport (2006), Alliance Boots (2007), Biomet (2007), Chrysler (2007), First Data (2007) and TXU (2007).

Publicly traded private equity

Although there had previously been certain instances of publicly traded private equity vehicles, the convergence of private equity and the public equity markets attracted significantly greater attention when several of the largest private equity firms pursued various options through the public markets. Taking private equity firms and private equity funds public appeared an unusual move, since private equity funds often buy public companies listed on an exchange and then take them private. Private equity firms are rarely subject to the quarterly reporting requirements of the public markets and tout this independence to prospective sellers as a key advantage of going private. Nevertheless, there were fundamentally two separate opportunities that private equity firms pursued in the public markets. These options involved a public listing of either:

  • A private equity firm (the management company), which provides shareholders an opportunity to gain exposure to the management fees and carried interest earned by the investment professionals and managers of the private equity firm. The most notable example of this public listing was completed by The Blackstone Group in 2007.
  • A private equity fund or similar investment vehicle, which allows investors that would otherwise be unable to invest in a traditional private equity limited partnership to gain exposure to a portfolio of private equity investments.

In May 2006, Kohlberg Kravis Roberts raised $5 billion in an initial public offering for a new permanent investment vehicle (KKR Private Equity Investors or KPE) listing it on the Euronext exchange in Amsterdam (ENXTAM: KPE). KKR raised more than three times what it had expected at the outset as many of the investors in KPE were hedge funds that sought exposure to private equity but that could not make long term commitments to private equity funds. Because private equity had been booming in the preceding years, the proposition of investing in a KKR fund appeared attractive to certain investors. [83] KPE's first-day performance was lackluster, trading down 1.7% and trading volume was limited. [84] Initially, a handful of other private equity firms, including Blackstone, and hedge funds had planned to follow KKR's lead but when KPE was increased to $5 billion, it soaked up all the demand. [85] That, together with the slump of KPE's shares, caused the other firms to shelve their plans. KPE's stock declined from an IPO price of €25 per share to €18.16 (a 27% decline) at the end of 2007 and a low of €11.45 (a 54.2% decline) per share in Q1 2008. [86] KPE disclosed in May 2008 that it had completed approximately $300 million of secondary sales of selected limited partnership interests in and undrawn commitments to certain KKR-managed funds in order to generate liquidity and repay borrowings. [87]

On March 22, 2007, after nine months of secret preparations, the Blackstone Group filed with the SEC [88] to raise $4 billion in an initial public offering. On June 21, Blackstone sold a 12.3% stake in its ownership to the public for $4.13 billion in the largest U.S. IPO since 2002. [89] Traded on the New York Stock Exchange under the ticker symbol BX, Blackstone priced at $31 per share on June 22, 2007. [90] [91]

Less than two weeks after the Blackstone Group IPO, rival firm Kohlberg Kravis Roberts filed with the SEC [92] in July 2007 to raise $1.25 billion by selling an ownership interest in its management company. [93] KKR had previously listed its KKR Private Equity Investors (KPE) private equity fund vehicle in 2006. The onset of the credit crunch and the shutdown of the IPO market would dampen the prospects of obtaining a valuation that would be attractive to KKR and the flotation was repeatedly postponed.

Other private equity investors were seeking to realize a portion of the value locked into their firms. In September 2007, the Carlyle Group sold a 7.5% interest in its management company to Mubadala Development Company, an investment vehicle owned by the government of Abu Dhabi, for $1.35 billion, which valued Carlyle at approximately $20 billion. [94] Similarly, in January 2008, Silver Lake Partners sold a 9.9% stake in its management company to the California Public Employees' Retirement System (CalPERS) for $275 million. [95]

Apollo Management completed a private placement of shares in its management company in July 2007. By pursuing a private placement rather than a public offering, Apollo would be able to avoid much of the public scrutiny applied to Blackstone and KKR. [96] [97] In April 2008, Apollo filed with the SEC [98] to permit some holders of its privately traded stock to sell their shares on the New York Stock Exchange. [99] In April 2004, Apollo had raised $930 million for a listed business development company, Apollo Investment Corporation (NASDAQ: AINV), to invest primarily in middle-market companies in the form of mezzanine debt and senior secured loans, as well as by making direct equity investments in companies. The company also invests in the securities of public companies. [100]

Historically, in the United States, there had been a group of publicly traded private equity firms that were registered as business development companies (BDCs) under the Investment Company Act of 1940. [101] Typically, BDCs are structured similarly to real estate investment trusts (REITs) in that the BDC structure reduces or eliminates corporate income tax. In return, BDCs are required to distribute 90% of their income, which may be taxable to their investors. As of the end of 2007, among the largest BDCs (by market value, excluding Apollo Investment Corp, discussed earlier) were: American Capital Strategies (NASDAQ: ACAS), Allied Capital Corp (NASDAQ: ALD), Ares Capital Corporation (NASDAQ: ARCC), Gladstone Investment Corp (NASDAQ: GAIN) and Kohlberg Capital Corp (NASDAQ: KCAP).

Secondary market and the evolution of the private equity asset class

In the wake of the collapse of the equity markets in 2000, many investors in private equity sought an early exit from their outstanding commitments. [102] The surge in activity in the secondary market, which had previously been a relatively small niche of the private equity industry, prompted new entrants to the market; however, the market was still characterized by limited liquidity and distressed prices, with private equity funds trading at significant discounts to fair value.

Beginning in 2004 and extending through 2007, the secondary market transformed into a more efficient market in which assets for the first time traded at or above their estimated fair values and liquidity increased dramatically. During these years, the secondary market transitioned from a niche sub-category in which the majority of sellers were distressed to an active market with ample supply of assets and numerous market participants. [103] By 2006 active portfolio management had become far more common in the increasingly developed secondary market, and an increasing number of investors had begun to pursue secondary sales to rebalance their private equity portfolios. The continued evolution of the private equity secondary market reflected the maturation and evolution of the larger private equity industry. Among the most notable publicly disclosed secondary transactions (it is estimated that over two-thirds of secondary market activity is never disclosed publicly) were sales by CalPERS (2008), the Ohio Bureau of Workers' Compensation (2007), MetLife (2007), Bank of America (2006 and 2007), Mellon Financial Corporation (2006), American Capital Strategies (2006), JPMorgan Chase, Temasek Holdings, Dresdner Bank and Dayton Power & Light.

In July 2007, the turmoil that had been affecting the mortgage markets spilled over into the leveraged finance and high-yield debt markets. [104] [105] The markets had been highly robust during the first six months of 2007, with highly issuer-friendly developments including PIK and PIK toggle notes (interest is "payable in kind") and covenant-light debt widely available to finance large leveraged buyouts. July and August saw a notable slowdown in issuance levels in the high yield and leveraged loan markets, with few issuers accessing the market. Uncertain market conditions led to a significant widening of yield spreads, which, coupled with the typical summer slowdown, led many companies and investment banks to put their plans to issue debt on hold until the autumn. However, the expected rebound in the market after Labor Day 2007 did not materialize, and the lack of market confidence prevented deals from pricing. By the end of September, the full extent of the credit situation became obvious as major lenders including Citigroup and UBS AG announced major writedowns due to credit losses. The leveraged finance markets came to a near standstill. [106] As a result of the sudden change in the market, buyers would begin to withdraw from or renegotiate the deals agreed at the top of the market, most notably in transactions involving Harman International (announced and withdrawn 2007), Sallie Mae (announced 2007 but withdrawn 2008), Clear Channel Communications (2007) and BCE (2007).

The credit crunch prompted buyout firms to pursue a new group of transactions in order to deploy their massive investment funds. These transactions included Private Investment in Public Equity (or PIPE) transactions as well as purchases of debt in existing leveraged buyout transactions. Some of the most notable of these transactions completed in the depths of the credit crunch include Apollo Management's acquisition of the Citigroup Loan Portfolio (2008) and TPG Capital's PIPE investment in Washington Mutual (2008). According to investors and fund managers, the consensus among industry members in late 2009 was that private equity firms would need to become more like asset managers, offering buyouts as just part of their portfolio, or else focus tightly on specific sectors in order to prosper. The industry would also need to become better at adding value by turning businesses around rather than relying on pure financial engineering. [107]

1980s reflections of private equity

Although private equity rarely received a thorough treatment in popular culture, several films did feature stereotypical "corporate raiders" prominently. The most notable examples of private equity featured in motion pictures included:

  • Wall Street (1987) – The notorious "corporate raider" and "greenmailer" Gordon Gekko, representing a synthesis of the worst features of various famous private equity figures, manipulates an ambitious young stockbroker in order to take over a failing but decent airline. Although Gekko makes a pretense of caring about the airline, his intentions prove to be to destroy the airline, strip its assets and lay off its employees before raiding the corporate pension fund. Gekko would become a symbol in popular culture for unrestrained greed (with the signature line, "Greed, for lack of a better word, is good") that would be attached to the private equity industry.
  • Other People's Money (1991) – A self-absorbed corporate raider, "Larry the Liquidator" (Danny DeVito), sets his sights on New England Wire and Cable, a small-town business run by the family patriarch (Gregory Peck), who is principally interested in protecting his employees and the town.
  • Pretty Woman (1990) – Although Richard Gere's profession is incidental to the plot, his character is a corporate raider who intends to destroy the hard work of a family-run business by acquiring the company in a hostile takeover and then selling off the company's parts for a profit (compared in the movie to an illegal chop shop). Ultimately, the corporate raider is won over and chooses not to pursue his original plans for the company.

Two other works were pivotal in framing the image of buyout firms. [108] Barbarians at the Gate, the 1990 best seller about the fight over RJR Nabisco, linked private equity to hostile takeovers and assaults on management. A blistering story on the front page of the Wall Street Journal the same year about KKR's buyout of the Safeway supermarket chain painted a much more damaging picture. [109] The piece, which later won a Pulitzer Prize, began with the suicide of a Safeway worker in Texas who had been laid off and went on to chronicle how KKR had sold off hundreds of stores after the buyout and slashed jobs.

Contemporary reflections of private equity and private equity controversies

The Carlyle Group featured prominently in Michael Moore's 2004 film Fahrenheit 9/11. The film suggested that The Carlyle Group exerted tremendous influence on U.S. government policy and contracts through its relationship with the president's father, George H. W. Bush, a former senior adviser to the Carlyle Group. Moore cited relationships with the Bin Laden family. The movie quotes author Dan Briody claiming that the Carlyle Group "gained" from September 11 because it owned United Defense, a military contractor, although the firm's $11 billion Crusader artillery system developed for the U.S. Army is one of the few weapons systems canceled by the Bush administration. [110]

Over the next few years, attention intensified on private equity as the size of transactions and the profile of the companies increased. The attention would increase significantly following a series of events involving The Blackstone Group: the firm's initial public offering and the birthday celebration of its CEO. The Wall Street Journal, observing Blackstone Group's Steve Schwarzman's 60th birthday celebration in February 2007, described the event as follows: [111]

The Armory's entrance hung with banners painted to replicate Mr. Schwarzman's sprawling Park Avenue apartment. A brass band and children clad in military uniforms ushered in guests. A huge portrait of Mr. Schwarzman, which usually hangs in his living room, was shipped in for the occasion. The affair was emceed by comedian Martin Short. Rod Stewart performed. Composer Marvin Hamlisch did a number from "A Chorus Line." Singer Patti LaBelle led the Abyssinian Baptist Church choir in a tune about Mr. Schwarzman. Attendees included Colin Powell and New York Mayor Michael Bloomberg. The menu included lobster, baked Alaska and a 2004 Maison Louis Jadot Chassagne Montrachet, among other fine wines.

Schwarzman received a severe backlash from both critics of the private equity industry and fellow investors in private equity. The lavish event reminded many of the excesses of notorious executives, including Bernie Ebbers (WorldCom) and Dennis Kozlowski (Tyco International). David Bonderman, the founder of TPG Capital, remarked, "We have all wanted to be private – at least until now. When Steve Schwarzman's biography with all the dollar signs is posted on the web site none of us will like the furor that results – and that's even if you like Rod Stewart." [111] As the IPO drew closer, there were moves by a number of congressmen and senators to block the stock offering and to raise taxes on private equity firms and/or their partners – proposals many attributed in part to the extravagance of the party. [112]

David Rubenstein's fears would be confirmed when, in 2007, the Service Employees International Union launched a campaign against private equity firms, specifically the largest buyout firms, through public events, protests, leafleting and web campaigns. [113] [114] [115] A number of leading private equity executives were targeted by the union's members; [116] however, the SEIU's campaign was not nearly as effective at slowing the buyout boom as the credit crunch of 2007 and 2008 would ultimately prove to be.

In 2008, the SEIU would shift part of its focus from attacking private equity firms directly toward highlighting the role of sovereign wealth funds in private equity. The SEIU pushed legislation in California that would disallow investments by state agencies (particularly CalPERS and CalSTRS) in firms with ties to certain sovereign wealth funds. [117] The SEIU has also criticized the tax treatment of carried interest. The SEIU, and other critics, point out that many wealthy private equity investors pay taxes at lower rates (because the majority of their income is derived from carried interest, payments received from the profits on a private equity fund's investments) than many of the rank-and-file employees of a private equity firm's portfolio companies. [118]


The History of the Death Penalty: A Timeline

Eighteenth Century B.C. - first established death penalty laws.

Eleventh Century A.D. - William the Conqueror will not allow persons to be hanged except in cases of murder.

1608 - Captain George Kendall becomes the first recorded execution in the new colonies.

1632 - Jane Champion becomes the first woman executed in the new colonies.

1767 - Cesare Beccaria’s essay, On Crimes and Punishment, theorizes that there is no justification for the state to take a life.

Late 1700s - United States abolitionist movement begins.

Early 1800s - Many states reduce their number of capital crimes and build state penitentiaries.

1823-1837 - Over 100 of the 222 crimes punishable by death in Britain are eliminated.

1834 - Pennsylvania becomes the first state to move executions into correctional facilities.

1838 - Discretionary death penalty statutes enacted in Tennessee.

1847 - Michigan becomes the first state to abolish the death penalty for all crimes except treason.

1890 - William Kemmler becomes the first person executed by electrocution.

Early 1900s - Beginning of the “Progressive Period” of reform in the United States.

1907-1917 - Nine states abolish the death penalty for all crimes or strictly limit it.

1920s - 1940s - American abolition movement loses support.

1924 - The use of cyanide gas is introduced as an execution method.

1930s - Executions reach the highest levels in American history - average 167 per year.

1948 - The United Nations General Assembly adopts the Universal Declaration of Human Rights proclaiming a “right to life.”

1950-1980 - De facto abolition becomes the norm in western Europe.

1958 - Trop v. Dulles. Eighth Amendment’s meaning contained an “evolving standard of decency that marked the progress of a maturing society.”

1966 - Support of capital punishment reaches all-time low. A Gallup poll shows support of the death penalty at only 42%.

1968 - Witherspoon v. Illinois. Dismissing potential jurors solely because they express opposition to the death penalty held unconstitutional.

1970 - Crampton v. Ohio and McGautha v. California. The Supreme Court approves of unfettered jury discretion and non-bifurcated trials.

June 1972 - Furman v. Georgia. Supreme Court effectively voids 40 death penalty statutes and suspends the death penalty.

1976 - Gregg v. Georgia. Guided discretion statutes approved. Death penalty reinstated.

January 17, 1977 - Ten-year moratorium on executions ends with the execution of Gary Gilmore by firing squad in Utah.

1977 - Oklahoma becomes the first state to adopt lethal injection as a means of execution.

1977 - Coker v. Georgia. Held death penalty is an unconstitutional punishment for rape of an adult woman when the victim is not killed.

December 7, 1982 - Charles Brooks becomes the first person executed by lethal injection.

1984 - Velma Barfield becomes the first woman executed since reinstatement of the death penalty.

1986 - Ford v. Wainwright. Execution of insane persons banned.

1986 - Batson v. Kentucky. Prosecutor who strikes a disproportionate number of citizens of the same race in selecting a jury is required to rebut the inference of discrimination by showing neutral reasons for his or her strikes.

1987 - McCleskey v. Kemp. Racial disparities not recognized as a constitutional violation of “equal protection of the law” unless intentional racial discrimination against the defendant can be shown.

1988 - Thompson v. Oklahoma. Executions of offenders age fifteen and younger at the time of their crimes is unconstitutional.

1989 - Stanford v. Kentucky, and Wilkins v. Missouri. Eighth Amendment does not prohibit the death penalty for crimes committed at age sixteen or seventeen.

1989 - Penry v. Lynaugh. Executing persons with “mental retardation” is not a violation of the Eighth Amendment.

1993 - Herrera v. Collins. In the absence of other constitutional grounds, new evidence of innocence is no reason for federal court to order a new trial.

1994 - President Clinton signs the Violent Crime Control and Law Enforcement Act expanding the federal death penalty.

1996 - President Clinton signs the Anti-Terrorism and Effective Death Penalty Act restricting review in federal courts.

1998 - Karla Faye Tucker and Judi Buenoano executed.

November 1998 - Northwestern University holds the first-ever National Conference on Wrongful Convictions and the Death Penalty. The Conference brings together 30 inmates who were freed from death row because of innocence.

January 1999 - Pope John Paul II visits St. Louis, Missouri, and calls for an end to the death penalty.

April 1999 - U.N. Human Rights Commission Resolution Supporting Worldwide Moratorium On Executions.

June 1999 - Russian President Boris Yeltsin signs a decree commuting the death sentences of all of the convicts on Russia’s death row.

January 2000 - Illinois Governor George Ryan declares a Moratorium on executions and appoints a blue-ribbon Commission on Capital Punishment to study the issue.

2002 - Ring v. Arizona. A death sentence where the necessary aggravating factors are determined by a judge violates a defendant’s constitutional right to a trial by jury.

2002 - Atkins v. Virginia. The execution of “mentally retarded” defendants violates the Eighth Amendment’s ban on cruel and unusual punishment.

January 2003 - Gov. George Ryan grants clemency to all of the remaining 167 death row inmates in Illinois because of the flawed process that led to these sentences.

June 2004 - New York’s death penalty law declared unconstitutional by the state’s high court.

March 2005 - In Roper v. Simmons, the United States Supreme Court ruled that the death penalty for those who had committed their crimes under 18 years of age was cruel and unusual punishment.

December 2007 - The New Jersey General Assembly votes to become the first state to legislatively abolish capital punishment since it was re-instated in 1976.

February 2008 - The Nebraska Supreme Court rules electrocution, the sole execution method in the state, to be cruel and unusual punishment, effectively freezing all executions in the state.

June 2008 - Kennedy v. Louisiana. Capital punishment cannot apply to those convicted of child rape where no death occurs.

March 2009 - Governor Bill Richardson signs legislation to repeal the death penalty in New Mexico, replacing it with life without parole.

March 2011 - Governor Pat Quinn signs legislation to repeal the death penalty in Illinois, replacing it with life without parole.



The Pre-Columbian Era is the time before Christopher Columbus went to the Americas in 1492. At that time, Native Americans lived on the land that is now controlled by the United States. They had various cultures: Native Americans in the Eastern Woodlands hunted game and deer; Native Americans in the Northwest fished; Native Americans in the Southwest grew corn and built houses called pueblos; and Native Americans in the Great Plains hunted bison. [1] [2] Around the year 1000, the Vikings visited Newfoundland. However, they did not settle there. [3]

The English tried to settle at Roanoke Island in 1585. [4] The settlement did not last, and no one knows what happened to the people. In 1607, the first lasting English settlement was made at Jamestown, Virginia, by John Smith, John Rolfe and other Englishmen interested in gold and adventure. [5] In its early years, many people in Virginia died of disease and starvation. The colony in Virginia lasted because it made money by planting tobacco. [6]

In 1620, a group of Englishmen called the Pilgrims settled at Plymouth, Massachusetts. [7] A bigger colony was built at Massachusetts Bay by the Puritans in 1630. [8] The Pilgrims and the Puritans were interested in making a better society, not looking for gold. They called this ideal society a "city on a hill". [9] A man named Roger Williams left Massachusetts after disagreeing with the Puritans, and started the colony of Rhode Island in 1636. [10]

Great Britain was not the only country to settle what would become the United States. In the 1500s, Spain built a fort at Saint Augustine, Florida. [11] France settled Louisiana, and the area around the Great Lakes. The Dutch settled New York, which they called New Netherland. Other areas were settled by Scotch-Irish, Germans, and Swedes. [12] [13] However, in time Britain controlled all of the colonies, and most American colonists adopted the British way of life. The growth of the colonies was not good for Native Americans. [14] Many of them died of smallpox, a disease brought to America by the Europeans. The ones who lived lost their lands to the colonists. [14]

In the early 1700s, there was a religious movement in the colonies called the Great Awakening. [15] Preachers such as Jonathan Edwards preached sermons. [15] One of them was called "Sinners in the Hands of an Angry God". The Great Awakening may have led to the thinking used in the American Revolution. [16]

By 1733, there were thirteen colonies. New York City, Philadelphia, Boston, and Charleston were the largest cities and main ports at that time. [17]

From 1756 to 1763, England and France fought a war over their land in America called the Seven Years' War or the French and Indian War, which the British won. [18] After the war, the Royal Proclamation of 1763 said that the colonists could not live west of the Appalachian Mountains. Many colonists who wanted to move to the frontier did not like the Proclamation. [19]

After the French and Indian War, the colonists began to think that they were not getting their "rights as freeborn Englishmen". [20] This meant they wanted to be treated fairly by the English government. This was mainly caused by new taxes the British made the colonies pay to help cover the cost of the war. [21] Americans called this "No taxation without representation", meaning that the colonists should not have to pay taxes unless they had votes in the British Parliament. [21] Each tax was disliked and replaced by another, which led to more unity between the colonies. In 1770, colonists in Boston known as the Sons of Liberty got in a fight with British soldiers. This became known as the Boston Massacre. [22] After the Tea Act, the Sons of Liberty dumped hundreds of boxes of tea into the sea. This was known as the Boston Tea Party (1773). [23] [24] This led to the British Army taking over Boston. [25] After that, leaders of the 13 colonies formed a group called the Continental Congress. [26] Many people were members of the Continental Congress, but some of the more important ones were Benjamin Franklin, John Adams, Thomas Jefferson, John Hancock, Roger Sherman and John Jay. [27]

In 1776, Thomas Paine wrote a pamphlet called Common Sense. It argued that the colonies should be free of English rule. [28] This was based on the English ideas of natural rights and social contract put forth by John Locke and others. [29] On July 4, 1776, people from the 13 colonies agreed to the United States Declaration of Independence. This said that they were free and independent states, and were not part of England any more. [30] The colonists were already fighting Britain in the Revolutionary War at this time. The Revolutionary War started in 1775 at Lexington and Concord. [31] Though American soldiers under George Washington lost many battles to the British, they won a major victory at Saratoga in 1777. [32] This led to France and Spain joining the war on the side of the Americans. In 1781, an American victory at Yorktown, helped by the French, led Britain to decide to stop fighting and give up the colonies. [33] America had won the war and its independence.

In 1781, the colonies formed a confederation of states under the Articles of Confederation, but it lasted only six years. It gave almost all the power to the states and very little to the central government. [34] The confederation had no president. It could not remove Native Americans or the British from the frontier, nor could it stop mob uprisings such as Shays' Rebellion. [35] After Shays' Rebellion, many people thought the Articles of Confederation were not working. [36]

In 1787, a constitution was written. Many of the people who helped write the Constitution, such as Washington, James Madison, Alexander Hamilton and Gouverneur Morris, were among the major thinkers in America at the time. [13] Some of these men would later hold important offices in the new government. The constitution created a stronger national government that had three branches: executive (the President and his staff), legislative (the House of Representatives and the Senate), and judicial (the federal courts). [37]

Some states agreed to the Constitution very quickly. In other states, many people did not like the Constitution because it gave more power to the central government and had no bill of rights. [38] [39] To try to get the Constitution passed, Madison, Hamilton and Jay wrote a series of newspaper articles called the Federalist Papers. [38] [39] Very soon after, the Bill of Rights was added. This was a set of 10 amendments (changes) that limited the government's power and guaranteed rights to the citizens. [40] Like the Declaration of Independence, the Constitution is a social contract between the people and the government. [41] The main idea of the Constitution is that the government is a republic (a representative democracy) elected by the people, who all have the same rights. However, this was not true at first, when only white males who owned property could vote. [42] Because of state laws as well as the 14th, 15th, 19th, 24th and 26th Amendments, almost all American citizens who are at least 18 years old can vote today. [37]

In 1789, Washington was elected the first President. He defined how a person should act as President and retired after two terms. [43] During Washington's term, there was a Whiskey Rebellion, where country farmers tried to stop the government from collecting taxes on whiskey. [44] In 1795, Congress passed the Jay Treaty, which allowed for increased trade with Britain in exchange for the British giving up their forts on the Great Lakes. [45] However, Great Britain was still doing things that hurt the U.S., such as impressment (making American sailors join the British Royal Navy). [46]

John Adams defeated Thomas Jefferson in the election of 1796 to become the second President of the United States. This was the first American election that was between two political parties. [47] As president, Adams made the army and navy larger. [48] He also got the Alien and Sedition Acts passed, which were much disliked. [49]

In the election of 1800, Jefferson defeated Adams. One of the most important things he did as President was to make the Louisiana Purchase from France, which made the United States twice as big. [50] Jefferson sent Lewis and Clark to map the Louisiana Purchase. [13] Jefferson also tried to stop trade with England and France so that the United States would not become involved in a war the two countries were fighting. [51] Fighting broke out between the United States and England in 1812 when James Madison was President. This was called the War of 1812. [52]

One of the problems of this period was slavery. By 1861, over three million African-Americans were enslaved in the South. [53] This means that they worked for other people, but had no freedom and received no money for their work. Most worked picking cotton on large plantations. Cotton became the main crop in the South after Eli Whitney invented the cotton gin in 1793. [54] There were a few slave rebellions against slavery, including one led by Nat Turner. All of these rebellions failed. [55] The South wanted to keep slavery, but by the time of the Civil War, many people in the North wanted to end it. [56] Another argument between the North and South was about the role of government. The South wanted stronger state governments, but the North wanted a stronger central government. [56]

After the War of 1812 the Federalist Party faded away, leaving an "Era of Good Feelings" in which only one party was important, under Presidents James Madison and James Monroe. [57] Under Monroe, the United States' policy in North America was the Monroe Doctrine, which suggested that Europe should stop trying to control the United States and other independent countries in the Americas. [58] Around this time, Congress called for something called the "American System". [59] The American System meant spending money on banking, transportation and communication. Due to the American System, bigger cities and more factories were built. [60] One of the big transportation projects of this time was the Erie Canal, a canal in the state of New York. [61] By the 1840s, railroads were built as well as canals. By 1860, thousands of miles of railroads and telegraph lines had been built in the United States, mostly in the Northeast and Midwest. [62]

In the early 19th century, the industrial revolution came to America. Many factories were built in Northern cities such as Lowell, Massachusetts. [13] Most of them made clothes. Many factory workers were women, and some were children or people from Ireland or Germany. [63] [64] Despite this industrialization, America was still a nation of farmers. [65]

In the early and mid-1800s, there was a religious movement called the Second Great Awakening. Thousands of people gathered at large religious meetings called revivals. [66] They thought they could bring about a Golden Age in America through religion. [67] New religious movements such as the Holiness Movement and the Mormons started, and groups like the Methodist Church grew. [68] The Second Great Awakening led to two movements in reform, that is, changing laws and behaviors to make society better. [69] One of these was the Temperance Movement, which believed that drinking alcohol was evil. The other was abolitionism, which tried to end slavery. People such as Harriet Beecher Stowe and William Lloyd Garrison wrote books and newspapers saying that slavery should stop. [13] They also formed political movements, which included the Liberty Party, the Free Soil Party and the Republican Party. [70] Some abolitionists, such as Frederick Douglass, were former slaves. By 1820, slavery was very rare in the North, but continued in the South. [13]

In the 19th century, there was something called the “cult of domesticity” for many American women. This meant that most married women were expected to stay in the home and raise children. [71] As in other countries, American wives were very much under the control of their husband, and had almost no rights. Women who were not married had only a few jobs open to them, such as working in clothing factories and serving as maids. [72] By the 19th century, women such as Lucretia Mott and Elizabeth Cady Stanton thought that women should have more rights. In 1848, many of these women met and agreed to fight for more rights for women, including voting. [73] Many of the women involved in the movement for women’s rights were also involved in the movement to end slavery. [13]

In 1828, Andrew Jackson was elected President. He was the first president elected from the Democratic Party. He changed the government in many ways. Since many of his supporters were poor people who had not voted before, he rewarded them with government jobs, which is called "spoils" or "patronage". [13] Because of Jackson, a new party was formed to run against him called the Whigs. This was called the "Second Party System". [74] Jackson was very much against the National Bank. He saw it as a symbol of Whigs and of powerful American businessmen. [13] [75] Jackson also called for a high import tax that the South did not like. They called it the "Tariff of Abominations". [56] Jackson’s Vice-President, John C. Calhoun, was from the South. He wrote that the South should stop the tariff and perhaps leave the Union (secession). These words would be used again during the Civil War. [56]

People started to move west of the Mississippi River and the Rocky Mountains at this time. The first people who moved west were people who caught and sold animal skins such as John Colter and Jim Bridger. [76] [77] By the 1840s, many people were moving to Oregon by wagon, and even more people went west after the California Gold Rush of 1849. [78] [79] Many new states were added to the first thirteen, mostly in the Midwest and South before the Civil War and in the West after the Civil War. During this period, Native Americans lost much of their land. They had lost military battles to the Americans at Tippecanoe and in the Seminole War. [80] In the 1830s, Indians were being pushed out of the Midwest and South by events such as the Trail of Tears and the Black Hawk War. [81] By the 1840s, most Native Americans had been moved west of the Mississippi River.

The Mexican–American War

In 1845, Texas, which was a nation after it left Mexico, joined the United States. [82] Mexico did not like this, and the Americans wanted the land Mexico had on the West Coast (“Manifest Destiny”). [83] This led to the U.S. and Mexico fighting a war called the Mexican-American War. During the war, the U.S. captured the cities of San Francisco, Los Angeles, Monterrey, Veracruz and Mexico City. [84] As a result of the war, the U.S. gained land in California and much of the American Southwest. Many people in the North did not like this war, because they thought it was just good for Southern slave states. [85]

In the 1840s and 1850s, people in the Northern states and people in the Southern states did not agree whether slavery was right or wrong in the territories—parts of the United States that were not yet states. [86] People in the government tried to make deals to stop a war. Some deals were the Compromise of 1850 and the Kansas-Nebraska Act, but they did not really work to keep the Union together. [87] People in the South were angry at books like Uncle Tom’s Cabin that said that slavery was wrong. People in the North did not like a Supreme Court decision called Dred Scott that kept Scott a slave. [88] People from the South and people from the North started killing each other in Kansas over slavery. This was called "Bleeding Kansas". [13] One of the people from Bleeding Kansas, John Brown, took over a town in Virginia in 1859 to make a point about slavery being wrong and to try to get slaves to fight their owners. [89]

In the election of 1860, the Democratic Party split and the Republican candidate for President, Abraham Lincoln, was elected. After this, many Southern states left the Union; eventually, eleven states left. They tried to start a new country called the Confederate States of America, or the "Confederacy". [90] A war began between the Union (North) and the Confederacy (South). Because the South had few factories, it was harder for Southern soldiers to get guns and uniforms. [91] The South also could not get supplies because Northern ships blockaded the Southern coast. [92]

Early in the war, Confederate generals such as Robert E. Lee and Stonewall Jackson won battles over Union generals such as George B. McClellan and Ambrose Burnside. [93] In 1862 and 1863, the Union Army tried several times to take the Confederate capital of Richmond, Virginia, but failed each time. [94] Lee's army invaded the North twice, but was turned back at Antietam and Gettysburg. [92] In the middle of the war, Lincoln issued the Emancipation Proclamation, which freed all slaves in the Confederacy, and he began letting black men fight in the Union Army. [95] The war started going the Union's way after the battles of Gettysburg and Vicksburg in 1863. Gettysburg stopped Lee from invading the North, and Vicksburg gave the Union control of the Mississippi River. [92] In 1864, a Union Army under William T. Sherman marched through Georgia and destroyed much of it. [96] By 1865, Union General Ulysses S. Grant had taken Richmond and forced Lee to surrender at Appomattox. [97]

In April 1865, Lincoln was shot and killed while watching a play. The new president, Andrew Johnson, had to oversee Reconstruction, the process of putting the United States back together after the Civil War. During this time, the 13th, 14th, and 15th Amendments to the Constitution were passed, freeing the slaves, making them citizens, and giving them the right to vote. [98] Congress was run by "Radical Republicans", who wanted to punish the South after the Civil War. [99] They did not like Johnson and almost removed him from office. [99] They also sent many soldiers to the South, installed unpopular "scalawag" governments, and made Southern states accept the 14th and 15th Amendments. [100] White Southerners resented this and later passed "Jim Crow" laws that kept black people in an inferior position. [101] White Southerners also started a group called the Ku Klux Klan that attacked black people and stopped them from voting. [102]

During this time, many people moved to the United States from other countries, such as Ireland, Italy, Germany, Eastern Europe, and China. [103] Many of them worked in large factories and lived in big cities, such as New York City, Chicago, and Boston, often in small, crowded, run-down apartments called "tenements" or "slums". [104] They were often used by "political machines", which gave them jobs and money in exchange for votes. [104]

Major politicians were chosen by political machines and were often corrupt. [105] The government could do little, and leaders of big businesses often had more power than the government. [105] At this time, there were several very large businesses called trusts. The people who ran the trusts made millions of dollars while paying their workers low wages. Some of these people were John D. Rockefeller, Andrew Carnegie, and J.P. Morgan. [106]

After the Civil War, people continued to move west, where new states were formed. People could now get free land in the West because of an 1862 law called the Homestead Act. [107] Even so, most of the land in the West was owned by the government, railroads, or large farmers. [13] The Transcontinental Railroad, finished in 1869, helped move people and goods between the West and the rest of the country. Chicago became the center of trade between West and East because many rail lines met there. [108] As more people moved west, conflicts grew between white settlers and the native Indians, and many more Indians were killed at battles such as Wounded Knee. [109] Almost all of the Indians' land was taken away by laws like the Dawes Act. [110]

Many Americans thought the railroads charged farmers so much money that it made them poor. [111] Workers led several strikes against the railroads that were put down by the army, and farmers started groups to fight the railroads, such as the Grange. [112] These groups grew into the Populist Movement, which almost won the presidency under William Jennings Bryan. The Populists wanted reforms such as an income tax and the direct election of Senators. [113] The Populist Party died out after 1896, but many of the things the Populists wanted would come about during the Progressive Era. [114]

In the United States, progressivism was the belief that the government should play a larger role in the economy to provide good living standards for people, especially workers. [115] Imperialism was the belief that the U.S. should build a stronger navy and conquer land overseas.

In the late nineteenth and early twentieth centuries, the U.S. became more active in foreign affairs. In 1898, the United States fought a war with Spain called the Spanish–American War. The United States won, and gained Puerto Rico, Guam, Guantanamo Bay, and the Philippines. [116] Combined with the purchase of Alaska and the taking over of Hawaii, the United States had gained all the territory it has today, plus some it would later give up after World War II. [117] Around this time, the U.S. and European nations opened up trade with China. This happened after European powers had defeated China in the Opium Wars and an international force had put down the Boxer Rebellion. The U.S. and Europe were then able to trade with China through the Open Door Policy. [118]

In 1901, Theodore Roosevelt became President. He had been a soldier in the Spanish–American War. He called for a foreign policy known as the "Big Stick". [119] This meant having a large navy and exercising control over Latin America. [120] [121] Between 1901 and 1930, the United States sent soldiers into Latin America several times. [121] While Roosevelt was president, work began on the Panama Canal, a link between the Pacific and Atlantic Oceans that made sea travel between them much faster. [122]

During this time, people started to notice the poor condition of American cities. A group of writers called the "muckrakers" wrote books and newspaper articles about subjects like the power of big business, unclean practices in factories, and the condition of poor people. [13] Roosevelt and Congress answered their concerns with laws such as the Pure Food and Drug Act, which regulated how food was made to make sure it was safe. [123] Another response to the muckrakers was "trust-busting", in which big businesses were broken up into smaller ones. [124] The biggest business broken up this way was the Standard Oil Company in 1911. [125]

In 1912, Woodrow Wilson became President. He was a Progressive, but not quite the same kind as Roosevelt. [126] [127] He fought the "triple wall of privilege": tariffs (taxes on goods coming into the United States), the banks, and the trusts (big business). [13] During this time, the Sixteenth and Seventeenth Amendments to the U.S. Constitution were passed. They allowed for a federal income tax and the direct election of U.S. Senators. [128]

The United States did not want to enter World War I, [129] but it wanted to sell weapons to both sides. In 1915, a German submarine sank the Lusitania, a ship carrying Americans. [129] This angered Americans, and Germany stopped attacking passenger ships. In January 1917, Germany started attacking them again and sent the Zimmermann Telegram to Mexico about invading the U.S. [130] The United States then joined the war against Germany, which ended a year later. Wilson worked to create an international organization called the League of Nations, whose main goal was preventing war. [131] However, the United States did not join, because isolationists rejected the peace treaty. [132] At the end of World War I, a flu pandemic killed millions of people in the U.S. and Europe. [133] After the war, the United States was one of the richest and most powerful nations in the world. [134]

The "Roaring Twenties" Edit

The 1920s were an era of growth and increasing wealth for the United States. Many Americans began buying consumer products, such as Model T Fords and household appliances. [136] Advertising became very important to American life. [136] During this time, many black people moved out of the South and into large cities such as New York City, Chicago, St. Louis, and Los Angeles. [137] They brought with them jazz music, which is why the 1920s are called the "Jazz Age". [136] The 1920s were also the Prohibition Era, which began after the Eighteenth Amendment was passed. [138] During the 1920s, drinking alcohol was illegal, but many Americans drank it anyway. [136] This led to widespread rum-running and violent crime. [136]

Racism was strong in the 1920s. The Ku Klux Klan became powerful once again and attacked black people, Catholics, Jews, and immigrants. [139] People blamed the war and problems in business on immigrants and labor leaders, whom they accused of being Bolsheviks (Russian communists). [13] [140] Many people also thought the United States had lost touch with religion. Some responded by changing how they practiced religion, and some by attacking science. [136]

After World War I, the United States followed an isolationist foreign policy, meaning it did not want to enter another global war. It passed laws and treaties that were supposed to end war forever, and it refused to sell weapons to its former allies. [141]

In 1921, Warren G. Harding became President. He believed that the best way to strengthen the economy was for the government to be friendly to big business by cutting taxes and regulating less. [142] While the economy did very well under these policies, the gap between how much the rich had and how much the poor had was the largest it had ever been. [143] Harding's presidency had several scandals. The biggest was Teapot Dome, which involved oil drilling in the Navy's oil reserves. [144] Harding died in 1923, and Calvin Coolidge became President. Coolidge believed, like Harding, that the government should keep out of business, and he continued many of Harding's policies. [135] Coolidge chose not to seek the presidency in 1928, and Herbert Hoover became president. [145]

The Great Depression

In 1929, the Great Depression hit the United States. The stock market crashed (lost much of its value), and many banks ran out of money and closed. [146] By 1932, over a quarter of workers had no jobs, and much of the nation was poor. [147] Many people were driven off their farms, not only because of the Depression, but also because of a series of dust storms known as the "Dust Bowl" and because farmers had not been doing well during the 1920s. [148]

President Hoover tried to do something about the Depression, but it did not work. [149] In 1932, he was defeated, and Franklin D. Roosevelt became President. Roosevelt created the New Deal, a series of government programs meant to provide relief (help for the people hurt by the bad economy), recovery (measures to make the economy better), and reform (changes to make sure such a depression never happened again). [150]

The New Deal included many programs, such as Social Security, the National Recovery Administration (which regulated wages), the Works Progress Administration (which built thousands of roads, schools, government buildings, and works of art), the Civilian Conservation Corps (which gave young people jobs helping the environment), and the Tennessee Valley Authority (which built dams and electric lines in the South). [148] These programs put millions of Americans to work, though often at low pay. [151] [152] Many of these programs were started early in Roosevelt's term, in a period called the "Hundred Days", or in 1935, in a period called the "Second New Deal". [153] Programs like Social Security grew out of populist movements led by people such as Huey Long, movements called "Share Our Wealth" and "Ham and Eggs". [153] The New Deal also led to the rise of workers' unions such as the Congress of Industrial Organizations. [13]

The New Deal is often called the period that "saved capitalism", and stopped America from becoming a Communist or Fascist state. [148] Although the New Deal improved the economy, it did not end the Great Depression. The Great Depression was ended by World War II. [154]

As World War II began, the United States said it would not get involved. Most Americans thought the United States should remain neutral, and some people even thought the United States should enter the war on the side of the Germans. [13] [141] Eventually, the U.S. did help the Allied Powers (the Soviet Union, Britain, and France) through the Lend-Lease Act, which gave the Allies large amounts of money and weapons in exchange for the use of air bases throughout the world. [155]

On December 7, 1941, Japan attacked Pearl Harbor, a U.S. naval base in Hawaii. [156] The U.S. was no longer neutral, and it declared war on the Axis Powers (Germany, Japan, and Italy). Entering World War II ended the Great Depression, because the war created many jobs. [154] While some of the battles the U.S. fought were air and naval battles with Japan, the U.S. mainly fought in Europe and Africa. [157] The U.S. opened several fronts, including in North Africa and Italy. [157] The U.S. also bombed Germany from the air, destroying German cities and factories. [157] On June 6, 1944 (D-Day), American and British forces invaded Normandy. Within a year, the Allies had freed France and taken Berlin. [153] In 1945, Roosevelt died, and Harry Truman became president. The U.S. decided to drop two atomic bombs on Japan. Japan surrendered soon afterwards, and the war ended.

The war meant different things for women and minorities. During the war, many women worked in weapons factories; they were symbolized by a character called "Rosie the Riveter". [158] [159] Many African-Americans served in the army, but often in segregated units with white officers. [160] Japanese-Americans on the West Coast were forced to live in internment camps, though some also served in the Army. [161]

Cold War

After World War II, the Soviet Union and the United States were the two most powerful countries left in the world. The Cold War was a period of tension between the two countries over their ways of life. Each tried to get other countries onto its side: the Soviet Union tried to get countries to become Communist, and the United States tried to stop them from becoming Communist. [162] American and Soviet soldiers never fought each other directly in battle, but they fought indirectly through the Korean War (1950s) and the Vietnam War (1950s–1970s). [163]

The Korean War lasted only a few years, but American soldiers have been stationed in Korea ever since. [164] The Vietnam War lasted much longer. It started with a few American troops in Vietnam, but by the 1960s hundreds of thousands of Americans had been sent there. [165] Both wars were fought between a Northern Communist government helped by the Soviet Union and Communist China and a Southern government helped by the U.S. The Korean War ended with Korea still split in two, but the Vietnam War ended with a Communist Vietnam after the United States withdrew because the American people wanted to end the war. [166] Over a quarter of a million Americans were killed or wounded in Vietnam, a war that was very much a military failure. [167] The U.S. and the Soviet Union also argued about where they could place nuclear weapons. One of these arguments was the Cuban Missile Crisis, during which the U.S. and the Soviet Union came very close to attacking each other with nuclear weapons. [168]

During the Cold War, the United States had a "Red Scare", in which the government tried to find people it thought were Communists. The House of Representatives had a group called the House Un-American Activities Committee to deal with this, and Joseph McCarthy led hearings in the Senate. [169] The Red Scare led to people losing their jobs, going to jail, and even being executed. [170] Many actors and authors were put on blacklists, meaning they could not get work in movies or credit for their writings. [13] [171]

The Cold War brought an arms race between the United States and the Soviet Union to see which could have more and better weapons. This started after the Soviet Union became the second country to develop an atomic bomb. [172] In the United States, the arms race gave rise to the "Military-Industrial Complex", in which business and government worked together to spend large amounts of money on large-scale weapons projects, each helping the other gain more money and power. [173] Part of the Complex was the Marshall Plan, which rebuilt Europe while requiring European countries to buy American goods. [174] The Complex allowed for a growing middle class, but it also kept the Cold War going. [173]

Besides the arms race, another part of the Cold War was the "Space Race". It started when the Soviets launched a satellite called Sputnik into space in 1957. [175] Americans became worried that the United States was falling behind the Soviet Union and made their schools focus more on mathematics and science. [176] Within a few years, both the United States and the Soviet Union had sent satellites, animals, and people into orbit. [175] In 1969, the Apollo 11 mission put Neil Armstrong and Buzz Aldrin on the Moon. [177]

United States foreign policy changed in the 1970s, when the United States left Vietnam and Richard Nixon left office because of a political scandal called Watergate. [13] In the 1970s and 1980s, the United States pursued a policy of "détente" with the Soviet Union. This meant that the two countries signed treaties to limit their nuclear weapons. [178] Under Nixon and Reagan, the United States sent troops and money to many Latin American governments to keep them from becoming Communist. [121] This led to violence in Latin America. [121] Around this time, the economy suffered because the United States was not manufacturing as much as it used to, and because some countries in the Middle East were not selling the U.S. as much oil as it wanted (an "oil embargo"). [162] The Middle East became very important in American foreign policy after Americans in Iran were taken hostage in 1979. [179] In the 1980s, people in the U.S. government sold weapons to Iran and gave the money to "Contra" fighters in Nicaragua. [180] This was called the "Iran-Contra affair". In the 1970s and 1980s, the U.S. also normalized relations with China. [181] The Cold War came to an end as the Communist governments of the Soviet Union and other countries fell apart. [182]

Domestic and social issues

After the war, the United States once again enjoyed prosperity. Millions of white people moved out of the cities into the suburbs, and into the Southern and Western states known as the "Sunbelt". [183] They bought new cars and television sets. [184] The birth rate rose in the 1940s and 1950s, in what was called the "Baby Boom". [185] The "Space Age" inspired "Googie"-style art and architecture. [186] Many more people became part of the middle class, but there were still many who were poor. [187]

Poverty was most common among African-Americans. Most lived in poor neighborhoods in Northern cities, or in the South, where they faced racism and "Jim Crow" segregation. [13] These conditions led to the Civil Rights Movement of the 1950s, led by Martin Luther King Jr. and others. In 1954, the Supreme Court found school segregation illegal in Brown v. Board of Education, though it would be several years before school segregation actually ended. [188] In 1955, King led a bus boycott in Montgomery, Alabama. [189] In the late 1950s and 1960s, King got help from Presidents John F. Kennedy, who was later assassinated, and Lyndon B. Johnson. [190] In 1963, King led a march on Washington calling for civil rights. Soon after, Congress passed laws that made most segregation illegal. [191] Johnson also launched a program called the Great Society that helped poor people and minorities. [192]

Gays and lesbians, who had often been persecuted, also started to ask for rights, beginning with the Stonewall riots in 1969. [193] Chicanos, Native Americans, old people, consumers, and people with disabilities also fought for rights, as did women. Though women had held jobs during World War II, most of them went back to the home after the war. [194] Women did not like that they often held jobs that paid less than men's or that fewer opportunities were open to them. [195] People like Betty Friedan and Gloria Steinem founded groups such as the National Organization for Women to try to solve these problems. NOW and other groups wanted an Equal Rights Amendment that would guarantee women equality in all areas. [196] In the 1970s and 1980s, many more jobs and opportunities were opened to women. Some women, like Phyllis Schlafly, opposed Friedan and Steinem and were known as "anti-feminists". [197] The Equal Rights Amendment was defeated partly because of the anti-feminists, but also because women had already gained equality in many areas and because many did not want women to be drafted into the army. [197]

In the 1960s, the counterculture emerged. [198] Some of its followers were called hippies. They had long hair, lived communally, smoked marijuana, and practiced free love. [199] The counterculture, along with college students, was among the groups most opposed to the Vietnam War. [200] These were also the groups that listened to the new music known as rock and roll. [201]

In 1973, the Supreme Court issued a decision called Roe v. Wade, which made many abortions legal. [202] These and other social changes led to a reaction by Jerry Falwell and other conservatives who called themselves the "Religious Right" and the "Moral Majority". [203]

Reagan Era

Ronald Reagan was elected President in 1980, defeating the incumbent, Jimmy Carter, by winning 44 of the 50 states. [13] At the time, the country was struggling with inflation, a weak economy, and setbacks in foreign policy. After becoming president, Reagan signed the Economic Recovery Tax Act of 1981, which lowered taxes for corporations, supposedly so they could reinvest the surplus profits in their businesses. Reagan also expanded the American military, which created more jobs but also raised the deficit because of overspending. During his first term, annual economic growth rose from about 4.5% to 7.2%.

In 1984, Reagan won re-election in a landslide, carrying 49 of the 50 states. During his second term, Reagan focused on ending the Cold War. He held many meetings with Margaret Thatcher, Pope John Paul II, and Soviet leader Mikhail Gorbachev, who had come to power in 1985. Reagan and Gorbachev first met at the Geneva Summit in 1985 and discovered a shared desire to end the Cold War. Reagan met with Gorbachev four times, and their summit conferences led to the signing of the Intermediate-Range Nuclear Forces Treaty.

Also during his second term, Reagan's invasion of Grenada and bombing of Libya were popular in the US, though his backing of the Contra rebels was mired in controversy when the Iran–Contra affair revealed his poor management style. [205]

Since leaving office in 1989, Reagan has remained one of the most popular Presidents of the United States. [13]

Post-Cold War era

In the late 1980s and early 1990s, the Cold War came to an end. This was due to Soviet leader Mikhail Gorbachev starting a policy called perestroika, the fall of the Berlin Wall, and the breakup of the Soviet Union into separate countries. [206] Around this time, the United States cut back on its production of inexpensive goods, and many more people worked in service jobs. [207] Some of these service jobs were in computers and the Internet, which came into wide use in the 1990s. [208] By this time, the United States had a very large trade deficit, meaning it received more goods from other countries, such as China, than it sent to them. [209]

The Middle East became the main focus of U.S. foreign policy. [210] In 1991, the United States fought a war with Iraq called the First Gulf War, or Operation Desert Storm. Its goal was to force Iraqi leader Saddam Hussein's forces out of Kuwait, a small oil-producing country they had invaded.

In 1992, Bill Clinton was elected President. Under Clinton, the United States sent soldiers into Bosnia as part of a United Nations mission. [13] The United States also agreed to a trade pact called the North American Free Trade Agreement, and Congress later repealed parts of the Glass–Steagall banking law. [211] Clinton was impeached for lying under oath about his relationship with Monica Lewinsky, but the Senate voted against removing him from office. [212]

21st century

Bush presidency

In 2000, George W. Bush was elected President. On September 11, 2001, terrorists attacked the World Trade Center and the Pentagon, and thousands of people died. Soon after the attacks, the U.S. and NATO went to Afghanistan to find Osama bin Laden and others they believed had planned the September 11 attacks. In 2003, the United States invaded Iraq. The wars in Iraq and Afghanistan lasted many years. By 2011, most American soldiers had left Iraq, and combat there was over.

In 2005, the southern United States was hit by Hurricane Katrina, and much of the city of New Orleans was destroyed. In 2006, the Democrats won back Congress because Americans did not like the way Bush had handled the war in Iraq or Hurricane Katrina. At the end of Bush's term, the United States entered its worst recession since the Great Depression.

Obama presidency

Barack Obama was elected President in 2008, becoming the first African-American President of the United States. During his first years in office, Obama and Congress passed reforms on health care and banking, as well as a large stimulus bill to help the economy during the recession. During the recession, the government also used large amounts of money to keep the banking and auto industries from collapsing. There was also a large oil spill in the Gulf of Mexico. In 2010, Congress passed the Patient Protection and Affordable Care Act, a sweeping overhaul of the health care system. Dubbed "Obamacare", it faced fierce criticism from conservative media.

A "Tea Party movement" started during Obama's presidency. This group opposes Obama's health care plan and other policies they see as "big government." Due to the recession, the Tea Party and a dislike of what Obama did, Republicans won a large number of House and Senate seats in the 2010 election. In 2011, Tea Party members of Congress almost shut down the government and sent the U.S. into default (not being able to pay people the government owes money). A few months later, many young people protested against organized and concentrated wealth during the Occupy movement. In 2012, Obama was reelected to a second term. Following reelection, Obama faced major obstruction from Congressional Republicans. This polarization in the political atmosphere and the media, lead to events such as the 2013 Federal Government Shutdown and the stalling of Obama's Supreme Court pick, Judge Merrick Garland to replace Justice Antonio Scalia. In 2014, Republicans took control of both houses of Congress, further adding to the gridlock. In foreign policy, President Obama helped crafted the Paris Climate Agreement, a major global commitment to fighting climate change. He also forged the Iran Nuclear Agreement and opened relations with Cuba for the first time in fifty years.

Trump presidency

The 2016 United States presidential election attracted much attention. The main candidates were Republicans Donald Trump and Senator Ted Cruz and Democrats Hillary Clinton and Senator Bernie Sanders. Trump and Clinton won their parties' primaries. In the election held on November 8, 2016, Trump defeated Clinton. Trump was inaugurated on January 20, 2017. Afterwards, there were many protests against Trump across the country.

On January 27, President Trump signed an executive order that stopped refugees from entering the country for 120 days and denied entry to citizens of Iraq, Iran, Libya, Somalia, Sudan, Syria, and Yemen for 90 days, citing security concerns about terrorism. The next day, thousands of protesters gathered at airports and other locations throughout the United States to protest the order and the detention of foreign nationals. [213] Later, the administration appeared to reverse part of the order, effectively exempting visitors with a green card. [214]

On May 3, 2017, Puerto Rico filed for bankruptcy because of its massive debt and weak economy. [215] It is the largest bankruptcy case in American history. [215]

On September 24, 2019, Speaker of the House Nancy Pelosi announced that the House of Representatives would begin an impeachment inquiry into Trump. On October 31, 2019, the House voted 232–196 to create procedures for public hearings. [216] On December 16, the House Judiciary Committee released a report specifying criminal bribery and wire fraud charges as part of the abuse-of-power charge. [217] The House voted to impeach Trump on December 18, 2019, making him the third president in American history to be impeached. [218]

During most of 2020, the United States was affected by the COVID-19 pandemic that swept the world. The country recorded the most infections and the most deaths from the virus of any nation. [219] The Trump administration was widely criticized for its handling of the virus. [220] [221] Some people refused to wear masks to help stop the spread of the virus. [222] [223] In some states, governors locked down their states in an attempt to stop the virus from spreading. [224]

Starting in May 2020, racial tensions in the country intensified after the police murder of George Floyd, which caused massive protests and rioting across the country. [225] [226] The Black Lives Matter movement grew in popularity, to a mixed reception. [227]


Early History of the Death Penalty

The first established death penalty laws date as far back as the Eighteenth Century B.C. in the Code of King Hammurabi of Babylon, which codified the death penalty for 25 different crimes. The death penalty was also part of the Fourteenth Century B.C.'s Hittite Code; of the Seventh Century B.C.'s Draconian Code of Athens, which made death the only punishment for all crimes; and of the Fifth Century B.C.'s Roman Law of the Twelve Tablets. Death sentences were carried out by such means as crucifixion, drowning, beating to death, burning alive, and impalement.

In the Tenth Century A.D., hanging became the usual method of execution in Britain. In the following century, William the Conqueror would not allow persons to be hanged or otherwise executed for any crime, except in times of war. This trend would not last, for in the Sixteenth Century, under the reign of Henry VIII, as many as 72,000 people are estimated to have been executed. Some common methods of execution at that time were boiling, burning at the stake, hanging, beheading, and drawing and quartering. Executions were carried out for such capital offenses as marrying a Jew, not confessing to a crime, and treason.

The number of capital crimes in Britain continued to rise throughout the next two centuries. By the 1700s, 222 crimes were punishable by death in Britain, including stealing, cutting down a tree, and robbing a rabbit warren. Because of the severity of the death penalty, many juries would not convict defendants if the offense was not serious. This led to reforms of Britain's death penalty. From 1823 to 1837, the death penalty was eliminated for over 100 of the 222 crimes punishable by death. (Randa, 1997)

