Army War College Publication Repository      Total Publications 10


Author: Dr. Conrad C. Crane, Dr. Michael E. Lynch, Shane P. Reilly

Published: February 2019


The establishment of Army Futures Command (AFC) in August 2018 was the most significant change to the Institutional Army in a generation, and it signaled the value the Army placed on studying the future. While the establishment of a new four-star headquarters might be seen as a bold move, it was in reality the culmination of 30 years of futures development in the Army. Those three decades saw the creation of numerous structures designed to examine the potential of future concepts and technology, with uneven success. The processes were sound, but technological overreach and more than 20 years of war in the Middle East doomed most efforts to put useful concepts into practice. Army visionaries such as Generals William E. DePuy and Donn A. Starry had planned for the future as early as the 1970s with an eye on the past, initiating the programs and doctrine necessary to transform and modernize the Army after Vietnam.


Published: February 2019

This study analyzes the initial entry training programs for Army inductees over the last 100 years to identify the patterns that have shaped that training. Technology has changed over the years, and training has adapted, but technological change has been a less important factor than the oscillation between wartime and peacetime methodologies. Changes in technology have not altered the core functions in which the Army trains its new Soldiers: lethality and survivability. The unvarying trend over the last century shows an increase in lethality and survivability skills after the nation enters combat, often after learning harsh lessons. As soon as the conflict ends, however, the training emphasis reflexively moves back toward garrison-type activities. The length of initial entry training, or Basic Combat Training (BCT), has also waxed and waned over the years, ranging from as long as 17 weeks (1943, not including OSUT) to as short as 8 weeks (1980). External factors such as budgets, force structure, institutional infrastructure, and end strength have always affected the amount of training time available; this study focuses largely, however, on how the Army used the time allotted. The analysis concentrates on infantry skills but examines other training where necessary for clarity. (Non-infantry Soldiers, especially those in sustainment MOSs, have traditionally received less marksmanship training.) The unifying concept is that the initial entry training categories have remained the same for Soldiers throughout the period, while the time spent on each category has fluctuated, and Soldiers in different specialties received different training.


Author: Dr. Conrad C. Crane, Dr. Michael E. Lynch, Shane P. Reilly, Jessica J. Sheets

Published: February 2019


This study analyzes the initial entry training programs for Army inductees over the last 100 years to identify the patterns that have shaped that training. Technology has changed over the years, and training has adapted, but technological change has been a less important factor than the oscillation between wartime and peacetime methodologies. Changes in technology have not altered the core functions in which the Army trains its new Soldiers: lethality and survivability. The unvarying trend over the last century shows an increase in lethality and survivability skills after the nation enters combat, often after learning harsh lessons. As soon as the conflict ends, however, the training emphasis reflexively moves back toward garrison-type activities. The length of initial entry training, or Basic Combat Training (BCT), has also waxed and waned over the years, ranging from as long as 17 weeks (1943, not including OSUT) to as short as 8 weeks (1980). External factors such as budgets, force structure, institutional infrastructure, and end strength have always affected the amount of training time available; this study focuses largely, however, on how the Army used the time allotted. The analysis concentrates on infantry skills but examines other training where necessary for clarity. (Non-infantry Soldiers, especially those in sustainment MOSs, have traditionally received less marksmanship training.) The unifying concept is that the initial entry training categories have remained the same for Soldiers throughout the period, while the time spent on each category has fluctuated, and Soldiers in different specialties received different training.


Author: Dr. Conrad C. Crane, Dr. Michael E. Lynch, Shane P. Reilly

Published: February 2019


The history of the U.S. Army in Operation Iraqi Freedom is replete with tactical and operational studies, and the shifts in strategy are well documented. The Chief of Staff of the Army’s (CSA) official study, The U.S. Army in the Iraq War, provides an excellent analysis of the operational level of war. “Riding the Hydra,” however, examines the institutional Army, specifically the Army staff, and its efforts to prepare the Army for war.
When President George W. Bush made the decision to launch the war in Iraq, the Army faced a two-front war for the first time since World War II. Though the Army in 2002 was much better trained, equipped, and ready than its predecessor 60 years before, it still showed the effects of declining budgets and a lack of strategic focus. The modern, professional Army requires bureaucratic processes to coordinate a complex and highly sophisticated system. Defense budgets have declined over the years, but they still constitute as much as 14 percent of the total federal budget, and managing those funds properly and legally requires a system of firm controls.


Author: Dr. Conrad C. Crane, Dr. Michael E. Lynch, Shane P. Reilly, Jessica J. Sheets

Published: February 2019


Task Force Smith at the beginning of the Korean War has often been used as a metaphor for military unreadiness. While the story of that first U.S. action of the war provides a timeless cautionary tale for commanders, the unreadiness for war in June 1950 went much further than the tactical failures of one infantry battalion. The lack of readiness was caused by a very disruptive interwar period that saw drastic and often chaotic changes across Doctrine, Organization, Training, Materiel, Leadership and Education, Personnel, Facilities, and Policy (DOTMLPF-P).
This case study examines the political, economic, military, and strategic environment in the years between 1945 and 1950 to illustrate the complexity of the readiness issue. Readiness in the strategic context concerned many more issues than simply personnel status or equipment availability.


Published: January 2019

During the development of The U.S. Army in the Iraq War, Volumes 1 and 2, the Operation IRAQI FREEDOM Study Group relied heavily on more than 30,000 pages of previously classified unit reports, assessments, and briefings, as well as extensive correspondence between senior U.S. military and civilian leaders. These documents are now available to the public and can be accessed here.


Author: Dr. Conrad C. Crane, Dr. Michael E. Lynch, Dr. James D. Scudieri

Published: April 2016

When Gen. Gordon Sullivan was Chief of Staff of the Army, he kept two books on his desk: The Seeds of Disaster: The Development of French Army Doctrine, 1919-1939, by Col. Robert Doughty, and America's First Battles, 1776-1965, by Lt. Col. Charles E. Heller and Brig. Gen. William A. Stofft. Sullivan was determined that his Army would be fully prepared for the next war. Other chiefs have echoed Sullivan's concern, yet the lessons learned from studying past battles, though never entirely lost to history, are sometimes forgotten. Landpower is by its nature a complex activity, and military history is replete with examples of unreadiness for battle. The case studies that follow extend those covered in America's First Battles, which examined the first battle in each of America's wars, from the American Revolution to Vietnam, in order to gain insights into how the nation fared in those encounters. Heller and Stofft determined that the nation and its armed forces routinely arrived on the field of battle unready for the challenges they faced, owing to a lack of adequate and timely training, miserly budgets, and an atrophied force structure. Their look at first, often disastrous, encounters presents a sobering reminder of the need for readiness. The following case studies also illustrate how complexity defines operations and affects both readiness and outcomes, and they illustrate three themes from which insights may be drawn.


Author: Dr. Conrad C. Crane, Dr. Michael E. Lynch, Dr. James D. Scudieri

Published: April 2016

The potential changes in the operating environment (OE) and the character of war in the next 15-20 years are unknowable, and history cannot provide a predictive model or "cookbook" to anticipate future events. The last 250 years, however, have provided many examples of shifts in the character of war caused by emerging technology, political shifts, economic changes, and diplomatic crises. This context may prove very useful for senior leaders. There will doubtless be technological advances in the future, and some may be "game changers," but intellectual development is just as important as technological development. The Army learned during the interwar years between the world wars that maintaining intellectual capital was critical to later success. Technological change is constant, and all armies adapt to it, yet not all technological changes affect the character of war. The machine gun and the computer, for instance, revolutionized tactics but had little effect beyond the battlefield. The advent of submarines, airplanes, and nuclear weapons, however, fundamentally altered how war is conducted: the character of war. These case studies address periods during which the character of war changed.


Author: Dr. Conrad C. Crane, Dr. Michael E. Lynch, Dr. James D. Scudieri

Published: October 2014

Numerous Ebola outbreaks have devastated West African communities. Beginning in March 2014, 7,470 people contracted the virus and 3,431 died in Liberia, Sierra Leone, Guinea, Senegal, and Nigeria. The disease has since spread to the United States and Europe; in the United States, it has claimed one life, and two more people have become infected. Previous Ebola epidemics occurred in West and Central Africa in 1976, 1995, 2000, and 2007. The 2014 outbreak is by far the deadliest, already approaching ten times the number of cases of the 1976 outbreak, the previous worst in history and the year of the virus's discovery. The magnitude of the epidemic has caused a global crisis and evoked a powerful response from the United States Government. On September 16, speaking at the Centers for Disease Control and Prevention headquarters in Atlanta, Georgia, President Barack Obama resolved to "make [Ebola] a national security priority." The President's strategy comprises four elements: containing the spread, countering negative economic and communal ramifications, coordinating a global response, and developing public health systems in affected countries for the future. Further, President Obama announced the establishment of a military command center and field hospitals in Liberia, a healthcare training center in Senegal, and an "airbridge" to the region for supply and personnel transfer. These actions mark a renewed emphasis on military operations specifically targeting disease containment, an approach that is comprehensive but not new. Moreover, viruses such as Ebola underscore the unpredictable nature of disease, which emerges sporadically, without warning, and often virulently. Early planning for the aftermath of an outbreak is an essential component of containment and mitigation. The U.S. military has encountered disease on a large scale throughout its history, and this latest deployment benefits from centuries of combined wisdom in disease control.
From 1776 until 1918, the so-called "Disease Era" of American conflict, the microbe, rather than the enemy combatant, was the Soldier's most lethal adversary. Indeed, all casualty counts include a "disease and non-battle injury" (DNBI) category for those who succumb to such maladies. Scientific and medical advancements have since identified the causes of various diseases, provided treatments, improved sanitation, and promoted hygiene, and disease rates in the military subsequently plummeted. Despite those successes, and the now-universal use of vaccines to protect the military and civilian workforce, their families, and retirees, disease remains a constant and growing threat. "Old" diseases thought to be eliminated, such as typhoid fever, or at least controlled, such as influenza, have returned, sometimes in new and more virulent forms. Diseases such as Ebola, previously thought to be limited to developing nations, have appeared in more modern societies. "New" diseases, such as Severe Acute Respiratory Syndrome (SARS), have emerged. This survey provides three case studies from American history in which epidemic disease affected U.S. Army operations. Yellow fever in Havana, Cuba, in the 1890s and in Panama in the early 1900s demonstrates a case in which disease eradication required multiple Army control measures; success was critical to completing the Panama Canal. The 1918-19 Spanish influenza demonstrates a case in which a pandemic swept through the Army, exploiting mass mobilization even as it devastated civilian populations. Diseases as debilitants during World War II and later conflicts demonstrate scenarios in which medicine taken according to a precise regimen drastically reduced mass infection. Together, these examples demonstrate how military forces have fought or contained disease of epidemic proportions.
Although the diseases in these case studies use different vectors and differ greatly from Ebola in numerous respects, the Army's responses to them offer instructive parallels. The nature of Ebola, its speed of transmission, and the regions in which it is currently rampant present the commander with significant challenges, and protecting the force is far more complex under the threat of widespread infectious disease. This study offers some considerations for commanders and staffs planning operations in support of mitigating the Ebola outbreak.


Author: Dr. Conrad C. Crane, Dr. Michael E. Lynch, Dr. James D. Scudieri

Published: August 2014

This study analyzes the U.S. Army's experiences from the twentieth century to the present, given the demands of modern war and its associated political, industrial, and military structures. These examples generally mirror those from earlier wars. Broadly speaking, United States defense policy has relied upon a small Regular Army (RA), expandable upon the outbreak of war; that expanded army then largely demobilized upon war's end. Reliance upon state militias to augment the Regular Army in the American Revolution of 1775-83 and the War of 1812 gave way to volunteer troops rather than militia in the Mexican War of 1846-48 and the American Civil War of 1861-65. The war with Spain in 1898 was then a sobering experience in preparedness, especially in strategic deployment and logistics. The United States had in essence inherited and perpetuated major aspects of English, then British, military policy, given traditional political suspicions of standing armies. American practice has been to assume strategic, operational, and tactical risk at the start of a conflict, a history that has become associated with the theme of "America's first battles." One enduring theme that stands out among all the conflicts studied is the lack of preparedness for immediate action, driven either by strategic surprise or by a lack of popular and political will. World War II provides a partial exception because of preparatory steps taken before the attack on Pearl Harbor, but the declaration of war still found the U.S. Army ill-trained and ill-equipped to take the field immediately.