The convergence of technology and weapons development occasionally yields strategic advantages that dramatically change the way war is waged and significantly shift power projection and great power alignment. Many believe lethal autonomous weapons (LAWs) fall into that category. Others, however, contend that removing human oversight from the offensive targeting process violates the Law of Armed Conflict (LOAC), specifically the principles of discrimination and proportionality. Numerous organizations are calling for an international ban on the development of LAWs, claiming their use violates the basic human code of morality derived from Just War Theory. Conversely, developers are pursuing programmable, human-like intelligence capable of autonomously applying International Humanitarian Law and the LOAC. Regardless of the opposition, the technology continues to advance. The author addresses both sides of this issue and offers recommendations on a possible compromise for the way ahead.