Lentis/Cryptography and Government Surveillance

Introduction/Background

Phillip Rogaway

Phillip Rogaway is a professor in the Department of Computer Science at the University of California, Davis, USA. He has been at the university since 1994.[1] Before coming to UC Davis, he completed his Ph.D. in MIT’s Theory of Computation group in 1991. As a professor at UC Davis, he has focused his research on efforts to “obtain provably-good solutions to protocol problems of genuine utility.”[1] Professor Rogaway is also known for his research and advocacy on social and ethical issues connected to technology.[1]

On his outreach page hosted by UC Davis he says “I’m embarrassed to work at a campus with a chancellor known for her ethical indiscretions: the pepper spray incident; moonlighting with the DeVry Group, defense contractor EMAG Technology, publisher John Wiley & Sons, and the devious King Abdulaziz University of Saudi Arabia; supporting the FBI/DoD National Security Higher Education Advisory Board; having UCD buy search engine manipulation services from various “reputation management” companies; and, before all that, the University of Illinois clout scandal. The UC system itself is headed by a former spy chief. What can you do?”[1] Professor Rogaway is also known for his more than 50 speeches and papers on ethics in technology and cryptography.[2]

He is also known for his stance on the ethical obligations that cryptographers and computer scientists have to serve the public good, specifically in the areas of internet privacy and digital surveillance. This stance is apparent in the quoted paragraph above, in which he outlines his embarrassment at working under the leadership at UC Davis.

In his paper The Moral Character of Cryptographic Work, Rogaway suggests that three events kick-started the idea of the “social responsibility of scientists”:[3]

  1. The Atomic Bomb
  2. The Nuremberg Trials
  3. The Environmental Movement

The dropping of the atomic bomb left many of the scientists and engineers behind it with remorse and second thoughts. Albert Einstein, whose letter to President Roosevelt helped set the project in motion, said, “I made one great mistake in my life - when I signed the letter to President Roosevelt recommending that the atom bombs be made; but there was some justification - the danger that the Germans would make them.” And Leo Szilard, a member of the Manhattan Project, said, “Suppose Germany had developed two bombs before we had any bombs. And suppose Germany had dropped one bomb, say, on Rochester and the other on Buffalo, and then having run out of bombs she would have lost the war. Can anyone doubt that we would then have defined the dropping of atomic bombs on cities as a war crime, and that we would have sentenced the Germans who were guilty of this crime to death at Nuremberg and hanged them?” Both quotes convey the view that dropping the atomic bomb carried grave consequences and that using a weapon of such destructive power was immoral.[4]

The Nuremberg Trials were a series of 13 judicial trials held after World War II to bring Nazi war criminals to justice.[4] Their greatest significance is that “crime against humanity was given a legal definition.”[5] The accused repeatedly argued that they were simply following orders; this defense was rejected, and the tribunal judged that following orders did not excuse legal or moral responsibility.[2] The trials also helped establish the modern ethical standard for medical doctors, as they “began with the Medical Case, the prosecution of 23 scientists, physicians, and other senior officials for gruesome and routinely fatal medical experiments on prisoners.”[2]

Finally, Rogaway suggests the environmental movement contributed to the rise of ethical responsibility. In her 1962 book Silent Spring, Rachel Carson painted a picture of a world stripped of life, its birds vanishing amid unchecked chemical manufacturing and the indiscriminate use of pesticides.[2]

Rogaway gives three basic tenets for an ethical scientist/engineer:[2]

  1. Do not use your work to contribute to social harm[2]
  2. Actively use your work to contribute to social good[2]
  3. Obligations 1 and 2 stem from your specific training and professional role[2]

Item three suggests that ethical responsibility is defined by one’s profession, meaning there is no single rigid definition of ethical responsibility.

Unfortunately, according to Rogaway, “The ethic of responsibility is in decline.” “In nearly 20 years advising students at my university, I have observed that a wish for right livelihood almost never figures into the employment decisions of undergraduate computer science students. And this isn’t unique to computer scientists.”[2] Here, right livelihood means work that respects humanity as a whole. He illustrates the decline with an informal experiment: searching Google for help choosing between job offers. Among the first five articles retrieved, none mentioned the social or ethical nature of the jobs, or choosing a job that did not violate one’s moral or social code.

Rogaway applies these ideas of ethical responsibility to his own field of research, cryptography (his third tenet for being an ethical engineer). He says that cryptography can determine the outcomes of wars and of political and economic movements. History bears this out, and it has been portrayed in popular culture in films such as A Beautiful Mind and The Imitation Game.

Cryptography

Cryptography is the study of using mathematical and logical means to secure data. In the context of communication, the messages themselves are the data that need to be secured. Cryptography asks how a message can be structured so that only its intended recipients can read it.

History

Cryptography has been in use for thousands of years.

Originally, cryptography was only a method of scrambling letters in a reproducible way: if you knew the method for scrambling, you could unscramble the letters. The Caesar Cipher is a famous example of this, and was used in the time of Julius Caesar. The security of these methods of encryption was based on keeping the scrambling method secret.
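
As an illustration, here is a minimal Python sketch of the Caesar cipher (the shift of 3 is the one traditionally attributed to Caesar). Note that anyone who knows the method can reverse it, which is exactly the weakness described above:

  # Minimal sketch of the Caesar cipher: shift each letter a fixed
  # number of places down the alphabet. Knowing the method is enough
  # to unscramble the text, which is why keeping the method secret
  # was the entire basis of security.
  def caesar(text: str, shift: int) -> str:
      out = []
      for ch in text:
          if ch.isalpha():
              base = ord('A') if ch.isupper() else ord('a')
              out.append(chr((ord(ch) - base + shift) % 26 + base))
          else:
              out.append(ch)  # leave spaces and punctuation alone
      return ''.join(out)

  print(caesar("ATTACK AT DAWN", 3))   # DWWDFN DW GDZQ
  print(caesar("DWWDFN DW GDZQ", -3))  # ATTACK AT DAWN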

In the 1800s, the use of keys in cryptography became popular. Cryptosystems were now designed so that their scrambling method depended on a key: usually a short, random string of letters. This meant that the cryptosystem itself could be known to the enemy, but as long as the key was kept secret, the cryptosystem was (ideally) still secure. This security model for key-based cryptosystems was formalized as Kerckhoffs’s principle.

These cryptosystems are classified as symmetric-key cryptography: all parties who want to communicate must have a copy of the same key. The Enigma machine, used by the Germans in World War II, was a famous example of such a cryptosystem.
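
For example, here is a short Python sketch of the Vigenère cipher, a classic key-based symmetric cipher: the method can be completely public, and security rests entirely on the shared secret key (Kerckhoffs’s principle). For simplicity this sketch assumes uppercase, letters-only text:

  # Sketch of a symmetric-key cipher (the Vigenère cipher). The
  # algorithm may be public; only the key must stay secret, and
  # sender and receiver must share the same key.
  def vigenere(text: str, key: str, decrypt: bool = False) -> str:
      sign = -1 if decrypt else 1
      out = []
      for i, ch in enumerate(text):
          shift = ord(key[i % len(key)]) - ord('A')
          out.append(chr((ord(ch) - ord('A') + sign * shift) % 26 + ord('A')))
      return ''.join(out)

  ct = vigenere("ATTACKATDAWN", "LEMON")
  print(ct)                                   # LXFOPVEFRNHR
  print(vigenere(ct, "LEMON", decrypt=True))  # ATTACKATDAWN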

Modern Cryptography

The foundations of modern cryptography come from research in the field of complexity theory. Complexity theory established mathematical notions of what a computer program is and how to analyze its efficiency. From this, researchers were able to categorize computer programs into classes of efficiency, and to categorize theoretical problems in computer science by how much work a computer must do to solve them.

Modern cryptography is designed around a specific class of those problems, associated with nondeterministic polynomial time (NP), which have the following convenient attributes (illustrated in the sketch after this list):

  • If someone knows the decryption key, then it is trivial for a computer to decrypt the data
  • If someone does not know the decryption key, it is virtually impossible for a computer to decrypt the data
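
A toy Python illustration of this asymmetry (the XOR scheme and the two-byte key here are invented for demonstration; this is not a real cryptosystem): with the key, decryption is a single cheap pass over the data, while without it an attacker must search the entire key space, which for a real 128-bit key is astronomically large:

  import itertools

  # Toy XOR "cipher" used only to illustrate key-space search; it is
  # not secure and not how real cryptosystems work internally.
  def xor_cipher(data: bytes, key: bytes) -> bytes:
      return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

  key = b"K9"  # deliberately tiny 2-byte key
  ciphertext = xor_cipher(b"meet at noon", key)

  # With the key: one cheap pass over the data.
  print(xor_cipher(ciphertext, key))  # b'meet at noon'

  # Without the key: brute force. Two bytes already mean 65,536
  # candidates; a 128-bit key would mean about 3.4e38 of them.
  for guess in itertools.product(range(256), repeat=2):
      if xor_cipher(ciphertext, bytes(guess)) == b"meet at noon":
          print("recovered key:", bytes(guess))  # b'K9'
          break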

In general, cryptography is used wherever secure communication is wanted: e.g. mobile phones, banking, web browsers, the military, etc. Despite modern electronic cryptography having been available for decades, it is still not used in every scenario.

Cryptography on the Internet

The internet was not originally designed with security in mind. Encryption is not enabled by default; users and companies have to intentionally add it to their systems.

Additionally, the United States government has made it difficult to spread the use of encryption. For example, in the 1990s it classified encryption as a munition, making it illegal to export from the United States. Intelligence agencies claim that the widespread use of encryption would severely hinder their ability to prevent attacks (see the “Going Dark” issue).[6]

Mass Surveillance

In an abstract sense, the internet is a network of networks, which means that data sent online travels through third parties before reaching its destination. The internet is a hierarchy of networks, ranging from the local network in your home or office, to the network of your internet service provider (ISP), to the networks that ISPs themselves connect to.

If an entity has access to the connection points between networks, and the data that travels through those points is unencrypted, then it has access to all of the data that travels through them. The NSA performs this kind of surveillance as part of its PRISM program, and the British Government Communications Headquarters does so as part of its Tempora program.
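
A conceptual Python sketch of why this matters (hypothetical names; not a real network stack): a tap at a connection point copies everything it forwards, but an encrypted payload appears to the observer only as opaque bytes. The toy XOR cipher from the earlier sketch stands in for real encryption:

  # Hypothetical "tap" at a connection point between networks: the
  # observer records every message it relays.
  captured = []

  def relay(message: bytes) -> bytes:
      captured.append(message)  # the third-party hop keeps a copy
      return message

  def xor_cipher(data: bytes, key: bytes) -> bytes:  # toy cipher, stand-in for real encryption
      return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

  relay(b"GET /mail?user=alice HTTP/1.1")             # unencrypted: readable at the tap
  relay(xor_cipher(b"GET /mail?user=alice HTTP/1.1",  # encrypted: captured but opaque
                   b"secret key"))

  for msg in captured:
      print(msg)  # first line is plaintext; second is unintelligible without the key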

Politics of Cryptography

Cryptography has been a political topic since the mid-1970s. Many cryptographers, like Whitfield Diffie, entered the field because “they were worried about privacy in the world.”[7] In the 1980s, the tone shifted in academia amid a movement to remove politics from the sciences. Many came to believe that the researcher’s job “is not to change the world, but to interpret it.”[8] Rogaway notes that many computer scientists could not explain their views on their ethical responsibility to the public. One scientist even stated that she was “a body without a soul.”[3]

Rogaway identified two papers to illustrate the split:[3] Chaum’s “Untraceable Electronic Mail, Return Addresses, and Digital Pseudonyms”[9] and Goldwasser and Micali’s “Probabilistic Encryption.”[10] Chaum frames the topic as a social issue, describing how his research can also address other social concerns, such as email privacy, electronic voting, and record keeping; “privacy” is a keyword of the paper.[9] In contrast, Goldwasser and Micali frame the topic as a technical problem, an interesting puzzle to be solved. Their abstract and introduction are filled with research jargon such as “proof,” “model,” and “theory,” and the word “privacy” does not appear once in their paper.[10] There is little overlap between the works the two papers cite.[3]

While professional cryptographers have largely stepped back from politics, a group called cypherpunks has kept the field political.[3] This group worries about the balance of power between governments and the public. Cypherpunks have built technologies such as Bitcoin, Tor, and Pretty Good Privacy (PGP) to decentralize and weaken government influence. They believe encryption is an important check on government power.

Ethics of Cryptography

Ethics organizations, such as the NSPE, IEEE, and ACM, have been formed. None is specifically for cryptographers, but the aforementioned groups do include them, especially the ACM. Two codes in the ACM Code of Ethics specifically pertain to cryptographers. Code 1.7 concerns respecting the privacy of others: “It is the responsibility of professionals to maintain the privacy and integrity of data describing individuals.”[11] The code does allow a professional to disclose such data “in cases where it is evidence for the violation of law, organizational regulations, or this Code.” Code 2.8 concerns access to data: “No one should enter or use another's computer system, software, or data files without permission.”[11]

Research has been done both to strengthen and to weaken encryption, and both sides need to consider the consequences of their work. Those strengthening encryption need to consider Code 1.7: encryption can be used by criminals to hide their crimes. FBI Director Louis B. Freeh told a Congressional committee, "All of law enforcement is also in total agreement on one aspect of encryption. The widespread use of uncrackable encryption will devastate our ability to fight crime and prevent terrorism."[12] These cryptographers need to consider whether they have a responsibility to keep the technology from criminals and whether it is their duty to help law enforcement. Those weakening encryption need to consider Code 2.8: encryption is commonly used to protect sensitive data, and breaking encryption means accessing data without authorization. Many researchers receive funding to break encryption from groups such as the Justice Department.[13] Cryptographers need to consider how such groups will use the technology and whether it is ethical to do the research.

Conclusion

References
