The Art of Cyberwar: Security in the Age of Information
Cybercrime is an increasingly serious issue both in the United States and globally; the estimated annual cost of global cybercrime has reached $100 billion. Almost 560 million people are victims of cybercrime yearly — more than 1.5 million victims a day.
The U.S. Director of National Intelligence has ranked cybercrime as the top security threat — higher than the threat of terrorism, espionage and weapons of mass destruction, according to a 2014 national report on cybercrime that surveyed businesses, law enforcement and government agencies. As FBI Director James B. Comey explained, “The United States faces real [cybersecurity] threats from criminals, terrorists, spies, and malicious cyber actors.” One of the primary reasons for the severity of threats is that cybersecurity professionals are being outgunned. “The cybersecurity programs of US organizations do not rival the persistence, tactical skills and technological prowess of their potential cyber adversaries,” the report said.
What Is Cybercrime?
According to the Bureau of Justice Statistics, there are three general categories of cybercrime:
- A cyberattack is a crime in which the computer system is the target. Cyberattacks include computer viruses, denial-of-service (DoS) attacks, and electronic vandalism or sabotage.
- Cybertheft refers to a crime in which a computer is used to unlawfully acquire money or other items of value. Examples include embezzlement, fraud and theft of intellectual property or personal or financial data.
- Other computer security incidents encompass the theft of information by other means, carried out using techniques such as adware, spyware, hacking, phishing, pinging and port scanning.
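To illustrate one technique from that last category: a port scan probes a host to find out which TCP ports are accepting connections. The sketch below is a minimal illustration rather than a production scanner, and the host and ports shown are placeholders:

```python
import socket

def scan_port(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to (host, port) succeeds."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        # connect_ex returns 0 on success instead of raising an exception
        return s.connect_ex((host, port)) == 0

# Example: check a few common ports on the local machine.
for port in (22, 80, 443):
    state = "open" if scan_port("127.0.0.1", port) else "closed/filtered"
    print(f"port {port}: {state}")
```

Real scanners such as Nmap are far more sophisticated, using raw packets, timing strategies and service fingerprinting, but the core question is the same: does this port accept a connection?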
Notable Events in the United States
Major cyberattacks are becoming more and more common as hackers become more skilled. In recent years, several large-scale data hacks have occurred within major companies, including the following.
- Target: In December 2013, hackers gained access to roughly 40 million customer credit and debit card records by installing malware on Target’s point-of-sale systems. The breach went undetected for weeks, but Target was eventually able to remove the malware the hackers had installed.
- Home Depot: A similar attack struck Home Depot’s point-of-sale systems in September 2014, affecting some 56 million customer payment cards.
- JPMorgan Chase: In August 2014, cyberattackers hacked into JPMorgan Chase’s information systems and compromised the accounts of 76 million households and 7 million small businesses. This security breach is among the largest ever recorded.
- Sony: Another highly publicized hack occurred when Sony Pictures Entertainment suffered a large-scale data breach in November 2014. The information stolen included personal employee data, internal emails, financial information, copies of unreleased films and more. The hackers in this case had a stated agenda: they demanded the cancellation of the release of the film The Interview, a comedy about an assassination plot against North Korean leader Kim Jong-un.
The Origins of Hacking
In the tech world, hacking is defined as “any technical effort to manipulate the normal behavior of network connections and connected systems.” Historically, the term “hacker” referred to clever, non-malicious technical work that was not necessarily related to computers. Early hackers were enthusiasts primarily interested in modifying and optimizing programs for specific applications, but as computing grew in popularity, malicious attacks became increasingly common. No longer satisfied with benign exploration of systems merely to learn how they worked, some hackers began to use their skills for personal gain.
A Culture Divided
During the 1980s, a turning point occurred in the history of hacking, a direct result of the introduction of personal computers by companies such as IBM and Apple. Rather than working strictly within existing networks, hackers could now purchase computers for their own use. This meant that more and more individuals were learning to hack — and a larger number of active hackers created divides within the hacking community.
As the community grew, two distinct types of hackers emerged during the 1990s, and both still exist today: those who illegally and unethically seize control of computers and networks, and those who work to stop them. These “black hats” and “white hats” have very different views of how technical prowess should be put to use.
- Black-hat hackers, or malicious hackers, are criminals who identify vulnerabilities in computer systems and exploit them for personal gain. This is the classic image of a hacker: a gifted but unethical computer expert who purposefully seeks to commit theft or vandalize networks.
- Ethical hackers, or white hats, are one modern answer to malicious black hats. They may be employed by organizations to test computer systems and networks for vulnerabilities. These hackers use the same methods as black-hat hackers, but their goal is to fix computer security vulnerabilities and run tests to prevent malicious hacks from being carried out.
The Demand for Defense
Though ethical hackers provide valuable help in identifying system vulnerabilities, trained cybersecurity professionals are better qualified to protect networks and create secure environments. They have an extensive array of tools at their disposal, such as firewall analyzers and portable antivirus programs.
No matter how useful and high-tech these tools are, skilled security analysts are necessary if they are to be used effectively to protect companies. By 2017, the global cybersecurity market is expected to reach a staggering $120.1 billion. In fact, demand for cybersecurity experts is growing at 3.5 times the pace of the overall IT job market — and 12 times faster than the job market as a whole. In terms of salary, The Wall Street Journal reports that engineers, analysts, architects and other trained cybersecurity professionals average $101,000 a year, based on advertised salaries. This is well above the expected salary for IT professionals, which is $86,000, according to the Bureau of Labor Statistics.
Individuals interested in this career can consider pursuing an education in cybersecurity. To learn more about working in the cybersecurity field, follow #CyberAware and this week’s NCSAM materials, which focus on the future of the cyber workforce.
About the Author
Tricia Hussung has been writing about higher education since 2013. She specializes in areas such as technology and cybersecurity and enjoys writing informative and engaging content for current and future students. To read more from Tricia and learn more about cybersecurity education, visit online.sage.edu.