WASHINGTON – The roots of Apple’s struggle to avoid compromising the security of its famed iPhone reach back to Edward Snowden’s leak of nearly 2 million documents classified by the National Security Agency.

In the aftermath of that massive breach, Apple tweaked its operating system in 2014 with iOS 8, making it nearly impossible to break into an encrypted iPhone.

Under this operating system, the passcode is known only to the user and cannot be retrieved even by Apple; 10 incorrect guesses can automatically wipe the device of its contents. Apple CEO Tim Cook said in a statement that his company has never allowed the government to create a backdoor into its products and servers, and "never will."
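The lockout policy described above can be illustrated with a toy sketch. This is not Apple's code, and real iPhones enforce these limits in dedicated hardware (the Secure Enclave); the class and method names here are purely hypothetical.

```python
# Illustrative model of a "wipe after 10 failed passcode attempts" policy.
# Hypothetical names; real iOS enforces this in Secure Enclave hardware.

MAX_ATTEMPTS = 10

class Device:
    def __init__(self, passcode):
        self._passcode = passcode  # known only to the user
        self._failures = 0
        self.wiped = False

    def try_unlock(self, guess):
        if self.wiped:
            return False           # nothing left to unlock
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self.wiped = True      # contents destroyed on the 10th miss
        return False

phone = Device("1234")
for _ in range(10):
    phone.try_unlock("0000")
print(phone.wiped)  # → True
```

The point of the design is that brute-forcing the passcode is self-defeating: the data is gone before an attacker can exhaust the guess space.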

Now, Cook and the FBI are locking horns over whether Apple must create software to bypass the iPhone security of the deceased San Bernardino shooter, Syed Rizwan Farook. Apple argues that doing so would endanger its one billion-plus active users. Farook and his wife, suspected of having been inspired by jihadist ideology, killed 14 people and wounded 22 at a holiday party.

A magistrate judge in California is weighing whether Apple should be compelled to create new software that would let the government bypass the phone's encryption. A near-identical case in New York went in Apple's favor on Monday, which may bolster the tech giant's arguments against the government.

Debates over technology and cybersecurity have been lighting up all over the country since Cook issued his combative statement opposing the court order sought by the government.

FBI Director James Comey foreshadowed the tech and government clash in a 2014 speech at the Brookings Institution, entitled “Going Dark.”

Comey warned that the future might present such "sophisticated encryption" that the government would no longer be able to extract data from high-end devices, leaving agents at a dead end, "all in the name of privacy and network security."

After the FBI seized Farook's iPhone 5C, it asked Apple to remove the security limitations so agents could mine the device for potential co-conspirators and other data. But complying would require Apple to build a separate, deliberately weakened version of its software, signed with a unique installation key that only Apple holds.
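Why can only Apple install such software? Devices accept an operating-system image only if it carries a valid signature made with the vendor's private key. The sketch below is a simplified stand-in for that idea; it uses an HMAC in place of real public-key signatures, and every name in it is an assumption for illustration.

```python
# Toy model of signed-software installation: a device accepts an OS
# image only if the signature was produced with the vendor's key.
# HMAC stands in for real public-key code signing; names are hypothetical.
import hashlib
import hmac

VENDOR_KEY = b"vendor-private-key"  # held only by the vendor (assumption)

def sign(image: bytes, key: bytes) -> bytes:
    """Produce a signature for an OS image with the given key."""
    return hmac.new(key, image, hashlib.sha256).digest()

def device_accepts(image: bytes, signature: bytes) -> bool:
    """The device verifies the signature against the vendor's key."""
    expected = sign(image, VENDOR_KEY)
    return hmac.compare_digest(expected, signature)

weakened_os = b"OS build with retry limits removed"
print(device_accepts(weakened_os, sign(weakened_os, VENDOR_KEY)))  # True
print(device_accepts(weakened_os, sign(weakened_os, b"other-key")))  # False
```

Anyone can write a weakened OS, but without the vendor's signing key the device will refuse to run it; that is why the court order targets Apple rather than a third-party contractor.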

The technological implications are serious, cybersecurity experts say. Matt Gardner, CEO of the California Technology Council, said that obtaining the software would set a dangerous precedent: the government could use a favorable court ruling to press for the same software in future cases.

"It's agreed that it's a bad idea for the federal government to compel a tech company to write (code)," Gardner said in a phone interview. Those who would succeed Obama, the 2016 presidential candidates, should not assume they would hold such power over the security of personal devices, "now or ever," he said.

A break-in code, if it came into the FBI's possession, could also become a malicious "holy grail" for hackers, according to Cato Institute cybersecurity expert Edward Peddington. As the Snowden leaks and the Office of Personnel Management data breach have shown, it would be not a matter of if, but when that danger materialized, he said.

“The Department of Justice said (the software) wouldn’t be vulnerable, and that is a clear, absolute lie,” Peddington said. “Virtually every person in connection to the Internet will be more vulnerable to cybercrime and identity theft.”

On the technical side, Apple could be exaggerating the vulnerability, because creating any software theoretically introduces some risk, according to Susan Hennessey, a Brookings cybersecurity expert and former NSA attorney.

The FBI has suggested that Apple could develop the software exclusively within its own facilities and, once the information was handed over, destroy the code to minimize risk.

But that suggestion itself is no more secure, according to Heritage Foundation cybersecurity expert Steve Bucci. He said the code is intellectual property that could be exploited once it is in the government's hands.

“They can say they ‘destroyed’ it…but the FBI will be coming back to them saying ‘we really need it again’, and to say that they will never do that again is complete and utter fantasy,” Bucci said.

The Snowden leaks ramped up sensitivity to cybersecurity threats in Silicon Valley and started what Hennessey calls an "anti-government publicity race." The leaks likely prompted Apple to tighten its iOS security in the first place to make it virtually unhackable, and the move was heavily marketed.

To that end, Apple needed to rebuild consumer trust that had been violated by companies with "household names that started with G's and Y's," Gardner said. Even if Apple loses, the case is a public relations win: the company is seen waging war on behalf of more than a billion customers' rights.

In a similar smartphone security case, New York officials last October seized the phone of a drug dealer who has since pleaded guilty. Law enforcement asked Apple to unlock the phone, but the device ran iOS 7, an older operating system that doesn't require the kind of backdoor sought by the government in the San Bernardino case.

Because no new software had to be created, Apple instead argued that unlocking the phone in the New York case risked reputational harm and imposed an undue burden on the firm. Apple said its approach to preserving customer data had since changed and that it needed to cater to its clients.

"This doesn't square up, because Apple is arguing that it would undermine its legal credibility and tarnish its brand, when no [breach] could have happened," Hennessey said.

Bucci, the Heritage expert, said Apple needs to act more like a tech company than a shareholder-driven company protecting its brand.

But Apple is now entering another, more fraught arena: the "Going Dark" debate over how much access to data the government should appropriately have, regardless of the technological consequences.

The reality is that ISIS and other bad actors are tapping into the dark web, using anonymous browsers and encrypted apps like Tor and Telegram, neither developed by U.S. tech companies.

The San Bernardino ruling may define the government's reach for the shooter's iPhone only, but there are concerns that technology in general could outrun government and law enforcement, leaving lawmakers in a perpetual game of catch-up.

A recent report from Harvard's Berkman Center, aptly titled "Don't Panic," quashes some of these concerns.

Technology companies are unlikely ever to adopt end-to-end encryption so completely that user data is entirely obscured, because they need a degree of customer information for revenue projections and future product development, the report found. That includes metadata, such as location information and telephone records.

“Today’s debate is important, but for all its efforts to take account of technological trends, it is largely taking place without reference to the full picture,” the report said.