Company Failing to Encrypt Hardware Fined $3 Million

Drew Wilson | November 10, 2019

While some US lawmakers are trying to discourage companies from using encryption, one company has been fined for not using it.

US Attorney General William Barr has been waging a war on encryption with the help of other nations. As far as Barr is concerned, companies that use effective encryption are a threat to national security and public safety because the government might not have so-called “backdoor access”. Spy agencies, the argument goes, need to monitor user communications in order to catch “the bad guys”. If you are a company, that almost sends the message that you could land in hot water for using encryption, and that you might be better off not protecting user communications as a result.

For some, that kind of thinking is understandable, especially if this whole debate over encryption is simply over your head. Unfortunately, recent news might throw yet another wrench into the system. A New York medical centre recently settled out of court, agreeing to pay a $3 million fine for failing to encrypt its devices. The fine stems from violations of HIPAA (the Health Insurance Portability and Accountability Act of 1996). From Lexology:

On November 5, 2019, the Department of Health and Human Services (“HHS”) Office for Civil Rights (“OCR”) announced that a New York Medical Center (“Medical Center”) will settle potential violations of the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”) by paying a civil penalty of $3 million and entering into a Corrective Action Plan. The Medical Center is a HIPAA covered entity that includes hospital and academic medicine components. According to OCR, the Medical Center had experienced several issues with lost or stolen unencrypted devices.
OCR investigated the Medical Center in 2010 in a matter relating to an unencrypted flash drive and had provided technical assistance to the Medical Center. In the course of receiving that technical assistance, the Medical Center identified a lack of encryption as a high risk to its electronic protected health information (“ePHI”). Despite identifying this risk, the Medical Center continued to allow the use of unencrypted mobile devices. In 2013, the Medical Center notified OCR of the breach of unsecured ePHI, specifically the loss of an unencrypted flash drive. In 2017, the Medical Center notified OCR that an unencrypted personal laptop that contained Medical Center ePHI had been stolen, which resulted in the Medical Center impermissibly disclosing the ePHI of 43 patients.

OCR did not consider the risk analysis conducted by the Medical Center to be an accurate and thorough analysis of all potential risks and vulnerabilities to the confidentiality, integrity and availability of all of the ePHI the Medical Center was responsible for safeguarding. Further, OCR determined that the security measures implemented by the Medical Center to reduce risks and vulnerabilities to a reasonable and appropriate level were insufficient. OCR further found that the policies and procedures governing hardware and electronic media, including receipt and removal and movement of such hardware and electronic media in, out and within the Medical Center, were also insufficient. Finally, OCR determined that the Medical Center did not implement mechanisms that were sufficient to either (1) encrypt and decrypt ePHI, or (2) document why encryption was not reasonable and appropriate while implementing an equivalent alternative measure to safeguard ePHI.

So, at least in the medical field, if this whole encryption thing is over your head, you might find yourself thinking “darned if you do, darned if you don’t”. If you do encrypt data, you are letting “the bad guys” win.
If you don’t encrypt, you could be in violation of the law. You might as well throw your hands up in despair at this point.

So, what is it? Is encryption some nebulous evil thing that lets the bad guys win, or is it a mechanism for increasing safety and privacy? Some would argue that things are much more nuanced than that. What is being described in the Facebook debate is the encryption of messages between users, while what is at issue here is device encryption. That is a fair point, assuming you are someone who can navigate these issues and distinguish between device encryption and messaging encryption. Doing so requires a technological background with a side of legal knowledge.

Now, let’s look at the more practical aspect of having to explain all of this to an MBA (Master of Business Administration) running the company so a direction can be decided on. Could you get lucky and get someone with a technological or legal background? Maybe. At the same time, it is entirely possible to get someone who cares more about spreadsheets and slideshow presentations than about understanding the inner workings of the company itself. You might get someone who is “more about being a person that manages people”. Are they really going to understand these nuances and make a well-informed decision? Maybe, maybe not.

This is why developments like this make things even messier. Ensuring that private entities across the board understand all of these legal intricacies becomes impractical pretty quickly. With someone like Barr taking on encryption, the issue becomes much harder to understand at even a basic level. All of this makes Barr’s position much more problematic, and it further adds to the reasons why the push to stop effective encryption is ill advised.

Drew Wilson on Twitter: @icecube85 and Facebook.