Will Your Recruitment Initiatives Invite and Welcome Computer Hackers?


The current landscape is replete with stories of intrusion into and hacking of computer systems, leading to the improper dissemination of proprietary and other protected information.

Most organizations try to block the unwanted intruder (hacker) from ever gaining access to their computer systems. A common method used by hackers is phishing: an email fraud scheme in which the perpetrator sends a legitimate-looking email in an attempt to get the unsuspecting victim to click on a particular link, oftentimes to harvest private information. Clicking on that link may also allow malware or viruses to enter the victim’s computer system. So far, we see nothing new.

I recently read that there is a variant on the phishing scheme which comes into play when a company advertises that it is seeking to fill a position. In essence, the company is inviting applicants to send resumes, which normally are, and in fact are expected to be, sent as email attachments. The person tasked with hiring, oftentimes someone in HR or, in smaller organizations, someone with administrative responsibilities, receives a series of emails from would-be applicants. Any of those attachments, however, can contain malware which would not necessarily be detected.

Frankly, I found this situation alarming because the general rule of “don’t open emails or attachments from people you don’t know” realistically falls by the wayside. So, too, does the refrain “you really should have known better.”

How many people has your organization hired by placing ads on websites and then sifting through the e-mail responses?

Antivirus software and keeping current on software patches are obvious first steps.

Internal firewalls with two-factor authentication may be the next step.
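To make that first step concrete, here is a minimal sketch of how an inbound resume attachment might be triaged before anyone opens it. The file extensions, the triage_attachment helper, and the verdict strings are illustrative assumptions, not a prescribed or complete defense; a real mail gateway would combine this kind of screening with antivirus scanning and sandboxing.

```python
from pathlib import Path

# Extensions commonly abused to deliver malware in "resume" attachments,
# versus the formats a recruiter would reasonably expect to receive.
ALLOWED_RESUME_EXTENSIONS = {".pdf", ".docx", ".rtf"}
SUSPICIOUS_EXTENSIONS = {".exe", ".scr", ".js", ".vbs", ".docm", ".htm", ".html"}

def triage_attachment(filename: str) -> str:
    """Rough triage of an inbound resume attachment by file extension."""
    ext = Path(filename).suffix.lower()
    if ext in SUSPICIOUS_EXTENSIONS:
        return "quarantine: executable or macro-enabled content"
    if ext not in ALLOWED_RESUME_EXTENSIONS:
        return "hold for manual review: unexpected file type"
    return "send to antivirus scan before opening"

for name in ("resume.pdf", "resume.docm", "cv.exe"):
    print(name, "->", triage_attachment(name))
```

Even this crude filter illustrates the point: the person sifting through applications should never be the first line of defense.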

 

How Much Does a Data Breach Cost in Dollars and Cents?


In my last few posts, I wrote about causes of HIPAA breaches and the possible course of a compliance agreement (“The Most Detailed and Costly Compliance Agreement You Are Ever Likely to See,” “Seven Noteworthy HIPAA Breaches & the Recent Enforcement Actions,” “The Seven Most Likely Causes of Major HIPAA Breaches,” “The Five Most Likely Types of Major HIPAA Breaches”). A basic question, though, is how much a data breach costs in dollars and cents.

I am reasonably certain that, as with all statistical matters, the results can vary vastly depending on how the numbers are skewed. I recently came across a report by the Ponemon Institute/IBM dated May 2015, which deals with global data breaches (not restricted to healthcare and/or HIPAA breaches) and which I believe is both timely and highly informative.

Among the key findings of this report: the total cost of data breaches has increased 23% since 2013 (understanding that this 2015 report reflects 2014 data).

The study examined 350 companies that experienced data breaches. The average cost of a breach increased from $3.52 million to $3.79 million over a one-year period.

An interesting finding was that 79% of C-level US and UK executives surveyed said that executive-level involvement is necessary to achieve an effective incident response to a data breach, and 70% believe that board-level oversight is critical. The reason I point out this finding is that too many small-to-medium companies approach HIPAA compliance (which to me is really a subset of the need for data security) with the belief that outsourcing compliance is enough.

All of the participating companies experienced a data breach, ranging from a low of approximately 2,000 to slightly more than 100,000 compromised records. For the purposes of the study, a compromised record was one that identified the individual whose information was lost or stolen in a data breach. A breach was defined as an event in which an individual’s name plus a medical record and/or a financial record or debit card is potentially put at risk. (Obviously, the report did not deal with the 18 identifiers relating to HIPAA.)

Malicious or criminal attacks accounted for 47% of root causes, up from 42% a year earlier, and the report similarly shows an increase in cost from $159 to $170 per record. The cost is highest in the United States, with an average of $230 per record.

The smaller the breach, the greater its likelihood and, apparently, the higher the cost per record.

Costs relating to detection increased as well, from $0.76 million to $0.99 million. These costs included forensic and investigative activities, assessment and audit services, crisis team management, and communications to executive management and the board of directors.

The cost of a data breach varies by industry; while the average across industries is $154 per record, the average cost for a healthcare organization is $363 per record.
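Those per-record figures lend themselves to a quick back-of-the-envelope estimate. The sketch below simply multiplies a breach’s record count by the averages cited in this post; it is a rough planning number, not a prediction for any particular organization (large breaches, in particular, do not scale linearly).

```python
# Per-record cost averages as cited in this post (Ponemon Institute/IBM, May 2015).
PER_RECORD_COST = {
    "all industries": 154,
    "united states": 230,
    "healthcare": 363,
}

def estimated_breach_cost(records_compromised: int, segment: str) -> int:
    """Rough estimate: compromised records times the segment's average per-record cost."""
    return records_compromised * PER_RECORD_COST[segment]

# Example: a 2,743-record breach (the size of the ACMHS incident discussed in an
# earlier post) priced at the healthcare average comes to roughly $995,700.
print(estimated_breach_cost(2743, "healthcare"))
```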

The cost can vary based on the initial safeguards put in place.

While notification costs are relatively low, the cost associated with lost business is increasing.

The general attitude of NIMBY (not in my backyard) seems to be a common mindset among small-to-medium Covered Entities (CEs) and/or Business Associates (BAs): this only happens to the other guy. The threat of a data breach is real.

In communications I have had with FBI cybercrime agents and US Attorney prosecutors, the question they pose is not IF you will have a breach, but rather WHEN you will have a breach. The key is preparation and implementing safeguards.

Given that virtually every company surveyed had a breach of some size, it is fair to assume that this mindset (even absent the significant regulatory issues) is misguided.

 

The Most Detailed and Costly Compliance Agreement You Are Ever Likely to See

Corporate integrity agreements and consent agreements reached between the government (HHS) and Covered Entities or Business Associates can be extremely detailed, comprehensive and costly.

In my last post (http://bit.ly/1RsCwLP), I went so far as to say that these agreements and their implementation are often more expensive than the actual fines, and that I would discuss one of the most far-reaching consent agreements I had ever seen, namely, the corporate integrity agreement between OIG-HHS and Nason Medical Center.

While I cannot incorporate the totality of an agreement that is over 50 pages long into a few paragraphs, I think that I can convey the spirit of this agreement.

  1. The length of the agreement is five years.
  2. The people covered by the agreement include all owners, officers, directors, managers (which include members of the mandated “Management Committee”) and all employees, contractors, subcontractors, agents and other persons who provide patient care items or services or who perform billing or coding functions on behalf of Nason, as well as all physicians or other non-physician practitioners who work within one or more of Nason’s facilities.
  3. Establishment of a Compliance Officer and Compliance Committee – and with respect to the Compliance Officer, that individual must be a member of senior management, report directly to the CEO, cannot be subordinate to the General Counsel or CFO, and must be required to visit each location where Nason provides patient services at least every two weeks.

Responsibilities include developing and implementing policies, procedures and practices designed to ensure compliance, making periodic (at least quarterly) reports regarding compliance matters directly to the “Management Committee” with written reports to the “Management Committee” made available to OIG on request, as well as monitoring the day-to-day compliance activities engaged in by Nason.

Not surprisingly, Nason must report to OIG in writing any changes in the identity or description of the compliance officer.

  4. Compliance committee, which at a minimum must include the Compliance Officer and other members of senior management, including senior executives of relevant departments such as billing, clinical, human resources, audit, and operations as well as at least one employee who works at least 20 hours per week at each building where Nason sees patients. The Compliance Officer chairs the Compliance Committee. The Compliance Committee must support the Compliance Officer in fulfilling his/her responsibilities.
  5. Management Committee’s compliance obligations include meeting at least quarterly to review and oversee Nason’s compliance program, the performance of the Compliance Officer and the Compliance Committee, submitting to OIG a description of the documents and other materials reviewed as well as any additional steps taken in its oversight of the compliance program. In addition, each reporting period, the committee must adopt a resolution signed by each “manager” of the “Management Committee” summarizing its review and oversight of Nason’s compliance with Federal Health Care program requirements and the obligations of the agreement.

This resolution at a minimum must certify that the “Management Committee” has made reasonable inquiry into the operations of Nason’s compliance program, including the performance of the Compliance Officer and the Compliance Committee. Based on its inquiry and review, the Management Committee must be able to conclude that, to the best of its knowledge, Nason has implemented an effective compliance program to meet Federal Health Care program requirements and the obligations of this agreement. Conversely, if the committee is unable to reach that conclusion, it must explain to OIG why.

  6. In addition, managers (people with management responsibilities) are specifically expected to monitor and oversee activities within their areas of authority and annually certify that the applicable Nason department is in compliance with applicable Federal Health Care requirements and with the obligations of this agreement. These employees include but are not limited to the billing manager; director of Human Resources; medical director; Nason medical center manager and CEO; laboratory director; radiology director; business administration manager; accounting director; director of business analysis; and parent company CEO.

The certification must include language to the effect of “I have been trained on and understand the compliance requirements and responsibilities as they relate to my department and/or facility, an area under my supervision,” ensuring that the department complies with all applicable Federal Health Care program requirements, the obligations of the agreement, and Nason policies, and that the manager has taken steps to promote such compliance. To the best of their knowledge, except as specifically stated in the certification, managers must attest that Nason is in compliance with all applicable Federal Health Care program requirements and the obligations of this agreement.

The list goes on and on, and in fact I have just turned to page six of the agreement. At this point, you can probably imagine that the cost of compliance, and the responsibility placed on the majority of the organizational chart (including new positions created by this agreement), will have a heavy impact on the operations of the organization.

  7. An independent monitor selected by OIG must be retained. The monitor may retain additional personnel including independent consultants to help meet the monitor’s obligation under the agreement. The monitor may confer and correspond with Nason, OIG, or both. The monitor is not an agent of OIG; the monitor, however, may be removed by OIG at its sole discretion. If the monitor resigns or is removed, Nason must retain another monitor selected by OIG within 60 days. The monitor is granted virtually unlimited access to all of Nason’s records and documents. The length and breadth of the reports that the monitor must prepare is extensive. Nason is responsible for all reasonable costs incurred by the monitor in connection with the engagement, including labor costs, indirect labor costs, consultant and subcontractor costs, material costs and other direct costs such as travel, etc.

Nason must pay the monitor’s bills within 30 days of receipt. Failure to timely pay the bills constitutes a default under the agreement with OIG, unless said bills are contested and taken up with OIG.

In case you thought that this was not oppressive enough, the agreement also requires engaging an independent review organization.

  8. The independent review organization, such as an accounting, auditing or consulting firm, must perform various reviews on Nason. This organization is charged with the responsibility of reviewing Nason’s coding, billing and claims submission to Medicare and state Medicaid programs and the reimbursement received. Of course, OIG reserves the right to do its own independent reviews. The independent review organization must certify its independence and objectivity.

I could go on and “get into the weeds” regarding the highly detailed requirements (both in terms of staff compliance, report generation, and resulting certifications) but I am concerned that I will lose the readers’ attention and distract them from the point I am trying to make.

Noncompliance, as determined by HHS-OIG, may result in a corporate integrity agreement or consent agreement, which is then set forth in news releases. The cost of the actual fine, however, does not begin to give the reader the full picture of the burdens, costs, and potential liability that these agreements create.

HIPAA, HITECH and the Omnibus Rule place specific requirements on covered entities and their business associates. Audits can be triggered randomly (as HHS is ramping up audits) or can be triggered by a reported breach by the entity or by an individual whose privacy was violated. In addition, audits have been triggered by media reports and/or reports brought by members of the public at large.

The bottom line is that an ounce of prevention is worth a pound of cure. What do you think?

Seven Noteworthy HIPAA Breaches & the Recent Enforcement Actions


The following unlucky seven were subject to substantial fines. The costs associated with defending the audit, negotiating the settlement and implementing the invariable forward-going consent agreements/corrective action plans (CAPs), however, are separate from, and often higher than, the reported fine.

These cases range from relatively small to admittedly large breaches, from the unlikely event to situations that could happen to any entity without implementation of well thought out and vigorously monitored policies and procedures.

In my next post, I will detail one of the most burdensome consent agreements I have ever seen, namely, the Corporate Integrity Agreement between the Office of Inspector General of the Department of Health and Human Services and Nason Medical Center.

It is evident that the ever increasing enforcement of HIPAA and the Omnibus Rule, as well as both the increased use of electronic data and the commonplace reports of mass data breaches are forcing Covered Entities (CE) and their business associates (BA) to increase the resources dedicated to compliance with the Omnibus Rule.

1.    Cornell Prescription Pharmacy ($125,000)

The Denver compounding pharmacy will pay this fine after HHS learned of the potential HIPAA violations from a television news report that PHI was improperly disposed of after a garbage dumpster with un-shredded PHI was discovered. Cornell also agreed to develop and implement a comprehensive set of policies and procedures to comply with HIPAA rules, and to provide staff training. OCR Director Jocelyn Samuels stated that “Regardless of size, organizations cannot abandon protected health information or dispose of it in dumpsters or other containers that are accessible by the public or other unauthorized persons.”

2.    Anchorage Community Mental Health Services, Inc. ($150,000)

Malware compromised the security of ePHI due to a failure to install software patches, as well as the use of unsupported software.

HHS Office for Civil Rights (OCR) received notification from ACMHS, a non-profit, regarding a breach of unsecured electronic protected health information (ePHI) affecting 2,743 individuals due to malware compromising the security of its information technology resources. It was later determined that ACMHS had not timely installed patches to its software as mandated by its very own policies and procedures. The takeaway is that entities are not only required to follow the regulations, but they are also being held accountable for compliance with their own policies and procedures.

3.    Parkview Health System ($800,000)

OCR opened an investigation after receiving a complaint from a retiring physician alleging that Parkview had violated the HIPAA Privacy Rule. In September 2008, Parkview took custody of medical records pertaining to approximately 5,000 to 8,000 patients while assisting the retiring physician to transition her patients to new providers, and while considering the possibility of purchasing some of the physician’s practice.  On June 4, 2009, Parkview employees, with notice that the physician was not at home, left 71 cardboard boxes of these medical records unattended and accessible to unauthorized persons on the driveway of the physician’s home, within 20 feet of the public road and a short distance away from a heavily trafficked public shopping venue. Parkview entered into a one year corrective action plan without admission of any wrongdoing.

4.    NY Presbyterian Hospital and Columbia University Medical Center ($4.8 million)

An investigation revealed that a breach was caused when a physician employed by Columbia University Medical Center, who developed applications for both New York Presbyterian Hospital and CU, attempted to deactivate a personally-owned computer server on the network containing NYP patient ePHI. The noteworthy point is that the person who caused the breach seems to have had all the right intentions, but the result was catastrophic.

Because of a lack of technical safeguards, deactivation of the server resulted in ePHI being accessible on Internet search engines. The entities learned of the breach after receiving a complaint by an individual who found the ePHI of the individual’s deceased partner, a former patient of NYP, on the Internet. Another noteworthy point is that knowledge of a breach is often only discovered by the breaching entity after receiving reports from third parties. This general situation was confirmed to me by an FBI cybercrime agent.

In addition to the impermissible disclosure of ePHI on the Internet, OCR’s investigation found that neither NYP nor CU made efforts prior to the breach to assure that the server was secure and that it contained appropriate software protections.  Moreover, OCR determined that neither entity had conducted an accurate and thorough risk analysis that identified all systems that access NYP ePHI.  As a result, neither entity had developed an adequate risk management plan that addressed the potential threats and hazards to the security of ePHI.  Lastly, NYP failed to implement appropriate policies and procedures for authorizing access to its databases and failed to comply with its own policies on information access management.

NYP has paid OCR a monetary settlement of $3,300,000 and CU paid $1,500,000, with both entities agreeing to a substantive corrective action plan which includes undertaking a risk analysis, developing a risk management plan, revising policies and procedures, training staff and providing progress reports.

5.    Concentra Health Services ($1,725,220)

OCR opened an investigation following a reported breach in which an unencrypted laptop containing the ePHI of 870 individuals was stolen from one of its facilities, the Springfield, Missouri, Physical Therapy Center.

The investigation found that Concentra had previously recognized, in multiple risk analyses, that a lack of encryption on its laptops, desktop computers, medical equipment, tablets and other devices containing electronic protected health information was a critical risk.  While steps were taken to begin encryption, Concentra’s efforts were “incomplete and inconsistent over time,” according to an HHS press release, leaving patient PHI vulnerable throughout the organization.

Essentially, Concentra did not sufficiently implement policies and procedures to prevent, detect, contain, and correct security violations under the security management process standard when it failed to adequately execute risk management measures to reduce its identified lack of encryption to a reasonable and appropriate level from October 27, 2008, (date of Concentra’s last project report indicating that 434 out of 597 laptops were encrypted) until June 22, 2012 (date on which a complete inventory assessment was completed and Concentra immediately took action to begin encrypting all unencrypted devices).

Concentra did not make any admissions of liability but entered into a CAP – corrective action plan.

6.    Adult & Pediatric Dermatology, P.C. ($150,000)

An investigation of Adult & Pediatric Dermatology was initiated upon receiving a report that an unencrypted thumb drive containing the electronic protected health information (ePHI) of approximately 2,200 individuals was stolen from a vehicle of one of its staff members. The thumb drive was never recovered.  The investigation revealed that A&P Derm had not conducted an accurate and thorough risk analysis as part of its security management process.  Further, it did not fully comply with requirements of the Breach Notification Rule to have in place written policies and procedures and to train workforce members. It did not admit liability and entered into a CAP.  The takeaway is that the use of thumb drives to store ePHI is inherently problematic, and the use of unencrypted storage devices is courting disaster.

7.    Affinity Health Plan, Inc. ($1,215,780)

OCR’s investigation indicated that Affinity impermissibly disclosed the protected health information of up to 344,579 individuals when it returned multiple photocopiers to a leasing agent without erasing the data contained on the copiers’ hard drives. In addition, the investigation revealed that Affinity failed to incorporate the electronic protected health information stored on the copiers’ hard drives in its risk analysis as required by the Security Rule, and accordingly failed to implement policies and procedures for returning the hard drives to the companies from which it leased its copiers.  Affinity did not admit liability and entered into a short-term CAP.  The takeaway is the scope, detail and entity-specific nature of the required risk analysis.

 

About Mendel Zilberberg:

An attorney, visionary and entrepreneur admitted to practice in New York, New Jersey and Florida who has represented and counseled clients with nationwide interests in many areas of the healthcare arena.

The use of ePHI is growing exponentially, the likelihood of a breach is ever increasing, and the regulating authorities are ramping up their audit/enforcement programs.  Covered Entities (CEs) and Business Associates (BAs) must understand the importance of maintaining the integrity of ePHI and compliance with the relevant regulations, as well as thoroughly understand the potential consequences of non-compliance.

The Seven Most Likely Causes of Major HIPAA Breaches


While it is important to comply with all of the mandates of the Omnibus Rule, I think it is instructive to know where the most vulnerable areas for breaches of PHI lie.

In a recent presentation to a limited number of attorneys in which I participated, an investigator for the Office for Civil Rights (OCR) advised that, with respect to breach notification of major HIPAA breaches (those in which the PHI of 500+ individuals had been disclosed), as of February 27, 2015, OCR’s records indicate that the following percentages were attributable to the causes/circumstances of those breaches:

  1.   Paper records 22%
  2.   Laptop 21%
  3.   Desktop computer 12%
  4.   Network server 12%
  5.   Portable Electronic device 11%
  6.   Email 7%
  7.   EMR 4%
  8.   Other 11%

 

The Five Most Likely Types of Major HIPAA Breaches


While it is important to comply with all of the mandates of the Omnibus Rule, I think it is instructive to know where the most vulnerable areas for breaches of PHI lie.

In a recent presentation to a limited number of attorneys in which I participated, an investigator for the Office for Civil Rights (OCR) advised that, with respect to breach notification of major HIPAA breaches (those in which the PHI of 500+ individuals had been disclosed), as of February 27, 2015, OCR’s records indicate that the following percentages were attributable to the types of breaches:

  1.   Theft 51%
  2.   Unauthorized Access/Disclosure 19%
  3.   Loss 9%
  4.   Hacking /IT Incident 7%
  5.   Improper Disposal 4%
  6.   Other 9%
  7.   Unknown 1%

Does the FDA Need a Comprehensive Reassessment?


Has technology outpaced the laws and regulations that guide/drive the FDA?

In recent years, advances in technology have precipitated quantum leaps in both medical/diagnostic and treatment alternatives. The controlling laws and regulations which guide and govern the FDA may either not have kept pace, or, as a result of technology advances, be subject to unintended consequences which may negatively impact the very people the FDA seeks to protect.

Prevailing wisdom, law and popular opinion strongly allow for and suggest:

  1. That if medical data is properly and responsibly aggregated and analyzed, the process has the capacity to lead to significant improvement and efficiencies in the delivery of medical care. (The issue of the protections needed with the aggregation and de-identification of data is beyond the scope of this post, but in any case does not appear to be an FDA concern.)
  2. That patients have unrestricted access to their personal medical data.

On the other hand, the FDA is guided by a statutory framework that goes back to the late 1950s/early 1960s.

As many of you may be aware, in the late 1950s thalidomide was first marketed in West Germany and was primarily prescribed as a sedative or hypnotic. There were also claims that it might cure anxiety, insomnia and tension among other assorted conditions. Thereafter, it was apparently used in the treatment of nausea and to alleviate morning sickness in pregnant women. On October 1, 1957, thalidomide became an over-the-counter drug in West Germany. The popularity of thalidomide, particularly among pregnant women, precipitated an unmitigated catastrophe. Thousands of infants were born with malformation of the limbs, with an approximate 60% mortality rate.

Not surprisingly, these events sent shock waves through the global medical/pharmaceutical world.  It is readily apparent that not enough had been done to ensure the safety of this drug before it was approved.

The United States responded with the passage of the Kefauver-Harris Amendment, or “Drug Efficacy Amendment,” a 1962 amendment to the Federal Food, Drug and Cosmetic Act.  This amendment required proof of efficacy in addition to safety for the approval of new drugs — despite the fact that the thalidomide crisis was entirely a safety issue. Proving efficacy is apparently much more expensive and time-consuming than proving safety.

It is important to note that the authority of the FDA extends to both drugs and medical devices. In order to understand the possible issue here, it is important to understand the difference between the two. Even a cursory review of the FDA website highlights the distinct difference between drugs (which are generally ingested) and medical devices, which are generally used outside of the body for diagnostic or treatment purposes.

More particularly, a medical device is an instrument, apparatus, implant or similar or related article that is used to diagnose, prevent or treat disease or other conditions, and does not achieve its purposes through chemical action within or on the body.

On the other hand, drugs achieve their principal action by pharmacological, metabolic or immunological means.

So far so good.

The problem arises when we have reached the point where in a totally safe way (a cheek swab), we are able to obtain enough genetic information to be able to assess the genetic makeup of an individual. The twofold advantage with this technology (in no particular order of significance as I am not sure which is more important) is that individuals are able to gain insight into their personal health, and the data can be aggregated and analyzed allowing for an unprecedented view into our collective health. Both these areas have the potential to yield significant personal comfort and preservation of health, as well as a better understanding of both the role of genetics and the relationship between possible predisposition and incidence of numerous medical issues, which ultimately may point us in the direction of prevention or cure.

In fact, one company, 23andMe, was (and is) able to perform a relatively low-cost genetic analysis that was available to individual consumers and allowed for the aggregation and analysis of data.

There seems to be little doubt that this type of testing does not pose any safety issue. The FDA, however, has determined that by definition it is a MEDICAL DEVICE, and therefore not only must the safety of this service be proven (which is apparently not a problem) but that 23andMe has not yet proven the efficacy of its broad-range testing.  As a result, in 2013 the FDA issued a demand that 23andMe stop marketing its personal genome service.  The FDA allowed 23andMe to continue marketing the service to possibly help find customers’ relatives – if they were in the database.

As this service is available in Canada, a visit to the Canadian 23andMe website is extremely informative and sets forth that its genome service covers more than 40 inherited condition reports, more than 10 drug response reports, more than 10 genetic risk factor reports, and more than 40 reports relating to various traits.

On the other hand, the FDA might be concerned that the information should not be handed over to patients without an interpretation by a physician.  The two answers that come to mind are either to require prominent labeling (it can’t be worse than cigarettes) or to recognize that there is virtually nothing (meaningful or of FDA concern) that a person can do with the information without enlisting the services of a physician.

It is beyond the scope of this article to explain how these reports can and should be used; however, the 23andMe website is straightforward.  In addition, I think that when giving patients access to their medical records, it must be assumed that people have a certain minimum level of native intelligence.

Apple (yes, the iPhone, iPad, iWatch company) is also entering this arena with its recently announced ResearchKit, which will aggregate data from individual participants.  Apparently, there is a real possibility that allowing this type of activity may actually inure to the benefit of the general public.

The FDA may finally have realized that its stand and reasoning were somewhat flawed, as it announced in February 2015 that it would allow 23andMe to market a specific test for Bloom Syndrome directly to the general market.  There are also indications that in the future, the FDA may allow other tests to be marketed directly.  There is no protocol in place for this process, however, nor is there any indication of how long it will take. Clearly, it is a meaningful first step, but I think it really misses the point.

How many millions of dollars – how many years – and how much lost opportunity will we suffer, either directly or through opportunity cost (the lost time in which substantive progress could have been made) because the FDA worldview is not keeping pace with medical technology?

As a lawyer, I may not have the educational background that many of my readers who are more closely allied with the medical/Pharma world may have. In addition, I am sure that there are many differing perspectives on this issue.

My basic question is whether the FDA, which is functionally charged with determining the efficacy of drugs and medical devices, should be subjected to a similar examination with respect to the efficacy of the guidelines under which it currently operates.

What do you think?

Nurses make fun of their dying patients. Is that OK?


The linked article in the Washington Post raises an interesting question, namely, whether it is appropriate to use dark humor in a medical setting to offset the difficulties inherent in dealing with the sick and infirm. The question may in fact be a little deeper, namely, whether it is appropriate to enact myriad rules and regulations that may generally have a negative effect in the hope of protecting against the few instances where, through unintended consequences, third parties are offended. I thought the article was thought-provoking and would love to hear what you think.

http://www.washingtonpost.com/opinions/2015/04/13/18ecc874-d309-11e4-ab77-9646eea6a4c7_story.html?hpid=z2

What Is an Elephant? – An Ant Built to Government Specification


When I was a lot younger, the title to this post was a joke that was often bandied about.

It is entirely possible, however, that the new elephant is what covered entities and/or business associates (which, for purposes of brevity I will refer to as covered entities) must be ready for with respect to HIPAA audits.

The notion that health information should be held private has metastasized into a set of requirements and protocols that have the capacity to virtually capsize any small-to-medium-sized covered entity unless it places significant resources, effort and focus on compliance.

Failure to do so is essentially playing Russian roulette with your practice, company or entity.

I am generally not an alarmist, but the apparent lack of awareness of the parameters of the regulatory landscape gives me pause. In this article, I will address two of the 169 enumerated sections of the current draft of what OCR has set forth as the HIPAA audit protocols. As an aside, advance notice has already been given that an updated set of protocols is being prepared to reflect the Final Omnibus Rule. I think it is fair to assume that the new protocols will not be any less cumbersome than the current list. Much to the contrary, the prevailing view is that they may be even more detailed.

Of the 169 current items, there are issues that relate to Security (78), Privacy (81), and Breach (10).

Within these three classifications, though, 40 are required, 27 are addressable, and the remainder are n/a as they deal more with what the auditors have to contend with than with what the covered entity has to do.

If this is not enough, “addressable” does not really mean optional in the typical sense of the word, as a failure to address the issue must be accompanied by a reason why it was not addressed.

Rather than write in the abstract, I thought it would be much more productive to take the first required/security item as well as the first addressable/security item in the protocols and try to parse out what the regulations, protocols and ultimately the auditor will be looking for (the information in the boxes is from the HHS website).

Number 1

Section: §164.308

Established Performance Criteria: §164.308(a)(1) – Security Management Process; §164.308(a)(1)(ii)(a) – Conduct an accurate and thorough assessment of the potential risks and vulnerabilities to the confidentiality, integrity, and availability of electronic protected health information held by the covered entity.

Key Activity: Conduct Risk Assessment

Audit Procedures: Inquire of management as to whether formal or informal policies or practices exist to conduct an accurate assessment of potential risks and vulnerabilities to the confidentiality, integrity, and availability of ePHI. Obtain and review relevant documentation and evaluate the content relative to the specified criteria for an assessment of potential risks and vulnerabilities of ePHI. Evidence of the covered entity risk assessment process or methodology considers the elements in the criteria and has been updated or maintained to reflect changes in the covered entity’s environment. Determine if the covered entity risk assessment has been conducted on a periodic basis. Determine if the covered entity has identified all systems that contain, process, or transmit ePHI.

Implementation Specification: Required

HIPAA Compliance Area: Security

I will not repeat what has already been set forth with respect to conducting a risk assessment. It is important, however, to note the following:

  1. The potential risks and vulnerabilities will vary significantly from one organization to another. This is not a one-size-fits-all document. As such, in order to comply with this requirement/protocol, it is important to have a real and thorough assessment of the physical layout of the operation, as well as a thorough understanding of how and where ePHI is stored and how it is communicated. Without a data map (a minimal sketch of one follows this list), it might prove difficult to properly set forth the risk assessment. There are many things we understand but that are very difficult to put to paper. For example, most people know how to tie their shoes, but if directed to write out the various steps involved in this well-understood activity, they would find it a daunting task. In very general terms, you may know where your data is stored, but detailing this information with the required degree of specificity in a risk assessment may prove to be a very different story.
  2. Completing a risk assessment once is apparently not enough. Not only do actual changes in the operation of the entity require updates to the risk assessment, the auditor is tasked with determining whether the covered entity has conducted a risk assessment on a periodic basis, and whether the assessment identified ALL systems that contain, process or transmit ePHI. Doing it the first time may be the most difficult, but this is something that has to become part of the entity’s routine operation.
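To make the idea of a data map more concrete, here is a minimal sketch of what one entry in such an inventory might capture. The field names and example systems are assumptions about a hypothetical small practice; they are illustrative only and are not prescribed by the HIPAA Security Rule or the audit protocol.

```python
from dataclasses import dataclass, field

@dataclass
class EphiSystem:
    """One line of a data map: a system that stores, processes, or transmits ePHI."""
    name: str                                   # e.g., "EHR database server"
    location: str                               # physical or hosted location
    ephi_categories: list[str]                  # kinds of ePHI the system holds
    transmits_to: list[str] = field(default_factory=list)  # where ePHI flows next
    encrypted_at_rest: bool = False
    encrypted_in_transit: bool = False

data_map = [
    EphiSystem("EHR database server", "server room, main office",
               ["demographics", "clinical notes"],
               transmits_to=["billing workstation", "claims clearinghouse"],
               encrypted_at_rest=True, encrypted_in_transit=True),
    EphiSystem("front-desk laptop", "reception area",
               ["appointment schedule", "demographics"]),
]

# A risk assessment built on this inventory can then flag obvious gaps,
# such as any system holding ePHI without encryption at rest.
for system in data_map:
    if not system.encrypted_at_rest:
        print("Review:", system.name, "stores ePHI unencrypted at rest")
```

Even a simple inventory like this forces the organization to answer the questions the auditor will ask: which systems touch ePHI, where they sit, and how the data moves.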

Let’s jump to the first “addressable” security requirement:

Section: §164.308

Established Performance Criteria: §164.308(a)(3)(ii)(A) – Workforce security – Implement procedures for the authorization and/or supervision of workforce members who work with electronic protected health information or in locations where it might be accessed.

Key Activity: Implement Procedures for Authorization and/or Supervision

Audit Procedures: Inquire of management as to whether the level of authorization and/or supervision of workforce members has been established. Obtain and review the entity’s organizational chart or other formal documentation and evaluate the content in relation to the specified criteria to determine the existence of chains of command and lines of authority. If the covered entity has chosen not to fully implement this specification, the entity must have documentation on where they have chosen not to fully implement this specification and their rationale for doing so.

Implementation Specification: Addressable

HIPAA Compliance Area: Security

Once again, I will not repeat what has already been stated, except to point out that in order to address this issue, documentation is required.

A covered entity needs either an organizational chart or similar documentation.  In addition, the workforce members who need access to ePHI to carry out their duties must be identified. For each workforce member or job function, the covered entity must identify the ePHI that is needed and when it is needed, and make reasonable efforts to control access to that ePHI. Covered entities must provide only the minimum necessary access to ePHI that is required for a workforce member to do his or her job.
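One way to document that “minimum necessary” analysis is a simple mapping from job function to the ePHI categories that function actually needs. The roles and categories below are assumptions for a hypothetical practice, not a regulatory template; the point is that access decisions become explicit and auditable.

```python
# Illustrative role-to-ePHI mapping supporting a minimum-necessary analysis.
ROLE_ACCESS = {
    "front desk":         {"demographics", "appointment schedule"},
    "billing clerk":      {"demographics", "claims", "insurance information"},
    "treating clinician": {"demographics", "clinical notes", "lab results"},
}

def access_permitted(role: str, ephi_category: str) -> bool:
    """Allow access only to the ePHI categories identified for the role."""
    return ephi_category in ROLE_ACCESS.get(role, set())

print(access_permitted("front desk", "clinical notes"))       # False
print(access_permitted("treating clinician", "lab results"))  # True
```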

For addressable implementation specifications, covered entities must perform an assessment to determine whether the specification is a reasonable and appropriate safeguard in the covered entity’s environment. After performing the assessment, a covered entity decides if it will employ the addressable implementation specification, utilize an equivalent alternative measure that allows the entity to comply with the standard, or not implement the addressable specification or any alternative measures if equivalent measures are not reasonable and appropriate within its environment. Covered entities are required to document these assessments and all resulting decisions.

Factors that determine what is “reasonable” and “appropriate” include cost, size, technical infrastructure and resources. While cost is one factor entities must consider in determining whether to implement a particular security measure, some appropriate measure must be effected. An addressable implementation specification is not optional, and the potential cost of implementing a particular security measure does not free covered entities from meeting the requirements identified in the rule.

Once again, this protocol is probably not a “do it once and file it away” issue.

My analysis is far from comprehensive, and is not meant to convey any legal advice or opinion.  At a practical level, a great deal of how an audit plays out depends on the totality of circumstances (including if the audit is random or precipitated by a breach), the totality of compliance, and the general preparedness of the company.

The purpose of this article is to delicately scratch the surface of what a HIPAA audit may include and alert readers that failure to take a very serious look at the requirements and prepare accordingly is essentially playing Russian roulette.

The good news is that there are many qualified consultants and/or lawyers that can be very helpful. It is important to remember that one advantage a law firm brings to the table is attorney/client confidentiality, which in many cases is an extremely important protection.

HIPAA Audits – Imagine Tax Payments without IRS Audits


We can probably all agree that no one (except possibly accountants) looks forward to an IRS audit. At its most elemental level, there is virtually no upside, a possible downside and a deep feeling that, at best, it will disrupt our lives.

HIPAA audits are essentially no different.

One major difference is that for almost all taxpayers, the idea and the real possibility of an audit existed when they filled out their tax returns. With respect to HIPAA, initially enacted approximately 20 years ago, there was (and, in some cases, still is) some mental block or disconnect regarding audits, penalties, and fines for noncompliance — choose one.

For a little historical background, HIPAA was enacted as a broad Congressional attempt at healthcare reform; it was initially introduced in Congress as the Kennedy-Kassebaum Bill.  The landmark Act was passed in 1996 with two objectives.

  1. One was to ensure that individuals would be able to maintain their health insurance between jobs. This is the Health Insurance Portability part of the Act. Because of its successful implementation, it has become “part of the system” and does not get much coverage.
  2. The second part of the Act is the “Accountability” portion. This section is designed to ensure the security and confidentiality of patient information/data.

Over the years, there have been many additions, clarifications and new portions added to this legislation. All of the changes and details are far beyond the scope of this post; that said, I will list a few.

HIPAA Requirements – Security
Compliance Date – April 20, 2005

The compliance date for the HIPAA Security Rule was April 20, 2005. The Security Rule standards define how we are to ensure the integrity, confidentiality, and availability of our patients’ electronic protected health information (ePHI). The Security Rule requires that we have administrative, physical and technical safeguards for protecting ePHI.  Some (but clearly not all) examples are:

Administrative Safeguards:

  1. Assigning or delegating security responsibility to an individual – Chief Security Officer.
  2. Training workforce members on security principles and organizational policies/procedures.
  3. Terminating workforce members’ access to information systems.
  4. Reporting and responding to security incidents.

Physical Safeguards:  mechanisms to protect electronic systems, equipment and the data they hold from threats, environmental hazards and unauthorized intrusion.

  1. Limiting physical access to information systems containing ePHI (e.g., server rooms).
  2. Preventing inappropriate viewing of ePHI on computers.
  3. Properly removing ePHI from computers before disposing of or reusing them.
  4. Backing up and storing ePHI.

Technical Safeguards: automated processes used to protect data and control access to data. (A brief sketch of the automatic logoff safeguard follows the list below.)

  1. Providing users with unique identifiers for accessing ePHI.
  2. Accessing ePHI during an emergency.
  3. Encrypting ePHI during transmission.
  4. Automatically logging off users after a determined time period.

Patient Privacy/Security and Technology

As we use technology to improve patient care, we are faced with additional challenges to protect patient information from unauthorized use and disclosure.

In February 2009, the Health Information Technology for Economic and Clinical Health Act (“HITECH”) was enacted as part of the American Recovery and Reinvestment Act of 2009 (“ARRA”). HITECH makes significant changes to HIPAA’s administrative simplification provisions pertaining to privacy and security, including notifying individuals (and in some instances, media outlets) when there has been a privacy/security breach.

Previously, covered entities (healthcare providers, health plans and healthcare clearinghouses) were obligated to mitigate harm caused by unauthorized disclosures of protected health information (“PHI”), but not required to give notice to the individuals whose information was inappropriately disclosed. With HITECH, covered entities and business associates are required to notify individuals when security breaches occur with respect to “unsecured” information. Unsecured information means information not protected through technology or methods designated by the Federal government. In addition, if the breach involves 500 or more individuals, notice to the U.S. Department of Health and Human Services and the media is also required. Depending on the number of people affected by the breach, the time to report the breach changes as well.
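The notification thresholds described above can be reduced to a simple decision rule. The sketch below encodes only the summary in this post (individual notice for breaches of unsecured information, plus HHS and media notice at 500 or more affected individuals); it omits the timing rules and other nuances of the actual Breach Notification Rule and is not legal advice.

```python
def notification_obligations(unsecured: bool, affected_individuals: int) -> list[str]:
    """Return the notice obligations triggered by a breach, per the summary above."""
    obligations: list[str] = []
    if not unsecured:
        # Information protected by federally designated methods (e.g., encryption)
        # generally falls outside this notification requirement.
        return obligations
    obligations.append("notify affected individuals")
    if affected_individuals >= 500:
        obligations.append("notify the U.S. Department of Health and Human Services")
        obligations.append("notify the media")
    return obligations

# Example: a 2,743-record breach of unsecured ePHI triggers all three notices.
print(notification_obligations(unsecured=True, affected_individuals=2743))
```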

While very large healthcare providers have been forthcoming with respect to breach notification, and other providers have been caught when information was breached, we have not yet really had an audit process that would significantly motivate medical providers (especially smaller organizations) to deal with these laws/regulations with the same attention they might give their tax returns. It is only natural that people act based on the consequences of their actions. That is not to say that we should not take the laws seriously, but human nature is still human nature. If I am wrong, the IRS would have no need to audit taxpayers.

To that end, a pilot program was initiated to develop protocols and evaluate the HIPAA compliance of 115 covered entities. In addition, the methodologies employed in ascertaining compliance were themselves audited for their effectiveness. In the fourth quarter of 2011, 20 covered entities were selected and received a letter requesting documents, and on-site reviews began in the first quarter of 2012.

The audit protocol is available at

www.hhs.gov/ocr/privacy/hipaa/enforcement/audit/protocol.html

Subsequently, more entities were audited, and the phase one results (in this case, findings are not good) showed that approximately 11% of the 115 entities had no findings.  That 11% comprised two providers, two clearinghouses and nine health plans.

Additionally, 60% of the findings related to security, more than privacy and breach notification findings combined. This is actually reasonable considering that every entity has security obligations, but not every entity has a breach or a breach notification issue. The same rationale applies to privacy issues.

Providers had 65% of the findings and observations although they were only 53% of the entities reviewed.

The frightening part is that the smaller entities had issues with everything.

With respect to security, two-thirds of the entities did not have complete or accurate risk assessments. The other problem areas for providers ran the gamut of issues.

In cases where there were breaches, notification to individuals was the biggest issue.

What can we expect in 2015?

OCR will contact approximately 550 to 800 covered entities for pre-audit surveys; it will use the survey results to select 350 covered entities for an audit. Those entities will have to identify their business associates and provide contact information, at which point OCR will select business associates for audit.

OCR plans to conduct on-site audits as well as desk audits, which will presumably be staffed by OCR.

Entities will have two weeks to respond to data requests. All information submitted must be current as of the date of the request. Therefore, after an entity receives a request, it should not then begin to review and update its HIPAA policies and practices. Failure to respond to the request may lead to referral for a compliance review.

It is difficult to know how quickly this will be rolled out in 2015.

Many entities should be preparing themselves, and many law firms, consultancies and other organizations are gearing up to provide assistance to (virtually) the full vertical of the healthcare field that could be subject to this ever-increasing audit regimen.

From a practical perspective, the more audits, the more fines, the more money, the greater expansion of audits.

A word of caution: this article is not meant to offer any legal advice, and it does not represent the totality of legal/regulatory requirements, the scope of the audits, or the compliance or remedial measures that entities should take.  In addition, there may be state laws and regulations that come into play.

The real concern is that the smaller practices or covered entities may be caught totally off guard. These laws are an important component of the operations of these entities. In sum, it is the new reality.