Sensitive and confidential patient health records not adequately protected: Survey

News Facts:

  • According to a new survey of healthcare IT professionals, sensitive and confidential patient health records are not adequately protected from theft or loss.
  • Most do not protect the data – Despite the data's sensitivity, 51 percent of those surveyed do not protect patient data used in software development and testing.
  • Losses can easily go undetected – 78 percent are not confident, or are undecided, about whether their organization could even detect the theft or accidental loss of real data in development or test environments.
  • Breaches are commonplace – 38 percent have had a breach involving data in a development and test environment and 12 percent are unsure if they have had a breach or not.
  • Consequences are high – 59 percent of those experiencing breaches consequently experienced disruption of operations, 56 percent faced regulatory action and 36 percent suffered reputation loss.

The survey findings are being published today in a new report by the Ponemon Institute titled Health Data at Risk in Development: A Call for Data Masking. The survey was sponsored by Informatica Corporation (Nasdaq:INFA), the world's number one independent leader in data integration software.

Examining the widespread use of real patient data in healthcare application development and test environments, the Informatica-sponsored report details how this practice exposes healthcare organizations to the risk of non-compliance with regulations such as the Health Insurance Portability and Accountability Act (HIPAA). Additionally, the research provides guidelines for reducing exposure, including the now-vital practice of masking and securing live data.

Other key research findings, based on a survey of more than 450 IT professionals in U.S. healthcare organizations, include:

  • Outsourcing and cloud computing increase the security risk – Outsourcing development and test activities and/or using cloud computing resources introduce additional risk factors, which often prevent healthcare organizations from turning to these potentially advantageous resources. 40 percent do not outsource due to security concerns, while a mere 19 percent are confident or very confident about security in a cloud environment.
  • Healthcare industry falls short of data protection goals – Protecting real data in development and test environments matters to respondents, but most do not believe, or do not know whether, their organization succeeds at it: 74 percent say that meeting privacy and data protection requirements in the healthcare services industry is important, yet only 35 percent believe their company is successful in achieving this goal.
  • With only 35 percent of respondents believing their organization is successful at protecting patient privacy in development and test environments, Ponemon Institute recommends immediate actions including:
      • Centralized executive oversight – Create a single point of executive-level responsibility coupled with policies and procedures for safeguarding your organization's real data in non-production environments.
      • Data masking – Invest in key technologies including tools to "transform or mask sensitive or confidential data without diminishing the richness of the data necessary for successful testing and development".

Data masking helps safeguard sensitive, private or confidential data, such as protected health information (PHI) or personal health records (PHR), by masking it in-flight or in-place. As a result, fully functional, realistic data sets can be used safely in development, testing, training and other non-production environments. Regardless of whether the work is managed in house, offshored or outsourced, companies can be confident that they will not be exposed to malicious or inadvertent data spills or be in violation of the Health Information Technology for Economic and Clinical Health Act (HITECH Act) or other regulations.

With Informatica Data Masking, sensitive data can be discovered and systematically de-identified using algorithms that obfuscate the original data but retain its original format and properties, so that applications depending on that data continue to function properly during development and test activities.
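The report does not disclose Informatica's masking algorithms, but the core idea of format-preserving de-identification can be illustrated with a minimal Python sketch. This is a hypothetical example only, not Informatica's method: each digit is replaced by a digit and each letter by a letter, while separators are kept, so masked records still satisfy the format checks that test applications rely on.

```python
import hashlib
import string

def mask_value(value: str, secret: str = "demo-key") -> str:
    """Deterministically obfuscate a string while preserving its format:
    digits map to digits, letters to letters (case preserved), and
    punctuation such as '-' or '/' is left in place, so masked data
    remains realistic enough for development and test use."""
    out = []
    for i, ch in enumerate(value):
        # Derive a repeatable pseudo-random byte from the secret,
        # the character and its position, so the same input always
        # masks to the same output (useful for referential integrity).
        digest = hashlib.sha256(f"{secret}:{i}:{ch}".encode()).digest()
        n = digest[0]
        if ch.isdigit():
            out.append(str(n % 10))
        elif ch.isupper():
            out.append(string.ascii_uppercase[n % 26])
        elif ch.islower():
            out.append(string.ascii_lowercase[n % 26])
        else:
            out.append(ch)  # keep separators and whitespace as-is
    return "".join(out)

# Hypothetical patient record for illustration.
patient = {"name": "Jane Doe", "ssn": "123-45-6789"}
masked = {field: mask_value(text) for field, text in patient.items()}
```

A production tool would add features this sketch omits, such as sensitive-data discovery, consistent masking across tables, and cryptographically sound format-preserving encryption rather than a per-character hash.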

Tweet this: Ponemon research reveals dire risks in Healthcare apps development. Data masking key to mitigate #security risks http://bit.ly/ehxkpd

Quotations

  • "Health Data at Risk in Development: A Call for Data Masking is a wake-up call for the healthcare industry, where the average per-victim cost of a data loss is $294 – a whopping 44 percent higher than the norm across all industries," said Dr. Larry Ponemon, chairman and founder, Ponemon Institute. "Healthcare organizations have achieved great success in safeguarding their data in production environments. Now it is time to act just as resolutely and systematically to protect patient confidentiality and privacy in non-production environments."
  • "Development and test environments have emerged as the new data security battlegrounds, and data masking is proving to be a critical component of any enterprise risk and compliance program," said Adam Wilson, general manager, Application Information Lifecycle Management, Informatica. "Used by some of the largest healthcare companies in the world, Informatica Data Masking provides organizations with a comprehensive solution for data masking across all relevant IT operations."
