US Health Data News



 
Datacom 160 million - Health dept extends Datacom outsourcing deal for $160m
 
Data brokers / diginomica - Data brokers and the implications of data sharing
 
DXC / CMS - DXC books $81M CMS data warehouse support order
 
DE 8-State - Delaware joins eight-state health care data sharing initiative
 
GA DPH Grants - Georgia Department Of Public Health Awarded Grants
 
HHS 2019-11 fine - Texas health department to pay $1.6M for HIPAA violations
 
IBM 100 IT - IBM Announces $100 Million Health IT Program
 
Governance Key - Governance key to creating effective health data warehouse
 
Kaiser Apps - Analysis: App-Happy Health Care Full of Optimism, Money
 
UITALD - Interactive Tools to Assess the Likelihood of Death
 
Michigan DW - Michigan saves $1 million per business day with data warehouse
 
Digital Health Care - The Digital Health Care Environment
 
Medicare Use - Feds to allow use of Medicare data to rate doctors, hospitals and other health care providers
 
Most Expensive - 10 Most Expensive Hospitals in the U.S.
 
NAPHSIS FOD - NAPHSIS Releases New Fact of Death Query Service
 
Premier-IBM - IBM and the Premier healthcare alliance to integrate nation's healthcare data
 
SC PHG - S.C. public health group gets $11.25 million grant
 
TX HHSC #1 - Problem-plagued Texas data project delayed again
 
TX HHSC #2 - Massive Health Data Warehouse Delayed Again
 
TX HHSC #3 - Texas HHSC privacy breach may affect 1.8k individuals
 
UPMC - $100 Million Investment in Sophisticated Data Warehouse and Analytics
 
Veritas / TR - Veritas to Buy Thomson Reuters Health Care Data Management Line
 
VT Data Warehouse - Audit Questions Health Information Exchange Oversight in VT
 


Texas health department to pay $1.6M for HIPAA violations
Becker's Hospital Review - November 9, 2019

Link to original article

The Office for Civil Rights at the HHS slapped the Texas Health and Human Services Commission with a $1.6 million fine for HIPAA violations, according to a Nov. 7 news release.

Specifically, the OCR was penalizing the Department of Aging and Disability Services for its data breach in 2015. The department reorganized into the Texas Health and Human Services Commission in September 2017.

In a report to the OCR, the department indicated that the electronic protected health information of 6,617 individuals was accessible online. Patient data that was exposed included names, addresses, Social Security numbers and treatment information.

The department said that during the move of an internal application from a private server to a public server, a flaw in the software code allowed unauthorized users to access individuals' information. The OCR investigation found that the department failed to conduct an enterprise-wide risk analysis and to implement access and audit controls for its information systems and applications.

"Covered entities need to know who can access protected health information in their custody at all times," said the OCR Director Roger Severino. "No one should have to worry about their private health information being discoverable through a Google search."
 


NAPHSIS Releases New Fact of Death Query Service
Silver Spring, MD (PRWEB) March 01, 2017

Link to original article

The National Association for Public Health Statistics and Information Systems (NAPHSIS) announced today the release of a new Fact of Death (FOD) Query Service http://www.naphsis.org/evvefod, providing credentialed organizations the ability to quickly, reliably, and securely discover if a death record exists. This service is part of the NAPHSIS Electronic Verification of Vital Events (EVVE) System and is the only service in existence with the ability to match authorized queries against the databases of state or local vital record jurisdictions, where all death records in the nation are stored.

"We've been swamped by requests for death data from a variety of industries. Access to complete, timely, and accurate death record data does not currently exist in the United States," says Anthony Stout, manager of EVVE products and services. "EVVE Fact of Death resolves this problem and can save the country and our customers millions, if not billions, of dollars a year."

Before today, the Social Security Administration (SSA) Death Master File (DMF) https://www.ssa.gov/dataexchange/request_dmf.html was the primary source for death record data. However, its usefulness has been severely hampered since November of 2011, when the SSA was no longer allowed to include state protected death records in the DMF. As a result, millions of death records are missing every year from the DMF, making it woefully incomplete and unusable for many organizations requiring this information to help prevent fraud, protect identities, reduce waste, and streamline business processes.

Currently, there are 37 of 57 states and jurisdictions participating in the EVVE FOD service, allowing credentialed users to match against more than 55 million death records. The number of participating jurisdictions is increasing steadily, and all 57 states and jurisdictions across the nation are working to join the EVVE FOD service as soon as possible.

As death record data includes highly sensitive and personal information, confidentiality and security of such data is of utmost importance. To ensure this service utilizes the highest levels of security possible, NAPHSIS has partnered with LexisNexis® VitalChek Network Inc. (VitalChek) http://vitalcheknetwork.com/ to maintain the EVVE Fact of Death Query Service. VitalChek adheres to all major InfoSec standards such as PCI-DSS, SOC 1, SOC 2, and uses public key / private key encryption technology to ensure incoming requests and outgoing results are secure.

An organization that has a valid need for death record data, and belongs to one of the following current categories, may be credentialed to use EVVE Fact of Death:

- Federal - Benefits or Admin
- State/Local - Benefits or Admin
- Pension/Retirement
- Insurance
- Receivables
- Financial

Organizations can become credentialed EVVE Fact of Death users by visiting the website at http://www.naphsis.org/evvefod, clicking on the "Get Started Now" link at the bottom of the page and following the prompts. The process is easy, and qualified customers can expect to be using EVVE Fact of Death within a week. There is a minimal per-record price for credentialed private companies and/or government agencies to use the EVVE Fact of Death service.

About NAPHSIS: The National Association for Public Health Statistics and Information Systems (NAPHSIS) is the national nonprofit organization representing the state vital records and public health statistics offices in the United States. Formed in 1933, NAPHSIS brings together more than 250 public health professionals from each state, the five territories, New York City, and the District of Columbia.

Contact: Anthony N. Stout, Manager - EVVE Products and Services, 301-563-6005 / evvefod(at)naphsis(dot)org
 


Audit Questions Health Information Exchange Oversight in VT
Health IT Interoperability
By Kyle Murphy, PhD
October 06, 2016

Link to original article

An audit of health information exchange activities in Vermont has yielded more questions than answers about healthcare interoperability in the state.

Officials at the Department of Vermont Health Access (DVHA) tasked with overseeing the development of a statewide health information exchange have drawn criticism from the state's auditor for their oversight of millions of dollars in grants and contracts. In a report released late last month, State Auditor Douglas R. Hoffer found DVHA to have fallen short in two areas: evaluating the actions taken by Vermont Information Technology Leaders, Inc. (VITL) — the exclusive operator of the statewide HIE network — and measuring the latter's performance over the previous two fiscal years, FY 2015 and 2016.

According to the report, the state department issued $12.3 million during that time, representing close to one-third of total funding ($38 million) paid to VITL since 2005. Oversight of the VITL HIE contracts and grants fell to both DVHA and the Agency of Administration (AOA).

Deficiencies in oversight have raised doubts about the development of a clinical data warehouse to be used for health data analysis and reporting.

"Although the State assented to VITL building the warehouse, it was not explicitly included in any agreement as a deliverable, nor did the State define its functional and performance requirements. Without such requirements, the State is not in a position to know whether the clinical data warehouse is functioning as it intends," the report states.

Upon closer inspection, the building of the clinical data warehouse casts doubt on the state's handling of its agreements with VITL, which according to the audit relied on unclear language as authorization for the system.

"Even if we accept that this language authorizes the construction of a clinical data warehouse, which we believe is unclear, no evidence was provided to indicate that the State defined the functional and performance requirements of the warehouse," the report reads. "Without such requirements, the State is not in a position to know whether the clinical data warehouse is functioning as it intends."

Uncertainty also extends to the ownership and use of the clinical data warehouse: a lack of explicit language appears to indicate that the state is the licensee of the software used, but its ability to make use of the data is restricted by the healthcare organizations providing the information that comprises the system.

"Accordingly, VITL contends that the agreements do not currently permit VITL to disclose the personal health information in the warehouse to the State and, therefore, the State does not have any rights to access, use, or disclose this data," the report states.

As it turns out, the case of the clinical data warehouse was a microcosm of a much larger issue of poor programmatic and financial oversight of VITL. DVHA often failed to finalize agreements with VITL prior to project start dates — in five of six agreements. These delays had several consequences:

First, having VITL perform work without a signed agreement inhibited the State’s ability to hold VITL accountable to desired standards because they had not been formally documented and agreed upon. Second, the Green Mountain Care Board reported that delays in finalizing VITL’s contracts resulted in uncertainty about what terms would ultimately be agreed to or omitted, what work should be prioritized, and if and how to allocate staff, contractors, and other resources to various projects. Third, because of the four-month delay in signing contract #30205, VITL and the State agreed to eliminate two required deliverables (connecting the Cancer Registry and the Vermont Prescription Monitoring System to the VHIE). VITL also reported that the delays in signing other agreements resulted in a reduction in the number of completed activities (e.g., fewer interfaces were developed) and certain projects being completed later than expected (e.g., the event notification system was delayed four months).

State officials chalked up the delays to difficulties in receiving federal approval.

As for DVHA's measuring of VITL's performance over the previous two fiscal years, the State Auditor concluded that agreements "contained few performance measures" to assess quality or impact.

"While DVHA’s agreements with VITL did contain quantity measures (how much), there were very few quality measures (how well), and no impact measures (is anyone better off). Further, the state’s current Vermont Health Information Technology Plan (VHITP) does not specify any performance measures for gauging the performance of the VHIE," the report states.

The state's audit reveals that state officials have taken steps to address these deficiencies, including requiring more detailed invoices from VITL (which prompted an investigation, still pending, into the allowability of some costs) and the decision by DVHA to fund an impact assessment of VITL's work.

Ultimately, the audit concludes that the state is in no position to determine whether the clinical data warehouse is functioning as intended, nor to measure VITL's performance in developing HIE services that have a positive impact on improving care quality and reducing care costs.

"Without quantifiable performance measures, the State’s ability to judge VITL’s efforts and gauge success is significantly inhibited," it closed.

Given the uncertainty surrounding health information exchange activities, the state of healthcare interoperability in Vermont remains problematic.
 


Health dept extends Datacom outsourcing deal for $160m
By Justin Henry June 25, 2020

Link to original article

Two more years.

The federal Department of Health has extended its IT outsourcing deal with Datacom for a further two years amid the ongoing coronavirus pandemic.

The department handed the company the two-year extension last month at a cost of $159.7 million, bringing the infrastructure and support services deal to $506.3 million over seven years.

It means the contract, which covers the provision, maintenance and refresh of all hardware and software, has now more than doubled in cost since Datacom scooped the deal from IBM in 2015.

The deal also covers a range of enterprise data warehouse services that the department had previously sourced from Accenture.

The extension follows two additional amendments last year, which added $92.9 million ($67.7 million and $25.2 million) to the cost of the contract.

The larger of the two amendments related to an increase in the department’s consumption of services over the term of the contract.

A spokesperson told iTnews that the latest amendment would see the term of the contract pushed out until 30 June 2022.

“The original term of the contract was set to expire on 30th June 2020. The contract has been extended for two years,” the spokesperson said.

“The Department has chosen to exercise a contract extension option available under the contract.”

However, unlike the two amendments last year, the spokesperson said “no new services have been added as part of the extension”.

When Datacom became the incumbent provider five years ago, it helped shift the department to a contemporary outcomes-based model with consumption-based pricing to reduce annual IT costs.

The transition, which took six months, involved establishing a support capability for the department’s enterprise data warehouse, data centres and 490 servers, according to Datacom.

It followed 15 years with a traditional IT services outsourcing model from IBM – a deal that was renewed six times, including one in which ministerial approval was granted to keep it going.

The department currently has an average staffing level (ASL) of 3800.
 


Data brokers and the implications of data sharing - the good, bad and ugly
By Neil Raden July 19, 2019

Link to original article

Summary: The term "data sharing" is expanding, but in a problematic way that raises flags for companies and consumers alike. Neil Raden provides a deeper context for data sharing trends, dividing them into the good, bad and ugly.

The term "data sharing" has, until recently, referred to scientific and academic institutions sharing data from scholarly research.

The brokering or selling of information is an established industry and doesn't fit this definition of "sharing," yet the term is increasingly being applied to it. Scholarly data sharing is mostly free of controversy, but all other forms of so-called sharing present some concerns.

Information Resources (IRI), Nielsen and Catalina Marketing have been in the business of collecting data and selling data and applications for decades, but the explosion of computing power, giant network pipelines, cloud storage and, lately, AI is fertile ground for the creation of literally thousands of data brokers, mostly unregulated and presently a challenge to privacy and fairness:

Currently, data brokers are required by federal law to maintain the privacy of a person's data if it is used for credit, employment, insurance or housing. Unfortunately, this is clearly not scrupulously enforced, and beyond those four categories, there are no regulations (in the US). And while medical privacy laws prohibit doctors from sharing patient information, medical information that data brokers get elsewhere, such as from the purchase of over-the-counter drugs and other health care items, is fair game.

Selling Healthcare Data:

You might assume that your medical records are private and only used for the purposes of your healthcare, but as Adam Tanner writes in How Data Brokers Make Money Off Your Medical Records:

IMS and other data brokers are not restricted by medical privacy rules in the U.S., because their records are designed to be anonymous, containing only year of birth, gender, partial zip code and doctor's name. The Health Insurance Portability and Accountability Act (HIPAA) of 1996, for instance, governs only the transfer of medical information that is tied directly to an individual's identity.

It is a simple process for skilled data miners to combine anonymized and non-anonymized data sources to re-identify people from what is supposed to be protected medical records:

One small step toward reestablishing trust in the confidentiality of medical information is to give individuals the chance to forbid collection of their information for commercial use, an option the Framingham study now offers its participants, as does the state of Rhode Island in its sharing of anonymized insurance claims. "I personally believe that at the end of the day, individuals own their data," says Pfizer's Berger [Marc Berger oversees the analysis of anonymized patient data at Pfizer]. "If somebody is using [their] data, they should know." And if the collection is "only for commercial purposes, I think patients should have the ability to opt out."
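To make the linkage risk concrete, here is a minimal sketch in Python (toy data; every column name and value is hypothetical) of how a re-identification join works: match an "anonymized" medical extract against an identified consumer file on the quasi-identifiers Tanner mentions, such as year of birth, gender and partial zip code.

```python
import pandas as pd

# Hypothetical "anonymized" medical extract: no direct identifiers,
# but quasi-identifiers remain.
medical = pd.DataFrame({
    "birth_year": [1956, 1972, 1989],
    "gender":     ["F", "M", "F"],
    "zip3":       ["191", "750", "981"],   # partial zip code
    "diagnosis":  ["E11", "I10", "F41"],   # ICD-10 codes
})

# Hypothetical identified consumer file (e.g., a marketing list).
consumer = pd.DataFrame({
    "name":       ["A. Jones", "B. Smith", "C. Lee"],
    "birth_year": [1956, 1972, 1989],
    "gender":     ["F", "M", "F"],
    "zip3":       ["191", "750", "981"],
})

# The "attack" is just a join on the shared quasi-identifiers.
reidentified = medical.merge(consumer, on=["birth_year", "gender", "zip3"])
print(reidentified[["name", "diagnosis"]])
```

In real datasets the join keys are noisier, but with enough overlapping fields a match like this often narrows to a single person.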

There are also legitimate data markets that gather and curate data responsibly. Most notable lately is Snowflake, which I'll cover below. Others are Datamarket.com, which is now part of QLIK, Azure Data Marketplace (Microsoft) and InfoChimps.com.

One I can't get my arms around is Acxiom. They are a $1B business that collects all sorts of information about people in 144 million households. Apparently their business is creating profiles so advertisers can target you more accurately. That seems innocent enough, but I don't know if that's the whole story. However, about five years ago, Acxiom launched https://aboutthedata.com/portal which allows you to see what data they have about you.

Even more remarkable, you can correct mistakes and you can opt out. According to Acxiom, though, if you do opt out, you can expect to get a lot of ads you're not interested in. Keep in mind, though, that this business is still unregulated, so it would take an investigative reporter to validate these claims.

Then there is this: Acxiom, a huge ad data broker, comes out in favor of Apple CEO Tim Cook's quest to bring GDPR-like regulation to the United States:

In the statement, Acxiom said that it is "actively participating in discussions with US lawmakers" on consumer transparency, which it claims to have been voluntarily providing "for years." Still, the company denied that it partakes in the unchecked "shadow economy" which Cook made reference to in his op-ed.

The good - let's start with data.gov

From Wikipedia: Data.gov is a U.S. government website launched in late May 2009 by the then Federal Chief Information Officer (CIO) of the United States, Vivek Kundra. Data.gov aims to improve public access to high value, machine readable datasets generated by the Executive Branch of the Federal Government. The site is a repository for federal, state, local, and tribal government information, made available to the public. Data.gov has grown from 47 datasets at launch to over 180,000 (actually now over 250,000).

(The original article includes a chart conveying the vastness and variety of free, open and curated data on data.gov.)
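Since the catalog is machine readable, it can also be searched programmatically. As a small illustration, the sketch below queries catalog.data.gov's public CKAN search endpoint for health datasets; the query string and row limit are arbitrary choices.

```python
import json
import urllib.request

# Search data.gov's public CKAN catalog for health-related datasets.
url = "https://catalog.data.gov/api/3/action/package_search?q=health&rows=5"
with urllib.request.urlopen(url) as resp:
    payload = json.load(resp)

print("matching datasets:", payload["result"]["count"])
for dataset in payload["result"]["results"]:
    print("-", dataset["title"])
```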

Don't confuse this with: The Open Data Initiative

The Open Data Initiative (ODI) is a joint effort to securely combine data from Adobe, Microsoft, SAP, and other third-party systems in a customer's data lake. It is based on three guiding principles:
- Every organization owns and maintains complete, direct control of all their data
- Customers can use AI to get insights from unified behavioral and operational data
- Partners can easily leverage an open and extensible data model to extend solutions

ODI is an ambitious effort with admirable goals, but it is not the subject of this article.

The bad

Epsilon (recently acquired in April 2019 for $4.4B) refused to give a congressional committee all the information it requested, saying: "We also have to protect our business, and cannot release proprietary competitive information." Among the data Epsilon holds is information on people who are believed to have medical conditions such as anxiety, depression, diabetes, high blood pressure, insomnia, and osteoporosis.

Sprint, T-Mobile, and AT&T said they were taking steps to crack down on the "misuse" of customer location data after an investigation this week found how easy it was for third parties to track the locations of customers. (Misuse? They SOLD the data).

Experian sold Social Security numbers to an identity theft service posing as a private investigator.

The ugly

Optum. The company, owned by the massive UnitedHealth Group, has collected the medical diagnoses, tests, prescriptions, costs and socioeconomic data of 150 million Americans going back to 1993, according to its marketing materials. Since most of this is covered by HIPAA, they are very clever in getting around the regulations. But that socioeconomic thing is a real red flag.

What it means, at the very minimum, is the use of the "Social Determinants": income and social status, employment, childhood experiences, gender, genetic endowment. That's just the start. You have to ask yourself: why would anyone want to use this information? Life insurance, car insurance, mortgage, education, adoption, personal liability insurance, health insurance, renting, employment…there is no end to it, and you will never know what's in there.

The World Privacy Forum found a list of rape victims for sale. At one data broker, the group also found brokers selling lists of AIDS patients, the home addresses of police officers, a mailing list for domestic violence shelters (which are typically kept secret by law) and a list of people with addictive behaviors towards drugs and alcohol.

Tactical Tech and artist Joana Moll purchased 1 million online dating profiles for 136€ from USDate, a supposedly US-based company that trades in dating profiles from all over the globe.

Snowflake's Data Sharing

Snowflake is a cloud-native data warehouse offering. Their secret sauce is the separation of data from logic. So, taking Amazon as an example (Snowflake also runs on Microsoft Azure and, shortly, Google Cloud): your data will reside in S3, where costs are asymptotically approaching zero, and you basically only pay for processing on EC2. Everything works as a "virtual data warehouse," meaning you create abstractions over the data and nothing moves or is copied. You can have thousands of virtual data warehouses over a single copy of the data.

I don't know this for sure, but I suspect Snowflake, despite their success, saw the need to create some other technology, as data warehouses are a limited market. What they came up with was using their existing technology to provide a mechanism for data providers to locate their data in a Snowflake region and allow others to "rent" it without copying or downloading it. Besides the obvious productivity and cost savings, Snowflake added features to their data sharing product, including some level of curation and verification of the data. I get the impression this is still a work in progress.

And, because all access to data is through (virtual) data warehouse views, integration of data sources, reference data and a level of semantic coherence - all qualities of a data warehouse - are there. In contrast to a bucket of bits you can download and wrangle later, this seems like a good idea to me.
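For a sense of what consuming shared data looks like in practice, here is a minimal sketch using the snowflake-connector-python client. The account, share, database and table names are all hypothetical, and it assumes the provider has already granted the share to the consumer's account.

```python
import snowflake.connector

# Connect as the data consumer (placeholder credentials).
conn = snowflake.connector.connect(
    user="CONSUMER_USER",
    password="***",
    account="consumer_account",
)
cur = conn.cursor()

# Mount the provider's share as a read-only database.
# Nothing is copied or downloaded; the data stays with the provider.
cur.execute(
    "CREATE DATABASE IF NOT EXISTS shared_health "
    "FROM SHARE provider_acct.health_share"
)

# Query the shared data in place, paying only for your own compute.
cur.execute(
    "SELECT state, COUNT(*) FROM shared_health.public.claims GROUP BY state"
)
for row in cur.fetchall():
    print(row)

conn.close()
```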

I asked Justin Langseth, Snowflake's CTO, if he was concerned about criminal, civil or even ethical exposure for Snowflake from the data provided. His emailed response was:

Legally no we're just the communication platform, the provider of the data is responsible for their data... but we are looking at some tools that can detect hidden bias in models and data though, so it is an area of interest. Should this be enough of a reason to not have people share data? There's tons of social good that can come from this as well.

The problem with his response is two-fold: first, not just the data but also any calculations and modeling a customer does take place within Snowflake. Second, legal responsibility is an abstract term. You may not be legally responsible, but you may still be charged or sued and have to defend yourself, with an uncertain outcome.

Besides all of these issues, I'm wondering how many companies have data someone else would want to buy. If you dig into data lakes, the volume comes from things like log files, which would be useless without context, and imported data, which may not be resalable anyway. Between data.gov and Google and Facebook et al., is there really a market for this? I'm also thinking about edge data: how would you package that, given that the trend is not to bring it back to the cloud (though I still don't understand how you do machine learning at the edge)?

Langseth also recently posted an article on Medium covering the "hardest" issues data marketplaces will face:
1. Faked and doctored data
2. Sales of stolen, confidential, and insider data
3. Piracy by buyers of data
4. Big data can be really big
5. Data is fast
6. Data quality can be questionable
7. Lack of metadata standards

And in conclusion, he asks: so how do IOTA, SingularityNET, and Datum address these issues?

Mostly they don't, at least so far. Most of the projects working on decentralized data marketplaces have simply not hit these issues yet as they are just in a test mode on a test network. To the extent they have thought about the trust-oriented issues, most of them propose either a reputation system or a centralized validation authority. Reputation systems for data marketplace are highly prone to Sybil attacks (large #'s of fake accounts colluding), and if you need a centralized authority forever you're defeating the purpose of a decentralized crypto system and may as well do everything the old way.

My take

The battle for privacy is already lost. Once data is out, it's gone. Stemming the flow of current data could eventually dilute the value of the data brokers, but that requires regulation, which is unlikely in the USA. Reining in data brokers, who exist in the shadows (as opposed to, say, a polluting coal-fired power plant), will require digital enforcement and for-good trolls sniffing out the bad guys. The only question is, who will pay for the development and operation?
 


DXC books $81M CMS data warehouse support order
By Ross Wilkers | Dec 20, 2017

Link to original article

DXC Technology has won a one-year, $81.6 million task order with the Centers for Medicare and Medicaid Services for enterprise IT services to help operate the main portion of CMS’ data warehouse.

CMS received five offers for the order it awarded via the National Institutes of Health’s $20 billion CIO-SP3 contract vehicle, according to Deltek data.

The company will be responsible for the integrated data repository's information systems architecture and data models. DXC said in a release it will also carry out extract, transform and load (ETL), user support, data quality and support functions.

Within the data repository is a Hadoop and Teradata enterprise data warehouse that handles data related to CMS’ program benefits.

Task order work will aim to ensure that the data and data services provided by the repository for Part A and Part B claims are "payment grade," CMS said in an October 2016 sources sought notice.

CMS defines payment grade as automated validation that the data loaded into the repository is exactly as it was received from the sending source, together with the installation and enforcement of internal controls to maintain separation of duties.

The agency also determines payment grade based on automated reporting and reconciliation processes in place to confirm the data that was loaded. In most cases CMS expects the reporting to be automated but all reconciliation is manual.
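CMS's actual tooling isn't described beyond this, but as a rough sketch of the idea, payment-grade validation reduces to proving that what was loaded matches what was received, for example by comparing record counts and checksums and flagging any difference for manual reconciliation (the file names and format here are hypothetical):

```python
import hashlib

def file_fingerprint(path):
    """Return (record count, SHA-256 digest) over a file's raw records."""
    digest, count = hashlib.sha256(), 0
    with open(path, "rb") as f:
        for record in f:
            digest.update(record)
            count += 1
    return count, digest.hexdigest()

# Hypothetical check: fingerprint the inbound claims extract before loading,
# then re-export the loaded rows in the same order and fingerprint again.
received = file_fingerprint("claims_inbound.dat")
loaded = file_fingerprint("claims_as_loaded.dat")

if received == loaded:
    print("payment grade: load matches source exactly")
else:
    # Per CMS, reporting is automated but reconciliation itself is manual.
    print(f"discrepancy flagged: received {received}, loaded {loaded}")
```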

DXC is in the process of separating and merging its U.S. government business into Vencore and KeyPoint Government Solutions to create a new, publicly-traded company.
 


Delaware joins eight-state health care data sharing initiative
By Nick Ciolino - Jun 25, 2018

Link to original article

Delaware is now part of an initiative to share best practices for collecting and using health care data.

The National Governors Association created the project, which also includes Arkansas, Colorado, Indiana, Iowa, Minnesota, Vermont and Washington. It seeks to determine the best use of data analytics to inform Medicaid and other state health spending policy.

Dr. Elizabeth Brown is Medical Director of the Delaware Division of Medicaid and Medical Assistance. She says state health officials have set up a data warehouse meant to inform Delaware’s decisions as the state moves from a volume-based to a value-based health system and sets a healthcare spending benchmark. She adds this is an opportunity to share what the First State has learned and get input from other states.

“We’re going to take a step back, look at all of our data systems, look at what best practices are across the country and make sure we are aligning with those best practices,” said Brown.

As a state where the cost of health care is growing faster than its economy, data plays a large role in Delaware’s health spending policy.

But Brown says it’s important to realize the strengths and weaknesses of the data that’s available.

“And that’s actually one of the reasons that projects like this are so important,” she said. “We are analyzing what we can get out of data, what the questions that can be answered accurately and completely with our data are, and where we need to be thinking outside of just the claims data.”

With the support of the NGA, the state health systems will be sharing data techniques with one another over the next 16 months, but will not share the data itself to protect patient privacy.

About 230,000 Delawareans receive Medicaid.
 


Texas HHSC privacy breach may affect 1.8k individuals
Written by Julie Spitzer | June 20, 2017

Link to original article

The Texas Health and Human Services Commission notified clients after discovering a box containing protected health information outside an unsecured dumpster belonging to a commission eligibility office.

The forms in the box — which included client information of 1,842 people in the Houston area — may have contained information such as names; client numbers; dates of birth; case numbers; phone numbers; mailing addresses; Social Security numbers; health information; and bank account numbers. HHSC is offering those affected by the breach one year of free credit monitoring services, although the agency currently has no evidence that anyone viewed the information, Texas HHSC Assistant Press Officer Kelli Weldon confirmed to Becker's Hospital Review via e-mail.

Ms. Weldon said HHSC is reviewing its processes and procedures for disposing of documents that contain private information to prevent this type of incident from occurring in the future.
 


Massive Health Data Warehouse Delayed Again, A Decade After Texas Pitched It
The Texas Tribune
By Jim Malewitz and Edgar Walters
August 15, 2016

Link to original article

Texas health regulators are starting from scratch in designing a system to store massive amounts of data — after spending millions of dollars trying to roll out a version that’s now been scrapped.

Charles Smith, executive commissioner of the Texas Health and Human Services Commission, said Monday that his agency had recently nixed a $121 million contract to create an Enterprise Data Warehouse, an enormous database that would store a wide range of information about the many programs the agency administers. First funded in 2007, the project was expected to be up and running a few years after.

Because the original design would not link enough programs at the sprawling agency, regulators would essentially start from scratch on a much larger — and therefore more useful — system, Smith told members of the Texas House State Affairs Committee at a hearing on state contracting reform efforts.

"We were in the process of building a two-bedroom, two-bath home," he said, likening the effort to a home construction project. "You get it ready to prep your foundation, and I realize my spouse is pregnant with quadruplets."

The most recent design, which was largely focused on storing data on Medicaid and the Children’s Health Insurance Program, "isn’t going to meet the needs of our family," he added.

The update stirred concerns from some lawmakers about the lack of progress on a pricey project with a troubled history.

"Thirty-five million dollars we’ve spent on a project that was supposed to cost $120 million. For that, we have nothing?" asked Rep. Dan Huberty, R-Houston.

"Are we getting back to where we started?" asked Rep. Four Price, R-Amarillo.

Texas has spent $35 million on the project so far, with most coming from federal funds, said Smith, who was appointed to his post in May. About $6 million was tapped from state funds.

Smith did not have an estimate about how much the new, larger project would cost, because those assessments won’t begin until next fall — after the legislative session that begins in January.

He pushed back against suggestions that spending thus far was for naught, noting that the agency — as part of the planning process — had moved to a new software system that would be used in the new data warehouse.

"We’ll go through and develop a plan, and a timeline, and we’ll come back next session with everything we need to obtain through the process," he said.

Since the project was first funded, it has suffered myriad delays, as well as uncertainty about whether the federal government would pitch in with additional funding.

In 2013, the Health and Human Services Commission finally invited private companies to submit proposals for the contract. The next year, state officials chose Truven Health Analytics, a Michigan-based firm, as their tentative winner.

But after a series of contracting scandals at the agency prompted the resignation of several high-ranking officials, the state started over, and in November 2014 asked companies to re-apply for the funding.

Those proposals were due in February 2015, and state officials anticipated the project would begin on Sept. 1 of that year, according to the state’s latest published timeline for the project.

At the time, a spokeswoman for the health commission told the Houston Chronicle that the quality of the project was “more important than the timeline.” The agency nonetheless said it was “still possible” the project would be up and running by the end of 2015.

Smith said his agency needed a warehouse that would give his agency instant access to more data than the scrapped plans accounted for — such as information related to foster care.

"I’m talking to our staff about what is the capacity of our system," he said. "We don’t know how many families are willing and able."

Such concerns come at a time when his agency is growing in size and scope. Three of the state’s five health and human services agencies are consolidating into a single "mega-agency" — a reorganization ordered by state lawmakers in 2015.

The other two agencies, which oversee the state’s foster care system and public health infrastructure, respectively, will be considered for consolidation in 2017. State leaders have said that changing the Health and Human Services Commission’s configuration would streamline services and improve efficiency.

Some lawmakers took heart that Smith had refused to follow through with the warehouse’s original design, calling it a thoughtful approach.

"It sounds like the contract was inadequate," said Rep. Byron Cook, a Corsicana Republican who chairs the State Affairs Committee. "I appreciate that."
 


Problem-plagued Texas data project delayed again
Houston Chronicle
By Brian M. Rosenthal
Tuesday, June 28, 2016

Link to original article

AUSTIN -- Texas state health officials once again are delaying a massive data project that has struggled to get off of the ground for more than a decade.

The state Health and Human Services Commission informed lawmakers Tuesday it was pausing the "Enterprise Data Warehouse" project, a plan for an elephantine database housing dozens of information sets about everything from welfare benefits to Medicaid.

"HHSC and the other Health and Human Services agencies are going through a transformation process..." the commission explained to the lawmakers. "Therefore, we are reevaluating our long-term data needs and want to ensure the best investment of state resources."

In a separate letter to the company that was set to run the project, the state officials said they would "revisit this necessary project after the transformation process has been substantially completed."

The commission said it was canceling the contractor solicitation process altogether, which means that even if officials decide to restart the project, it will be years before a vendor is chosen.

The decision is the latest twist in a project that has experienced an almost-comical series of setbacks and controversies.

First discussed in 2005, the project was envisioned as a way to improve services and spur savings through better data analysis. Lawmakers funded the project in 2007, calling for it to be operational by February of 2009.

Over the years, state budget writers have set aside more than $100 million for the project -- money that could not be used elsewhere -- and spent more than $12 million, mostly on consultants.

After a slew of delays caused by both the state and federal governments, the health commission thought it finally had gotten the project on track in the spring of 2014, when officials began negotiating a contract with Truven Health Analytics of Ann Arbor, Michigan.

Then came the eruption of a contracting scandal over alleged favoritism by commission officials toward another data company, 21CT of Austin. In a meeting in August of 2014, commission lawyer Jack Stick, who already had steered a Medicaid fraud detection project to 21CT, seemed to imply that that company could do the Enterprise Data Warehouse for less money than Truven.

Two weeks later, negotiations with Truven were over. The commission blamed the company's asking price and said there had been a leak that led Truven to learn about Stick's comment.

Stick and four other commission officials eventually resigned in connection with the 21CT scandal, and the Medicaid fraud project was canceled.

The data warehouse project was put out for bid again in November of 2014.

This February, the health commission disclosed that Truven once again had emerged as the winning bidder and would be given a $104 million contract -- nearly $35 million less than what was being discussed in 2014, said agency spokesman Bryan Black.

"The Health and Human Services Commission is excited the contract is signed and we are moving forward," Black said in February.

The fate of the contract may have shifted when former Executive Commissioner Chris Traylor retired last month. His replacement, Charles Smith, opted for the new approach, records show.



