DMVitals: A Data Management Assessment Recommendations Tool
Poster
To develop an understanding of how science and engineering researchers at the University of Virginia (UVa) manage their research data, the UVa Library’s Scientific Data Consulting (SciDaC) group began a series of research data interviews. The goals of the data interview process include identifying common research data problems, identifying research support needs, and providing recommendations for improving data management. In practice, however, providing objective suggestions for data management practices proved troublesome: it was difficult to make reliable, customized recommendations and to remain objective in a timely fashion.
In response to these challenges, the UVa SciDaC group developed a system, DMVitals, that easily and objectively rates the current state of a researcher’s data management practices. Using best practice statements from UVa sources (Information Technology Services’ Risk Management Program and SciDaC guidelines) and the Australian National Data Service’s (ANDS) long-term sustainability scoring model, the system compares the information collected during the data interview process against these data management best practice statements. The model then correlates the researcher’s data management practices with the eight data management practice components developed by the SciDaC group: File Formats and Data Types; Organizing Files; Security/Storage/Backups; Funding Guidelines; Copyright & Privacy/Confidentiality; Data Documentation & Metadata; Archiving & Sharing; and Citing Data.
To provide a framework for comparing and improving departmental data management practices, we averaged the scores of the data management best practice statements and compared the result to Crowston and Qin’s Capability Maturity Model (CMM). Using this model as a basis, the data management maturity levels are defined as: Level 0: Initial (current practices that can be counter-productive or even “risky” from a security standpoint); Level 1: Managed (the researcher begins to uniformly apply some of the lower-level, easier best practices, truly starting to “manage” the situation); Level 2: Defined (the researcher further “defines” their DM practices); Level 3: Quantitatively Managed (the researcher begins to use central and outside services to manage their data); and Level 4: Optimizing (the researcher continually improves their data management practices).
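The averaging and level-mapping step described above can be sketched as follows. The cut-off thresholds, the 0–1 compliance scale, and the function name are illustrative assumptions; the abstract does not publish the tool’s actual scoring rubric, only the five level names.

```python
# Hypothetical sketch of a DMVitals-style maturity rating.
# The thresholds and the 0-1 compliance scale are assumptions,
# not the tool's published values.

def maturity_level(scores):
    """Average per-statement compliance scores (each 0-1) and map the
    mean onto a CMM-style maturity level from 0 to 4."""
    avg = sum(scores) / len(scores)
    # Illustrative, evenly spaced cut-offs between the five levels.
    thresholds = [0.2, 0.4, 0.6, 0.8]
    level = sum(avg >= t for t in thresholds)
    labels = ["Initial", "Managed", "Defined",
              "Quantitatively Managed", "Optimizing"]
    return level, labels[level]
```

A researcher meeting half of the assessed best practice statements would, under these assumed cut-offs, land at Level 2 (“Defined”), for example: `maturity_level([0.5, 0.5])` returns `(2, "Defined")`.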
The strength of the DMVitals tool is in generating tasks customized to each researcher. These tasks can then be grouped into phases, creating a data management implementation plan for each researcher based on their data interview and subsequent information gathering. The DMVitals assessment tool differs from the Digital Curation Centre’s (DCC) CARDIO in that its focus is not on consensus and collaboration among the individuals responsible for the research data (PI, IT, data manager, etc.).
Once this tool is fully integrated into the existing UVa SciDaC data interview and data management plan process, it will expedite the recommendation report process by providing actionable feedback that researchers can use immediately to improve the sustainability of their data.
Data Management, Research Assessment, Data Interview
All rights reserved (no additional license for public reuse)
Lake, Sherry, and Susan Borda. "DMVitals: A Data Management Assessment Recommendations Tool." 7th International Digital Curation Conference, Bristol, UK. 2011.
University of Virginia
December 6, 2011
The DMVitals Tool was demonstrated along with the poster. The DMVitals poster won 2nd place in the "Best Poster" competition.