Data Digest: Data Quality Teamwork, Modern Data Integration, Data Deduplication Best Practices
What high-quality data means and how to achieve it by working as a team, tips on de-duplicating data across multiple areas of the enterprise, including cloud servers, and a new paradigm for data integration.
- By Quint Turner
- February 26, 2016
Data quality is hard to maintain given the constant flood of new data. Even so, improving data quality pays off across every area of the enterprise. This article examines what high-quality data means and how to achieve it by working as a team.
Read More at Database Journal
When unique data already takes up so much precious storage space, duplicated data is a real burden. De-duplication is the process of eliminating duplicate copies of data: useful, but difficult. This article offers tips on de-duping multiple areas of the enterprise, including cloud servers; a rough sketch of the basic idea follows the link below.
Read Full Story at Computer Technology Review
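As a simple illustration of the general concept (not taken from the linked article), here is a minimal sketch of record-level de-duplication in Python. The field names and sample data are hypothetical; real enterprise de-duplication operates at the block or file level as well, but the hashing idea is similar.

```python
import hashlib

def dedupe_records(records, key_fields):
    """Drop duplicate records by hashing a normalized view of selected fields."""
    seen = set()
    unique = []
    for record in records:
        # Normalize the chosen fields so trivial differences (case, whitespace)
        # don't hide true duplicates.
        normalized = "|".join(str(record.get(f, "")).strip().lower() for f in key_fields)
        digest = hashlib.sha256(normalized.encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(record)
    return unique

# Hypothetical example: customer rows duplicated across two systems.
customers = [
    {"name": "Ada Lovelace", "email": "ada@example.com"},
    {"name": "ada lovelace ", "email": "ADA@example.com"},  # same person, messier copy
    {"name": "Grace Hopper", "email": "grace@example.com"},
]
print(dedupe_records(customers, key_fields=["name", "email"]))  # keeps 2 records
```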
Like de-duping, data integration is an important data preparation step that can be overwhelming, and the old approach to data integration is no longer cutting it. This article presents a new paradigm for data integration; it may look similar to the old one, but it is far more effective. A rough sketch of what integration involves follows the link below.
Read More at Big Data Quarterly
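To make the term concrete (this is an illustration only, not the paradigm the linked article describes), a minimal Python sketch of integrating two hypothetical sources with different field names into one common schema:

```python
# Map each source's fields onto a shared target schema before combining.

def from_crm(row):
    # The hypothetical CRM export uses "full_name" / "mail".
    return {"name": row["full_name"], "email": row["mail"], "source": "crm"}

def from_billing(row):
    # The hypothetical billing export uses "customer" / "email_address".
    return {"name": row["customer"], "email": row["email_address"], "source": "billing"}

crm_rows = [{"full_name": "Ada Lovelace", "mail": "ada@example.com"}]
billing_rows = [{"customer": "Grace Hopper", "email_address": "grace@example.com"}]

integrated = [from_crm(r) for r in crm_rows] + [from_billing(r) for r in billing_rows]
print(integrated)
```

In practice this mapping step is where schema conflicts, unit mismatches, and duplicate records surface, which is why integration and de-duplication are usually handled together during data preparation.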
About the Author
Quint Turner is an editorial intern at TDWI and an undergraduate English student at Skidmore College. Follow his blog at pungry.com.