
The Big Data Wave: 1980s Redux?

The RDBMS challenges of the 1980s are being replayed in the world of big data.

Anyone who has been involved with technology for a few decades -- and younger folks who appreciate history -- will find it interesting to hear how similar today's wave of big data technology is to the advent of relational databases and reporting tools in the late 1980s and early 1990s.

Prior to the arrival of the relational DBMS, products used proprietary database formats, each with its own access tools. Worse, they were closed to access from all but a few specific applications. Organizations needed a way to open them up and make it easier to report on and analyze their data.

During the early 1980s, the RDBMS arose and established itself as the preferred way to store and organize data. It was designed from the start around an open standard: SQL, the Structured Query Language, which provided a common method of pulling data from structured data stores. The idea behind this universal language was to foster innovation, competition, and rapid improvement that would benefit all users of data.
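To illustrate the appeal of that universal language, here is a minimal sketch of the kind of declarative query SQL made possible. The table and column names are hypothetical, but any relational database of the era (or today) would run something equivalent.

    -- Hypothetical example: total 1992 sales by region, expressed as one
    -- declarative statement instead of a custom report program.
    SELECT region, SUM(amount) AS total_sales
    FROM sales
    WHERE order_year = 1992
    GROUP BY region
    ORDER BY total_sales DESC;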

During this time, access to the technology was limited to technical specialists. After all, you literally had to learn a new language -- essentially a programming language. Business users had to hand written requirements and specifications to a SQL programmer, who would then create the desired report.

This led to a time-consuming and painful back-and-forth process; it's usually difficult for business users to know what they want until they actually see something. Because almost every business question needed a new report, queues of report requests would mount, adding to the frustration and inefficiency.

In these early days, report builders used simplistic tools to extract information and deliver it in a static, tabular report format. Soon, more user-friendly tools came on the scene and began to lower the technical barriers; at the very least, they allowed reporting specialists to develop reports more quickly without having to be SQL experts. Crystal Reports is a famous example, popular in part because of its affordability and accessibility.

The parallels to the development of big data technologies are striking. After a few years of competing proprietary storage and access formats, the Hadoop platform -- an open industry standard -- has become well established. This is helping accelerate the advancement of reporting tools, which nowadays means dashboards and data visualizations.

Big data technology is still highly complex, with significant barriers to accessibility. Just getting started requires developers with broad knowledge of open source and/or commercial software, as well as data scientists -- a new role for domain and data experts who can also write code.

However, because there aren't good alternatives yet, people are using old tools that treat big data stores as if they were just another data warehouse -- the mature but somewhat antiquated descendant of the RDBMS. Because big data largely consists of semistructured datasets stored in nonrelational formats, this approach is not adequate.
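As a rough sketch of that mismatch, consider semistructured JSON event data queried through a SQL-on-Hadoop engine such as Apache Hive. The file path, table name, and fields here are hypothetical, and the JSON SerDe shown is one common (but not the only) way to expose such data to SQL tools.

    -- Hypothetical sketch: exposing semistructured JSON events in HDFS
    -- to SQL tools by forcing them into a flat, relational-looking table.
    CREATE EXTERNAL TABLE web_events (
      user_id STRING,
      event   STRING,
      ts      BIGINT
    )
    ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
    LOCATION '/data/events/';

    -- The query looks relational, but nested or variable fields in the
    -- underlying JSON are lost or awkward to reach this way.
    SELECT event, COUNT(*) AS event_count
    FROM web_events
    GROUP BY event;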

Fortunately, in 2016 new tools are already emerging that use big data infrastructure and its inherent advantages to handle much larger oceans of data. Their creators have learned the lessons of the past and are building for business users' ad hoc demands -- a big improvement over the decades it took the old RDBMS world to reach the same place!

About the Author

Luke Liang is CEO and cofounder of InetSoft, a business intelligence software company. You can reach him at [email protected].
