CASE STUDY - SEC Raises Its IQ to Protect Investors and Maintain Market Integrity
Commentary by Lewis Walker, Assistant Director for Application Development in the Office of Information Technology, U.S. Securities and Exchange Commission
The U.S. Securities and Exchange Commission (SEC) is charged with protecting investors and maintaining the integrity of the securities markets. The laws and rules that govern the securities industry in the United States are based on a straightforward concept: all investors, whether large institutions or private individuals, should have access to certain basic facts about a prospective investment prior to buying it.
To address this mission, the SEC requires public companies, stock exchanges, broker-dealers, investment advisors, mutual funds, and public utility holding companies to disclose meaningful financial and other information to the public so investors can judge for themselves whether a company’s securities are a good investment. The SEC monitors the activities of these organizations to ensure they are complying with securities laws. Each year, the SEC brings between 400 and 500 civil enforcement actions against individuals and companies that are not in compliance.
High Transaction Volumes Create Multiple Challenges
The SEC tracks the daily stock transactions conducted by the many brokerage houses in the United States. It uses the information it gathers to look for suspicious activity. When it discovers such activity, it initiates an investigation.
With billions of shares traded on the various stock exchanges each day, the SEC needs the ability to collect and analyze large volumes of data. Until fairly recently, the Commission did this with a mainframe-based system comprising an Adabas database and numerous applications written in Natural, COBOL, Java, and Sybase PowerBuilder. Increasingly, it found the system labor intensive and costly to maintain, and limited in its ability to handle the ever-growing volume of data, facilitate data analysis, and protect data in the event of a system failure.
To simplify its systems while eliminating performance and analytical constraints, the SEC decided to consolidate its information infrastructure. It migrated to a Sun Solaris platform running Sybase Adaptive Server Enterprise (ASE), and adopted a common application development environment.
The resulting solution was an ASE-powered data warehouse employing Business Objects as the front-end analysis tool.
Growing Data Volume Creates Additional Challenges
When the SEC migrated off the mainframe, it had less than a terabyte of data under management. With its new data warehouse in place, however, it began developing new applications that generated copious amounts of new data to be managed and analyzed.
The Commission realized it now faced a new challenge: how to gather, maintain, and rapidly analyze much larger volumes of data without increasing storage costs.
It found its answer in Sybase IQ, Sybase’s column-oriented analytic engine, noted for its storage efficiency. “We were going for high-volume data, speed, and savings on storage,” explains Samuel Foster, president of FosterSoft, Inc., an SEC IT contractor, “and replacing Sybase’s ASE general-purpose database with Sybase IQ enabled us to achieve that.”
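To illustrate the kind of analytic workload involved, the sketch below runs a hypothetical volume-spike query against the warehouse over JDBC. The table and column names (daily_trades, broker_id, symbol, trade_date, shares), the connection settings, and the thresholds are illustrative assumptions rather than the SEC’s actual schema; the point is simply the shape of the high-volume aggregate query a column-oriented engine such as Sybase IQ is built to answer quickly.

    import java.sql.Connection;
    import java.sql.Date;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    // Hypothetical sketch: flag broker/symbol pairs whose share volume on a
    // given trading day exceeds an analyst-chosen threshold. The schema and
    // connection details are assumptions for illustration, not the SEC's.
    public class VolumeSpikeQuery {
        public static void main(String[] args) throws Exception {
            String url = System.getenv("WAREHOUSE_JDBC_URL");      // placeholder JDBC URL
            String user = System.getenv("WAREHOUSE_USER");
            String password = System.getenv("WAREHOUSE_PASSWORD");

            String sql =
                "SELECT broker_id, symbol, SUM(shares) AS day_shares "
                + "FROM daily_trades "
                + "WHERE trade_date = ? "
                + "GROUP BY broker_id, symbol "
                + "HAVING SUM(shares) > ? "
                + "ORDER BY day_shares DESC";

            try (Connection conn = DriverManager.getConnection(url, user, password);
                 PreparedStatement stmt = conn.prepareStatement(sql)) {
                stmt.setDate(1, Date.valueOf("2006-03-15"));   // illustrative trading day
                stmt.setLong(2, 10_000_000L);                  // illustrative share threshold
                try (ResultSet rs = stmt.executeQuery()) {
                    while (rs.next()) {
                        System.out.printf("%s %s %,d shares%n",
                            rs.getString("broker_id"),
                            rs.getString("symbol"),
                            rs.getLong("day_shares"));
                    }
                }
            }
        }
    }

Because a column store reads only the columns a query names and compresses them heavily, scans like this over billions of trade rows avoid touching the full records, which is broadly where gains like the query-time and storage improvements reported below come from.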
Ensuring Data Availability
There was still the matter of ensuring data availability in the event of a system failure or service disruption. The SEC addressed this by building a mirror image of its primary system at a remote site 12 miles away. With this new disaster recovery system in place, failover to the secondary system takes just a few minutes, essentially providing continuous business operations.
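On the application side, a warm-standby arrangement like this can be presented to client code simply as an ordered list of connection targets. The sketch below, which reuses the hypothetical connection settings from the earlier example, shows one minimal way a reporting job might fall back to the mirror site if the primary is unreachable; it is an assumption about how a client could behave, not a description of the SEC’s actual failover mechanism, which operates at the system level.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;

    // Minimal sketch of client-side failover between a primary warehouse and
    // its remote mirror. The URLs are placeholders; the failover described in
    // the article is handled by the disaster recovery system itself.
    public class WarehouseConnector {
        private static final String[] JDBC_URLS = {
            System.getenv("PRIMARY_JDBC_URL"),     // primary site
            System.getenv("SECONDARY_JDBC_URL")    // mirror site 12 miles away
        };

        public static Connection connect(String user, String password) throws SQLException {
            SQLException lastFailure = null;
            for (String url : JDBC_URLS) {
                try {
                    return DriverManager.getConnection(url, user, password);
                } catch (SQLException e) {
                    lastFailure = e;               // primary unreachable; try the mirror
                }
            }
            throw lastFailure;                     // both sites unreachable
        }
    }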
Dramatic Results
The migration off the mainframe, the implementation of the new data warehouse, and the creation of a disaster recovery system have paid off handsomely for the SEC.
“We were going for high-volume data management, rapid analytical performance, savings on storage, and a robust and speedy disaster recovery capability. This new information infrastructure has enabled us to achieve all of that,” says Lewis Walker, assistant director for application development in the SEC’s Office of Information Technology.
Results to date include a 35 percent improvement in query response times, the ability to create and run more complex queries, a 50 percent reduction in storage requirements and costs, no unplanned downtime, a robust and rapid disaster recovery capability, and an infrastructure that will accommodate future growth.
Several terabytes of data are now stored in the SEC data warehouse, where they are accessed by hundreds of SEC investigators and analysts. The next step is to make the data available to the Commission’s economists, who use SAS tools to identify long-term economic trends.
All of this enables the Commission to make more data available to more people for more purposes with greater efficiency. “That opens the possibility for better analysis and improved enforcement,” says Walker.