Robust Data Management Required for Big Data
15 Oct
Big Data may not be living up to its hype among large asset managers, who seem unwilling to explore its use for more esoteric and less established analysis techniques, according to a panel at the Buy-Side Technology North American Summit in New York on Thursday, Oct. 10.
“For the most part, we’re not doing the kind of Big Data [analysis] that we’re talking about Google and others doing,” said Andrew Eiler, director of back-office technology at Highbridge Capital Management, a New York-based asset manager with $29 billion in assets under management. His comment prompted moderator Dennis Gonzalez, senior consultant for hedge funds front-office investment technology at Deutsche Asset Management, to suggest that other industries are outpacing the financial markets in how they exploit data.
However, Mihir Shah, chief technology officer at Fidelity Investments, suggested that firms’ reluctance may simply be a reflection of unsuitable propositions. “I tell people, ‘Go away and give me a use case.’ And nine out of 10 come back and say it will allow them to do sentiment analysis, and I tell them we’re not going to do sentiment analysis on Twitter,” Shah said. “What you can do [with Big Data] is simulate the impact of different market conditions, such as inflation rate changes, on a portfolio of bonds. We used to run 20 variations and scenarios. Now we can do a thousand variations in a million scenarios, to give more than a trillion outcomes, and we can do that in 10 minutes.”
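The kind of scenario run Shah describes, applying many rate variations to a bond portfolio at once, can be sketched as a vectorized Monte Carlo pass. Everything below (the five-bond portfolio, the 75-basis-point shift volatility, the duration/convexity approximation) is an illustrative assumption, not Fidelity's actual setup:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Hypothetical portfolio: face values, durations and convexities of 5 bonds.
face = np.array([100.0, 250.0, 150.0, 300.0, 200.0])   # $ millions
duration = np.array([2.1, 4.8, 7.3, 10.2, 14.6])       # modified duration
convexity = np.array([0.06, 0.28, 0.62, 1.15, 2.40])

# Simulate many rate scenarios as parallel yield shifts in basis points.
n_scenarios = 100_000
shift_bp = rng.normal(loc=0.0, scale=75.0, size=n_scenarios)  # assumed 75bp vol
dy = shift_bp / 10_000.0                                      # shift as a decimal

# Second-order duration/convexity approximation of the price change per bond,
# aggregated to a portfolio P&L per scenario (vectorized over all scenarios).
pnl = (-np.outer(dy, duration * face)
       + 0.5 * np.outer(dy ** 2, convexity * face)).sum(axis=1)

print(f"mean P&L: {pnl.mean():.2f}m, 1% worst case: {np.percentile(pnl, 1):.2f}m")
```

The scale point in the quote falls out of the vectorization: once the shock vector is just another array axis, growing from 20 hand-built scenarios to hundreds of thousands is a change of one constant rather than a change of process.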
In fact, firms may still be struggling to put in place the underlying data management practices required for any Big Data analysis, as well as for good data governance.
“Most companies don’t even have their structured data in order before talking about jumping into Big Data,” according to Shah, who also said that integration of non-real-time data should not present big challenges for firms.
“Asset management data hasn’t changed much…. While the business and opportunities might be complex, data isn’t; there are only 13 or 14 categories of data,” he said, adding that the process can become challenging when trying to migrate away from older infrastructures and applications. “You need to separate data from its applications; then you can do more with it…. You can plug in any engine on top of your data, if the data is exposed.”
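Shah's separation of data from applications can be illustrated with a minimal sketch: positions held in a plain, engine-neutral structure, with interchangeable analytics "engines" plugged in on top. The position records and engine functions here are hypothetical, purely to show the decoupling:

```python
# Engine-neutral data: plain records with no coupling to any application.
positions = [
    {"symbol": "UST10Y", "quantity": 500, "price": 98.4},
    {"symbol": "CORP5Y", "quantity": 300, "price": 101.2},
]

def market_value_engine(data):
    """Total portfolio market value."""
    return sum(p["quantity"] * p["price"] for p in data)

def concentration_engine(data):
    """Share of market value per position."""
    total = sum(p["quantity"] * p["price"] for p in data)
    return {p["symbol"]: p["quantity"] * p["price"] / total for p in data}

# Because the data is exposed independently of any one application,
# any engine can be plugged in on top of it without migration work.
for engine in (market_value_engine, concentration_engine):
    print(engine.__name__, engine(positions))
```

The design choice is the one Shah names: the positions know nothing about the engines, so retiring an old application means swapping a function, not re-platforming the data.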
Another challenge can be the constantly changing data demands of new regulatory requirements, which require flexible technologies, said Alex Tsigutkin, chief executive of AxiomSL.
“The regulatory reporting area is unique, especially if you are operating globally. Every country has different requirements for timing and filings… so you need a specific rules-based system, not just data, so that you can react to changes in the rules, or trigger actions off filings,” Shah added.
However, new regulations can present opportunities around data, as well as challenges. “We’ve had a data warehouse strategy for some time… and felt we had a pretty good solution, but with Form PF [and other new rules], we realized we only had about 80 percent of the data that regulations required, because others hadn’t collected it in the past,” Eiler said. “But in some ways, that’s a good thing, because it forced us… to readdress some more detailed risk analytics and aggregate investor data.”