Neil Smyth, marketing and technology director at StatPro, tells FTF News that 2016 was a turning point for the acceptance of cloud computing. He recently spoke to FTF News about StatPro Revolution winning the Best Performance and Attribution System award via the 2017 FTF Technology Innovation Awards.
(Smyth oversees global marketing at StatPro along with group IT functions, including on-premise and cloud infrastructure, security, and information-security compliance. He started the research project that became StatPro Revolution in 2008, officials say, and led an initial group of architects and developers to produce the prototypes and betas that facilitated the offering’s commercial launch in March 2011. He also works with product managers, developers and sales to formulate StatPro’s marketing strategy.)
Q: In your opinion, was 2016 a turning point for the acceptance of cloud computing not only for StatPro’s offerings but in general?
A: We’ve seen attitudes toward cloud computing change a lot over the last few years. Mainstream players such as Amazon, Microsoft and Google have really geared up their enterprise features, enabling vendors such as StatPro to produce applications that are far superior to on-premise legacy software in performance, resiliency, flexibility and security.
We also saw industry regulators, including the Monetary Authority of Singapore, welcoming cloud computing as a way for financial services firms to realize its many benefits.
Q: Along those lines, were there any performance measurement developments in 2016 that StatPro anticipated and benefited from?
A: Performance measurement as a discipline is a very well understood part of the asset management middle office, but developments do continue.
StatPro spent years designing and developing Revolution Performance to help our clients create a Performance Book of Record from a single system, covering all asset classes and handling any data volume.
The increase in data volume, in particular, is something we saw in 2016, and it continues as regulators and clients want more granular data and sales teams want detailed analysis to help explain the investment strategy.
The combination of performance and risk is also a continuing theme within the industry, and it makes a lot of sense.
Using the same data set and platform to explain both performance and risk can help firms reduce the number of systems they have to maintain, while also reducing operational risk, the reliance on manual processes, and data management headaches.
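To make that idea concrete, here is a minimal sketch of deriving both a performance figure and a risk figure from one shared series of portfolio returns; the function and the sample numbers are illustrative assumptions, not StatPro’s implementation:

```python
# Illustrative sketch: one shared data set (daily returns) feeds both
# a performance measure and a risk measure. Not StatPro's code.
import math

def performance_and_risk(daily_returns, periods_per_year=252):
    """Return (cumulative return, annualized volatility) from one series."""
    cumulative = 1.0
    for r in daily_returns:
        cumulative *= (1.0 + r)          # compound each day's return
    cumulative_return = cumulative - 1.0

    mean = sum(daily_returns) / len(daily_returns)
    variance = sum((r - mean) ** 2 for r in daily_returns) / (len(daily_returns) - 1)
    annualized_vol = math.sqrt(variance) * math.sqrt(periods_per_year)
    return cumulative_return, annualized_vol

# A hypothetical week of returns: both measures come from the same input,
# so there is no second system to reconcile against.
perf, vol = performance_and_risk([0.004, -0.002, 0.003, 0.001, -0.001])
print(f"cumulative return: {perf:.4%}, annualized volatility: {vol:.4%}")
```

Because both numbers come from the same input, there is no cross-system reconciliation step, which is the operational saving Smyth describes.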
Q: Looking ahead, what regulatory initiatives will have the greatest impact upon performance measurement practices in 2017?
A: Regulation continues to play a major part in our clients’ portfolio analytics strategy and also at a wider business level.
Costs continue to rise, and fees are under tremendous pressure with the rise of so many passive investment products. All of this has been leading our clients to look at their existing application environments and the way they manage technology.
Bringing data together from separate systems using outdated data management processes makes it difficult and expensive to produce output for regulators.
The cloud brings opportunities to consolidate sources of data while using Web APIs [application programming interfaces] to link back to on-premise systems. Cloud-based systems also make it easier to securely share information online with stakeholders such as regulators.
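As a rough illustration of that linkage pattern, an on-premise process might pull consolidated analytics from a cloud service with a single authenticated HTTP call; the endpoint, token, and field names below are hypothetical, not StatPro’s actual API:

```python
# Hypothetical sketch of an on-premise system calling a cloud analytics
# Web API; the endpoint and response fields are invented for illustration.
import requests

CLOUD_API = "https://analytics.example.com/api/v1"   # placeholder base URL
API_TOKEN = "..."   # credential issued by the cloud provider

def fetch_portfolio_analytics(portfolio_id: str) -> dict:
    """Fetch consolidated performance and risk figures for one portfolio."""
    response = requests.get(
        f"{CLOUD_API}/portfolios/{portfolio_id}/analytics",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()   # fail loudly rather than ingest bad data
    return response.json()

# The on-premise side can reconcile these figures against its local
# book of record before sharing output with regulators.
analytics = fetch_portfolio_analytics("PF-1001")
print(analytics.get("cumulative_return"), analytics.get("annualized_vol"))
```

The same API surface can then serve regulators or other stakeholders directly, which is what makes secure online sharing straightforward.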
Q: How has the global push for more regulation impacted the status of performance measurement staffs within a firm?
A: Performance measurement teams are already under stress servicing so many internal and external stakeholders.
An increase in demand from regulators means performance measurement teams need to be very efficient, and they can only do this with the right tools. Performance measurement today needs accuracy, automation, scalability and speed, qualities that legacy technology cannot deliver.
Performance measurement staff are spending too much time on data management issues, ad hoc requests, and the manual processes involved in moving data from one silo to another.
This wasted effort reduces the time they have for value-added tasks such as the performance and attribution analysis itself and working with the front office and sales teams. Middle office teams are a crucial and central part of a successful asset manager, but they need a new generation of tools to deliver.