FTF News interviews Justin Wheatley, group chief executive for StatPro, about his company winning the Best Performance Measurement System award, and some of the issues on the horizon for performance teams.
(FTF News recently got time with Justin Wheatley, group chief executive for StatPro, which won the Best Performance Measurement System honor of the FTF News Technology Innovation Awards for 2018. Wheatley’s career got its start in 1989, when he was a salesman with Micropal, an independent information provider covering mutual funds. In 1994, Wheatley founded StatPro, and he floated the business in May 2000 to expand it internationally. For this Q&A, Wheatley focuses on the reasons why Revolution has taken off among end-user firms and some of the issues ahead for performance measurement teams.)
Q: What would you say was StatPro’s biggest achievement in 2017?
A: I would say our biggest achievement during 2017 was the successful acquisition of the UBS Delta platform and starting the integration with Revolution while still delivering on our core roadmap. Adding the Delta team and their expertise across fixed-income and risk is going to help make Revolution the best multi-asset class performance and risk analytics solution on the market.
Q: Why was StatPro’s Revolution Performance designed exclusively for the cloud?
A: We realized that existing legacy technology was a barrier to performance teams.
The overnight calculation window is getting smaller, and many performance systems take several hours to calculate results. By designing a brand-new performance measurement solution as cloud-native, we’ve been able to remove those technology barriers and make our clients’ daily routine easier and simpler.
Elastic computing scalability means we can dynamically scale up our calculation engines to handle spikes in volume or urgent calculation requirements. This puts the client, not the technology, back in charge.
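To illustrate the elastic-scaling idea in rough terms, the sketch below sizes a pool of cloud calculation workers from the pending backlog rather than relying on a fixed overnight batch. It is purely illustrative: the function name, throughput figures, and limits are assumptions, not StatPro’s actual engine.

```python
# Illustrative sketch (not StatPro's implementation): choose how many cloud
# calculation engines to run based on the pending-job backlog, so a spike in
# portfolios or an urgent intraday request simply provisions more workers.

import math

def workers_needed(pending_jobs: int,
                   jobs_per_worker_per_hour: int = 500,
                   deadline_hours: float = 2.0,
                   max_workers: int = 200) -> int:
    """Return how many engines are needed to clear the backlog by the deadline.
    All parameters are hypothetical, for illustration only."""
    if pending_jobs == 0:
        return 0
    required_throughput = pending_jobs / deadline_hours            # jobs per hour
    workers = math.ceil(required_throughput / jobs_per_worker_per_hour)
    return min(workers, max_workers)                               # respect a cost ceiling

# Example: 40,000 portfolio calculations due within a two-hour window
print(workers_needed(40_000))   # -> 40 engines, scaled up only while needed
```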
Q: What are the major data issues that firms are still struggling with?
A: With daily, transactions-based performance data, firms still struggle with the sheer number of data points and transaction types that must be managed every single day, and with the high levels of accuracy required.
Some asset managers are outsourcing the performance data management process to service providers. By doing this, they can focus on the analysis and other core functions instead of dealing with data management issues.
For asset managers that still run their entire performance process internally, we’ve made it easier to manage source-data issues automatically, with over 90 configurable data controls in Revolution. These controls also cover calculated results.
By adding intelligent data automation with full audit logging, we can help analysts spend their time on more productive, value-adding tasks.
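A loose sketch of that pattern is shown below: small, configurable rules are run over incoming records, every check is written to an audit log, and analysts only review the exceptions. The record fields, rule names, and structure are assumptions for illustration, not Revolution’s API.

```python
# Illustrative sketch of configurable data controls with audit logging.
# Field names and rules are hypothetical.

import datetime as dt

def check_missing_price(rec):
    # Each control is a small, configurable rule returning (passed, reason).
    return rec.get("price") is not None, "missing price"

def check_stale_date(rec, max_age_days=3):
    age = (dt.date.today() - rec["as_of"]).days
    return age <= max_age_days, f"price is {age} days old"

CONTROLS = [check_missing_price, check_stale_date]   # dozens more in practice

def run_controls(records, controls=CONTROLS):
    audit_log, exceptions = [], []
    for rec in records:
        for control in controls:
            passed, reason = control(rec)
            audit_log.append((rec["id"], control.__name__, passed))  # full audit trail
            if not passed:
                exceptions.append((rec["id"], reason))               # analyst work queue
    return audit_log, exceptions
```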
Q: Why are some firms still using manual workarounds and manual systems?
A: This is a hangover from the legacy technology that still exists at many asset managers. Gaps in functionality are often managed with extensions or workarounds. Such on-premises workarounds are expensive to maintain and create operational risk, but some firms don’t have a choice if the underlying technology isn’t flexible enough to keep up with changing investment strategies and products.
Q: Why did StatPro acquire the regulatory risk services business of the Franco-German financial services group ODDO BHF?
A: This acquisition adds specialist risk knowledge and managed services revenue to StatPro. We believe managed services are a growth area as asset managers continue to focus on core activities and look to become more operationally efficient.
Q: Will StatPro be pursuing third-party partnerships in 2018 and beyond?
A: Partnering is something StatPro has done in the past and will continue to do. Cloud and API [application programming interface] technology is making it easier to integrate with other FinTech providers, and with asset service providers, to create innovative solutions for the market.
Our partnership with Broadridge Financial Solutions has been a good example of this during recent months.
Q: What disruptive technologies excite you and why?
A: We are excited by many aspects of new technology, especially the continued and seemingly endless innovation from the AWS [Amazon Web Services] and [Microsoft] Azure platforms.
Everyone is looking to say how they are utilizing AI and machine learning in their solutions, but much of this is fake when you dig into the details.
We have benefited from machine learning within some of our risk analysis engines, making them more efficient by chaining calculations together automatically based on the requirement. Intelligent automation can also really benefit products that work from huge amounts of source data. Ensuring data quality is high is the only way to ensure the output is accurate. This is especially important in performance measurement, where every single data point is part of the end result.
Q: What major product trends do you foresee for the rest of 2018 and beyond?
A: We see more consolidation in front-to-back office systems but also continued growth in specialist areas such as GIPS [Global Investment Performance Standards] compliance and risk compliance reporting.
We also see a growing trend and requirement for self-service analysis and reporting through flexible interfaces and APIs [application programming interfaces]. There are many consumers of portfolio analysis information, and the days of servicing these stakeholders with static PDF reports are coming to an end.
By tailoring output to the audience and tracking what is successful and popular, you can create better, more valuable output for everyone.
It’s like providing a Netflix-style service for stakeholders and clients versus blindly pushing the same old PDF reports. We think this trend will continue as the middle office and service providers alike look to add more value and increase service levels.
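As a rough illustration of the self-service model, the sketch below pulls attribution figures over a REST API so each stakeholder can query exactly the slice they need. The endpoint, parameters, and token handling are hypothetical placeholders, not a real StatPro interface.

```python
# Illustrative sketch of API-based, self-service reporting.
# The base URL and parameters are placeholders, not a real service.

import requests

BASE_URL = "https://analytics.example.com/api/v1"   # hypothetical endpoint

def fetch_attribution(portfolio_id: str, period: str, token: str) -> dict:
    """Request attribution results for one portfolio and period on demand."""
    resp = requests.get(
        f"{BASE_URL}/portfolios/{portfolio_id}/attribution",
        params={"period": period, "groupBy": "sector"},
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# Example: a client-reporting team pulls Q1 sector attribution when needed,
# instead of waiting for a static PDF.
# data = fetch_attribution("GLOBAL-EQ-01", "2018-Q1", token="...")
```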