Fourth annual data quality & management conference for financial institutions in APAC

With the rising complexity of managing massive volumes of data in financial institutions, Marcus Evans will be organizing the 4th Annual Data Quality & Management in Financial Institutions 2020: Optimise, Integrate & Advance Conference, 16th – 18th March 2020 (Mon – Wed), in Singapore. More than 10 world-class speakers will share their insights on how to prepare data for current, new, and upcoming environments without compromising on quality and consistency.

Some of the world-class speakers include Giancarlo Pellizzari – Head of Banking Supervision Data Division of European Central Bank, Germany; Tom Cronin – Head of Data Science & Data Engineering of Lloyds Banking Group, United Kingdom; Rishiraj Singh – Head of Enterprise Data Governance & Data Quality of United Overseas Bank, Singapore; and Vinay Sharma – Senior Vice President, Personalisation & Analytics – CBG (APAC) of DBS Bank, Singapore, among others.

“In my view the importance of Data Privacy should not be measured in terms of regulatory compliance, but as a fundamental requirement to do business. Why do I say that? Customers place their money with a financial institution based on Trust. Trust that their money will be safe with the institution, and Trust that the institution will have their best interest at heart in every aspect of its dealings with the Customer. In today’s connected world, keeping ‘information’ safe has become a pre-requisite to keeping our ‘money’ safe. So addressing the data privacy and security concerns of our customers (whether there are regulations or not) is critical to winning and retaining customer trust,” said Senthil Pasupathy, Head of Enterprise Data Models of Standard Chartered Bank.

“Data Quality must be controlled at the source and managed tightly throughout the life cycle of the data flow. Establishing a comprehensive Data Quality Management Framework that cuts across the organisation and encompasses business, technology and processes is the first step. It needs to be backed by strong buy-in from data owners and data consumers on managing, controlling, measuring and reporting data quality issues. System-level controls only work for new data being captured in the system, so there needs to be a clear approach and governance on handling data quality issues in legacy data as well. In case the issue cannot be handled at the source, the reporting process needs to have adequate controls and automated processes to manage the quality issues prior to reporting,” Senthil further explained.
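
To illustrate the “control at the source” principle Senthil describes, the following is a minimal Python sketch of how capture-time validation rules might quarantine failing records for remediation before they flow into downstream reporting. The field names and rules are hypothetical examples, not drawn from any framework presented at the conference.

from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class QualityRule:
    name: str
    check: Callable[[dict], bool]  # returns True when the record passes the rule

@dataclass
class QualityReport:
    passed: List[dict] = field(default_factory=list)
    failed: List[tuple] = field(default_factory=list)  # (record, failed rule names)

# Hypothetical rules that data owners and data consumers might agree on at capture time.
RULES = [
    QualityRule("customer_id present", lambda r: bool(r.get("customer_id"))),
    QualityRule("balance is numeric", lambda r: isinstance(r.get("balance"), (int, float))),
    QualityRule("country code is 2 letters", lambda r: len(str(r.get("country", ""))) == 2),
]

def validate_at_source(records: List[dict], rules: List[QualityRule]) -> QualityReport:
    """Apply controls when data enters the system, before it flows downstream."""
    report = QualityReport()
    for record in records:
        failures = [rule.name for rule in rules if not rule.check(record)]
        if failures:
            report.failed.append((record, failures))  # quarantine for remediation
        else:
            report.passed.append(record)
    return report

if __name__ == "__main__":
    incoming = [
        {"customer_id": "C001", "balance": 1000.0, "country": "SG"},
        {"customer_id": "", "balance": "n/a", "country": "Singapore"},
    ]
    result = validate_at_source(incoming, RULES)
    print(f"{len(result.passed)} passed, {len(result.failed)} quarantined for remediation")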

“Always work with business requirements and understand their roadmap. Work in an agile approach, identifying critical data elements as the first step. Once those critical elements are agreed with the business, you can scope out all the relevant information you need to focus on for data quality, instead of boiling the ocean,” added Melecio Valerio, Head of Enterprise Data Management of FWD Life Insurance.
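
As a loose illustration of Valerio’s point about scoping rather than “boiling the ocean”, the short Python sketch below restricts quality profiling to an agreed list of critical data elements; the element names are hypothetical examples, not an actual business-agreed list.

# Hypothetical critical data elements (CDEs) agreed with the business.
CRITICAL_DATA_ELEMENTS = {"customer_id", "account_status", "balance"}

def scope_for_quality(record: dict) -> dict:
    """Keep only the CDEs so quality profiling focuses on what the business agreed matters."""
    return {k: v for k, v in record.items() if k in CRITICAL_DATA_ELEMENTS}

raw = {"customer_id": "C001", "balance": 250.0, "marketing_segment": "A", "last_login": "2020-03-01"}
print(scope_for_quality(raw))  # -> {'customer_id': 'C001', 'balance': 250.0}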

This conference also features a full-day post-conference workshop, themed “Designing a Robust Data Quality Framework that Drives Real Commercial Value”, which helps participants explore how to structure a robust data quality framework that delivers high commercial value.
