Beyond the hype: Belgian Financial Services Embrace AI
Size: 14.48 MB
Language: en
Added: Jun 25, 2024
Slides: 20
Slide Content
17 institutions contributed to the questionnaire; 10 in-depth interviews and 5 case studies were conducted. Together, they hold a significant share of the market*.
*70% of Retail Banking, 70% of Insurance and 60% of Wealth Management & Private Banking
Predictive AI vs. Generative AI

Key objectives
Predictive AI: Identify patterns, anomalies and correlations to predict outcomes.
Generative AI: Create new content based on learned patterns and rules.

Use cases
Predictive AI: Forecast customer behavior, manage risk, detect fraud, identify market trends.
Generative AI: Generate personalised customer interactions, financial reports, marketing copy, synthetic data.

Algorithms used
Predictive AI: Regression, time series analysis, anomaly detection, decision trees.
Generative AI: Large language models, GPT models, variational autoencoders.

Training data
Predictive AI: Historical transactions, market data, customer profiles.
Generative AI: Financial news, regulatory documents, customer feedback, product descriptions.

Limitations
Predictive AI: Requires high-quality data; model interpretability can be challenging.
Generative AI: Risk of generating inaccurate or misleading information; potential for misuse.
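To make the predictive column concrete, here is a minimal sketch of the kind of anomaly-detection workflow the comparison alludes to for fraud detection. It is illustrative only: the synthetic transaction features, the contamination rate and the choice of scikit-learn's IsolationForest are assumptions, not techniques reported by the surveyed institutions.

```python
# Minimal sketch of predictive AI for fraud detection: flag anomalous
# transactions with an Isolation Forest. Purely illustrative; the
# features and contamination rate are assumptions, not survey findings.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=42)

# Synthetic "historical transactions": amount (EUR) and hour of day.
normal = np.column_stack([
    rng.lognormal(mean=3.5, sigma=0.6, size=980),   # typical amounts
    rng.integers(8, 22, size=980),                  # daytime activity
])
suspicious = np.column_stack([
    rng.lognormal(mean=7.0, sigma=0.4, size=20),    # unusually large amounts
    rng.integers(0, 5, size=20),                    # night-time activity
])
transactions = np.vstack([normal, suspicious])

# Fit an unsupervised anomaly detector on the full transaction history.
model = IsolationForest(contamination=0.02, random_state=0)
labels = model.fit_predict(transactions)            # -1 = anomaly, 1 = normal

flagged = transactions[labels == -1]
print(f"Flagged {len(flagged)} of {len(transactions)} transactions for review")
```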
1. RETURN
2. RESOURCES
Figure: Benefits and roadblocks
Four operating models for AI: Centralised, Centralised with CoE, Coordinated, Decentralised

Resources
Centralised: All AI expertise concentrated within the central AI team.
Centralised with CoE: AI specialists report to both the CoE team and business unit leaders.
Coordinated: Business units have their own AI teams or contract external expertise.
Decentralised: Business units fully responsible for resourcing their AI projects.

Governance
Centralised: Strict central control over data, algorithms, model selection, and deployment.
Centralised with CoE: CoE sets overall AI standards and guidelines.
Coordinated: Central AI team provides tools, platforms, and best-practice guidelines.
Decentralised: Minimal central guidelines, focus on data privacy and basic ethical standards.

Project initiation
Centralised: Business units submit project proposals; central team prioritizes and executes.
Centralised with CoE: Collaboration between business units and CoE to define use cases.
Coordinated: Business units drive their own AI initiatives; a central "steering committee" ensures alignment and coordination.
Decentralised: Business units independently identify and execute their AI initiatives.

Figure: Centralised vs. decentralised reporting structures (CXO, business units, AI team; direct vs. functional reporting)
70% of employees rarely, if ever, use AI applications in their daily tasks.*
*2024 survey, 1300 respondents
3. REGULATION
Figure 15: AI Act's Risk-Based Approach (Source: European Commission, Sailpeak)
1. Unacceptable risk (e.g. social scoring): prohibited
2. High risk (e.g. credit scoring, algorithmic trading): conformity assessment
3. Limited risk (e.g. chatbots, robo-advisors): transparency obligation
4. Minimal risk (e.g. budgeting tools, fraud detection, etc.): no obligation
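As a rough illustration of how an institution might record this risk-based mapping internally, the sketch below encodes the four tiers and the example use cases from Figure 15. The register structure, names and function are hypothetical; the tier assignments simply restate the figure and are not legal guidance.

```python
# Illustrative sketch of an internal AI-use-case register keyed to the
# AI Act's four risk tiers (as summarised in Figure 15). The structure
# and names are hypothetical, not a compliance tool.
from enum import Enum

class RiskTier(Enum):
    UNACCEPTABLE = "prohibited"
    HIGH = "conformity assessment"
    LIMITED = "transparency obligation"
    MINIMAL = "no obligation"

# Example mapping of use cases to tiers, following Figure 15.
use_case_register = {
    "social scoring": RiskTier.UNACCEPTABLE,
    "credit scoring": RiskTier.HIGH,
    "algorithmic trading": RiskTier.HIGH,
    "customer chatbot": RiskTier.LIMITED,
    "robo-advisor": RiskTier.LIMITED,
    "budgeting tool": RiskTier.MINIMAL,
    "fraud detection": RiskTier.MINIMAL,
}

def obligation_for(use_case: str) -> str:
    """Return the headline obligation for a registered use case."""
    tier = use_case_register[use_case]
    return f"{use_case}: {tier.name.lower()} risk -> {tier.value}"

if __name__ == "__main__":
    for case in use_case_register:
        print(obligation_for(case))
```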
Figure 16: AI Act Timeline (Source: European Commission, Sailpeak)
June 2023: Agreement on the AI Act by the European Parliament
March 2024: AI Act adopted by the European Parliament
June 2024: Publication of the AI Act
End of 2024: Bans on certain AI practices, such as social scoring, will be enforced
Mid 2025: High-risk AI systems (e.g. credit scoring) need to comply
Mid 2026: All remaining provisions of the AI Act come into effect; time to prepare systems, processes, conformity assessments, documentation, etc.