AI Audit for Financial Services
PrivacyLabs is pleased to offer audit services specifically for the financial-services sector. As banks and financial-service providers increase their use of automation and machine learning to cut costs and remain competitive, there is an ever-increasing need to ensure the robustness and fairness of their machine-learning efforts.
Given the breadth of our services across privacy, security, technology and, in particular, artificial intelligence, we offer an unrivaled combination of expertise at a surprisingly low cost, whether you need assistance preparing for an audit or an objective third party to perform one. These services span the following deeply intertwined areas, which PrivacyLabs is uniquely qualified to deliver.
AI Audit Service Areas
Assurance of the proper design and performance of machine-learning efforts, including adherence to explainability and bias/fairness requirements.
Assurance that data collection and use comply with company policy as well as state, national and international privacy and data-protection laws.
No audit in this context is complete without careful, holistic consideration of every aspect of the security of the data, the architectures and solutions, and the machine-learning processes.
Any audit will raise many other related issues that cannot be enumerated in advance. Our teams leverage general information-governance practices to ensure a broad and thorough audit plan.
Financial Crime and Anti-Money Laundering
PrivacyLabs maintains specific talent and technology resources to address this highly regulated vertical.
Paul Starrett, founder of PrivacyLabs, works as a liaison between the non-technical and technical teams to ensure a coherent, holistic and efficient plan.
Paul has worked as general counsel and chief risk officer for an international, publicly traded information technology company. Paul has many years of experience as an information security software engineer and IT professional as well as seven years of experience in machine learning for the legal sector.
Paul’s education includes a Master of Science in Data Science (Predictive Analytics) from Northwestern University in Evanston, IL and a Master of Laws in Taxation from Golden Gate University School of Law. This combined experience gives Paul a rare blend of skills to bridge the gap between privacy and compliance professionals and the deep technical work necessary for an outstanding audit result.
Paul began his career in fraud investigations and audit, including audit management and development for Fortune 500 companies. He is a Certified Fraud Examiner and a certified computer forensics expert (OpenText’s EnCase, EnCE). This additional experience allows Paul to manage audits for financial institutions with a focus on financial crime and anti-money laundering.
Paul also works in the area of synthetic data, which can help meet compliance needs. This type of data is used to train highly performant machine-learning models while greatly assisting with explainability and privacy-regulation requirements.
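To make the synthetic-data idea concrete, here is a minimal, hedged sketch (not PrivacyLabs' actual tooling): it fits a simple Gaussian model to a toy "real" dataset and then samples statistically similar synthetic records from it, so that no real row is ever disclosed. All feature names and numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "real" dataset: two correlated numeric features
# (e.g. income in $k and credit utilisation); purely illustrative values.
real = rng.multivariate_normal(mean=[50.0, 0.4],
                               cov=[[100.0, 0.5], [0.5, 0.01]],
                               size=500)

# Fit a simple parametric model (mean and covariance) to the real data...
mu = real.mean(axis=0)
cov = np.cov(real, rowvar=False)

# ...and sample synthetic records from it. No real row is copied, but
# the marginal means and the correlation structure are preserved.
synthetic = rng.multivariate_normal(mean=mu, cov=cov, size=500)

print(synthetic.shape)
```

Real engagements would use far richer generators (and privacy guarantees such as differential privacy), but the principle is the same: train and test models on data that behaves like production data without containing it.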
Jermand Hagan, who heads audit management, is a well-rounded technology executive with 25 years of experience, poise and presence. His core specializations include leadership, technology risk, information security, cloud governance, business resiliency, audit and technology compliance.
Jermand managed regulatory interactions with the Board Committee members and senior leadership of Freddie Mac (a government-sponsored enterprise) for the Technology, Finance, Enterprise Project, and Resiliency offices during the company’s challenging time in conservatorship. During his tenure, he became the acting head of the Regulatory Affairs department, maintaining leadership continuity and encouraging change. Jermand also successfully transformed the relationship between the company’s primary regulator and management by fostering transparency, while providing strategic advice to leadership on safety and soundness matters.
Jermand also founded and led the Technology Compliance (second line of defense) function at TIAA, developing and leading a team of compliance professionals to provide a holistic second line of defense: ensuring adherence to internal policy as well as the legal and regulatory expectations of the OTS, OCC, SEC, FINRA and the insurance regulators of all 50 states. Prior to this, Jermand ran a global ethical-hack program for Citigroup covering a portfolio of systems that transferred over $400 trillion per day.
Lastly, Jermand has held a number of audit, risk and information-security positions at Coopers and Lybrand (PwC), Citigroup, Société Générale, AXA (Equitable), Johnson and Johnson, and Philip Morris, and has held a US Secret security clearance.
Gibson Martin, who coordinates the technical side of our explainability offering, is an experienced Python developer and Deep Reinforcement Learning expert.
He stays immersed in the latest machine-learning research papers and models, and has invested thousands of hours aligning tensor pipelines and training data with model frontends. Investigating model failure modes to detect and remedy architecturally imposed biases is foundational to advanced deep reinforcement learning, and that same skill underpins all technical aspects of assessing and auditing machine-learning models. This gives Gibson a uniquely relevant background in this context.
Armed with explainability libraries such as SHAP and LIME, along with his relevant domain knowledge, he brings a critical and rare talent to the team.