A lot has been said about machine learning and artificial intelligence (AI) and the sweet rewards they will bring to the mortgage industry, but the real potential can be tricky to navigate. Before we sort the tricks from the treats in a few popular urban legends, here are some quick definitions of AI and machine learning for context.
AI is a catch-all term for any technology capable of applying knowledge to solve complex problems and find optimal solutions for a specific task or a range of tasks. In practice it generally refers to a set of technologies that analyze data and identify patterns in order to make a decision and effect an outcome. One example could be using AI reasoning to make a credit decision.
Machine learning, on the other hand, is a subset of AI. It is a system that is “trained” to complete a task on its own based on large datasets, human instruction and self-learning algorithms that recognize patterns. It allows systems to learn and improve as new information becomes available, without specific programming instructions. The best illustration of this is loan document classification, where exposure to many examples of the same document type steadily improves the system’s ability to recognize that document automatically.
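To make the idea concrete, here is a minimal, hypothetical sketch of document-type classification in Python; the sample snippets, labels and scikit-learn pipeline are illustrative assumptions, not a description of any particular vendor’s system.

```python
# Minimal sketch of document-type classification with scikit-learn.
# The sample texts and labels are made-up placeholders, not real loan documents.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training set: text snippets paired with a document-type label.
texts = [
    "borrower name loan amount interest rate promise to pay",
    "annual wages federal income tax withheld employer identification number",
    "hazard insurance policy coverage premium effective date",
    "gross pay year to date deductions net pay pay period",
]
labels = ["promissory_note", "w2", "insurance_policy", "pay_stub"]

# TF-IDF turns raw text into numeric features; logistic regression learns
# which terms distinguish one document type from another.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# Classify a new, unseen snippet; expected to print ['w2'] given the toy data.
print(model.predict(["employer identification number and wages withheld"]))
```

In a real deployment the training set would be thousands of labeled documents and the features far richer, but the principle is the same: more labeled examples generally mean better automatic recognition.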
With that in mind, let’s assess tricks and treats related to the deployment of AI and machine learning in mortgage lending.
Trick or Treat: Mortgage lending operations are already using AI and machine learning to reduce their costs and pave the way to the digital mortgage.
Treat! These emerging technologies are already treating mortgage lenders to a wealth of operational efficiency by reducing the error-prone data entry tasks that dominate document processing. Automation surfaces data inconsistencies more accurately, so that more complex decisioning can be built on top of that clean data.
It is the latter where we, as an industry, are just scratching the surface. To begin to realize a significant return on investment, mortgage data needs to be shared and validated in real time across the point of sale (POS), the back office and the closing process. This will move loan quality management to the forefront, allow large data sets to be analyzed sooner, automate decisioning and continue to eliminate a wide range of manual tasks now done by human staff. Document processing is a solid first step toward fully realizing the digital mortgage.
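As a small, hypothetical illustration of what real-time validation can look like, the sketch below cross-checks a single field across three systems; the field name, system labels and tolerance are assumptions for illustration only.

```python
# Minimal sketch of cross-checking one data point across origination systems.
def validate_loan_amount(pos_value, back_office_value, closing_value, tolerance=0.01):
    """Return a list of discrepancy messages when the systems disagree."""
    issues = []
    if abs(pos_value - back_office_value) > tolerance:
        issues.append(f"POS vs. back office mismatch: {pos_value} vs. {back_office_value}")
    if abs(back_office_value - closing_value) > tolerance:
        issues.append(f"Back office vs. closing mismatch: {back_office_value} vs. {closing_value}")
    return issues

# A mismatch like this would be flagged for review before closing.
print(validate_loan_amount(250000.00, 250000.00, 249500.00))
```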
Trick or Treat: AI is a smart “go it alone” activity for mortgage lenders.
Very much a Trick. AI skills are not likely to be found inside most mortgage lenders. AI tool providers, data providers and data scientists should be assembled as part of the team, with lenders orchestrating the engagement and managing the contracts.
But you also might find that vendors you use today already employ AI and machine learning in the services they deliver to you. They may be tapping into the technologies and expertise that exist inside and outside the mortgage industry to enable industry-specific applications. Those with deep industry knowledge and access to rich “training datasets” may offer you a quicker path to using AI and machine learning in your business.
Trick or Treat: Use of AI in lending will lead to regulatory challenges for lenders.
This one is definitely tricky, as it likely will, but questions remain about how and for whom. The American Bankers Association posted an article back in April of this year discussing who, in this AI world, now owns the model risk. The net-net from that article, and from one of the industry’s primary pieces of guidance on model risk, the Federal Reserve’s eight-year-old SR 11-7, is that financial institutions must pay close attention to model development, validation, use and ongoing controls, tasks that become even more difficult and challenging with AI. For now, regulation of AI is a bit murky, but getting ahead of any adverse impact or litigation is wise so you don’t get bit!
Trick or Treat: AI solves the “fair lending problem” because computers can’t be prejudiced.
Don’t be fooled into thinking this will solve the problem. You can prejudice an AI application through data selection, data curation and the algorithms deployed. Taking it one step further, there is also the risk that the human programmers behind these intelligent tools introduce their own personal bias.
Because “fair lending” issues stem from intentional discrimination as well as disparate impact, both should be taken into consideration when designing AI applications that automate your lending practices. Checks and balances should be put in place to closely monitor and evaluate lending trends and patterns, and to adjust AI learning algorithms, to avoid regulatory consequences; one simple example of such a check is sketched below.
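As one hypothetical illustration of such a check, not a compliance standard, the sketch below computes approval rates by applicant group and compares them against a reference group using the commonly cited four-fifths benchmark; the group labels, decisions and threshold are assumptions for illustration only.

```python
# Minimal sketch of a disparate-impact check on approval outcomes.
# The groups and decisions below are made-up illustrations.
from collections import defaultdict

def approval_rates(decisions):
    """decisions: list of (group, approved) pairs -> approval rate per group."""
    totals, approvals = defaultdict(int), defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        approvals[group] += int(approved)
    return {g: approvals[g] / totals[g] for g in totals}

def adverse_impact_ratio(rates, reference_group):
    """Ratio of each group's approval rate to the reference group's rate.
    A commonly cited informal benchmark is the four-fifths rule:
    ratios below 0.8 warrant a closer look."""
    ref = rates[reference_group]
    return {g: r / ref for g, r in rates.items()}

decisions = [("A", True), ("A", True), ("A", False),
             ("B", True), ("B", False), ("B", False)]
rates = approval_rates(decisions)
print(adverse_impact_ratio(rates, reference_group="A"))  # B's ratio of 0.5 would be flagged
```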
Understanding AI and machine learning may be scary to some, but those who continue to do their research and work with others exploring these emerging technologies will ultimately be treated to successful outcomes.