Cristina Pullen, J.D. Class of 2028
We’ve had credit reports since the 1800s. Only back then, agencies employed a combination of espionage and town gossip to gather information worth selling about a person. Highly subjective and financially lucrative, the work aimed to determine one thing: whether, and to what extent, that person was trustworthy.
Today, these agencies, formally known as “credit reporting agencies” (CRAs), play a role in most of the important contracts you make in your life, such as those related to housing, leasing, employment, loans, insurance, and credit cards. Unlike in the 1800s, we now have the Fair Credit Reporting Act (FCRA) to make sure that modern “espionage” is done more or less fairly…or at least, accurately. But as technology becomes more and more ingrained in this already imperfect system, furnishers and CRAs run a higher risk of breaking the law, leaving consumers to bear the brunt of the consequences.
FCRA and the Accuracy Requirement
Under the FCRA, CRAs must “follow reasonable procedures to assure maximum possible accuracy of the information concerning the individual about whom the report relates.” Furnishers of information, in turn, have a responsibility “not [to] furnish information relating to a consumer to any consumer reporting agency if the [furnisher] knows or has reasonable cause to believe that the information is inaccurate.” Further, 15 U.S.C. § 1681s-2(a)(2)(B) imposes on furnishers a duty to report corrections of incomplete or inaccurate information to the CRAs to which they furnish it.
It’s also important to note that neither a CRA nor a furnisher has a legal obligation to simply take a consumer’s word for it. A consumer must share proof, such as pay stubs, receipts, prior records of communication, or other documents, to show that the mistake is real. The FCRA then requires a 30-day turnaround: the CRA must investigate and either make the correction or notify the consumer as to why it has ended the investigation.
This policy assumes that a consumer has an incentive to improve their credit reports, and that a truthful consumer would have the necessary proof to counter any inaccuracy. Unfortunately, document collection can be a high hurdle for consumers who pay in cash, are robbed, or have lost or lack records for whatever reason. Moreover, CRAs and furnishers commonly dismiss disputes as “frivolous” or “irrelevant” unless they are meticulously and professionally crafted, which puts low-income consumers, or those with limited formal education, at a serious disadvantage.
A Credit Profile’s Tech Touchpoints
Data is fundamental to creating consumer profiles. In theory, the more data a CRA has, the more representative its reports and scores will be. With the rise of AI and automation, technology is now involved in most—if not all—steps of the data collection and analysis process. This makes sense; a lot of data goes into a credit profile.
Do you pay your bills on time? How many credit cards do you have open, how much money have you borrowed, do you own or rent a home, or are you employed? These questions now have real-time answers, gathered from places such as mobile banking, websites, or call centers.
While furnishers are tasked with collecting and submitting consumer information, the CRA’s role is to make sense of that data. After a furnisher submits the information to a CRA, the CRA runs it through proprietary algorithms that categorize the data in the ways that show up in a profile.
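Because the real formulas are proprietary, we can only illustrate the general shape of that step. A minimal sketch, with entirely hypothetical field names and rules, of how furnished data might be mapped into profile categories:

```python
# Illustrative only: a hypothetical CRA pipeline step that maps raw
# furnished data into the categories that appear on a profile.
# Real CRA formulas are proprietary; every rule here is an assumption.
def categorize(submission: dict) -> dict:
    return {
        # Any missed payment flips the payment-history category.
        "payment_history": "on_time" if submission["missed_payments"] == 0 else "delinquent",
        # Count of open tradelines reported by the furnisher.
        "open_accounts": len(submission["accounts"]),
        # Housing status as reported.
        "housing": "owner" if submission["owns_home"] else "renter",
    }

profile = categorize({"missed_payments": 0,
                      "accounts": ["visa", "auto_loan"],
                      "owns_home": False})
print(profile)
```

The point of the sketch is only that categorization is deterministic code operating on whatever the furnisher sent; if the input is wrong, the profile is wrong.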
Where Technology Exacerbates Inaccuracy
Without proper governance and tech oversight, it’s possible for technology to create a greater risk of non-compliance with the FCRA. Non-compliance can lead to serious financial consequences for consumers, costly lawsuits, and a big headache for all parties involved:
- An automation-loop nightmare. While the FCRA requires a meaningful investigation into disputes, automated systems can miss vital steps in the process. These misses can send consumers into a loop, effectively bypassing human review and keeping inaccurate information in the system longer.
In a recent case, a nurse tried to dispute inaccurate information tying her to a different person entirely. After receiving the same letter seven times, asking for more and more information to “prove” her case, she sued a major CRA and was awarded $18.4 million in punitive damages.
- Missed edge cases. One of the biggest draws—and challenges—of technology is its ability to automate repetitive tasks. But the technology is often trained on problem sets that reflect common issues. Many CRAs and furnishers are unlikely to learn about uncommon or emerging issues until an algorithm has made the same mistake enough times to be noticed. By then, the damage is done.
- AI hallucination. Whenever AI is used to handle sensitive data, there’s always the chance of the AI going rogue and generating information that “seems correct but isn’t grounded in actual data.” In context, this may look like falsifying missed payments, triggering account matches between people with similar names, or even responding to a consumer directly with made-up information.
- Big-data black boxes. According to the World Bank, “big data acquisition and processing, use of cloud computing, reliance on third-party vendors and the black boxes associated with AI/ML systems can be contrary to regulations.” With a massive influx of data, and without transparency, it will be all the more difficult to know where or how to improve these high-tech processing systems to maintain consumer data integrity.
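The automation-loop failure described in the first bullet has a simple structural guard: cap the number of automated responses a single dispute can receive before a person must look at it. A minimal sketch, with a hypothetical threshold and field names:

```python
from dataclasses import dataclass

MAX_AUTOMATED_ROUNDS = 2  # hypothetical policy threshold, not an FCRA rule


@dataclass
class Dispute:
    consumer_id: str
    automated_rounds: int = 0  # how many form letters have gone out so far


def handle_dispute(dispute: Dispute) -> str:
    """Route a dispute: automate the first rounds, but force human
    review once the system has already responded too many times."""
    if dispute.automated_rounds >= MAX_AUTOMATED_ROUNDS:
        return "human_review"  # break the loop: a person must look
    dispute.automated_rounds += 1
    return "automated_response"


d = Dispute(consumer_id="C-001")
print(handle_dispute(d))  # automated_response
print(handle_dispute(d))  # automated_response
print(handle_dispute(d))  # human_review
```

Had a rule like this existed, the nurse in the case above would have reached a human after two form letters instead of seven.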
The Case for Technology in Credit Reporting
Despite the risks, investment in technology isn’t slowing down any time soon; falling behind is too costly for furnishers and CRAs that want to stay competitive. But hope is not lost for those that want to reap the benefits of tech innovation without disrupting compliance with the FCRA, and more importantly, without losing consumer trust.
If adopted with practical governance in place, technology could greatly benefit everyone. For example, if AI and automation serve as the first step in resolving consumer disputes, many cases can be resolved much more quickly. A human representative is likely needed only when a consumer has an edge case or needs to escalate an issue.
Further, AI can serve as a front-desk secretary, initially guiding consumers through a dispute with instructions on what’s needed for a valid report and reducing back-and-forth between the consumer and human representatives at the furnisher or CRA. AI can triage requests, draft responses, scan relevant documentation, and flag items for review.
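The front-desk role boils down to checking a submission for completeness before anyone spends time on it. A minimal sketch, assuming a hypothetical document checklist (the FCRA does not prescribe one):

```python
# Hypothetical intake triage: check a dispute for the documents the
# review process will need before a human representative is involved.
# The required-document list is an assumed internal policy, not law.
REQUIRED_DOCS = {"identity_proof", "supporting_record"}


def triage(dispute: dict) -> dict:
    """Return a routing decision and, if incomplete, exactly what is missing."""
    docs = set(dispute.get("documents", []))
    missing = REQUIRED_DOCS - docs
    if missing:
        # Tell the consumer everything still needed, in one message,
        # instead of one new request per letter.
        return {"status": "needs_documents", "missing": sorted(missing)}
    # Complete submissions are queued with a draft response for human review.
    return {"status": "queued_for_review", "missing": []}


print(triage({"documents": ["identity_proof"]}))
```

Listing every missing item at once is the design choice that matters: it is the opposite of the “more and more information” drip that trapped the consumer in the case above.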
Finally, AI can be a powerful tool for improving data accuracy, as long as there’s a human on the other end. For example, furnishers can scan data for accuracy before sharing any of it with a CRA. In turn, CRAs can use AI to flag major changes in a person’s report so a human can verify the result. AI can also serve as a clean-up crew, embedded specifically to find outdated information and suggest deletions, tweaks, or general maintenance steps based on data patterns.
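The flag-and-verify idea can be sketched without any machine learning at all: compare an incoming furnisher submission against the last accepted one and surface any field that swung sharply. The threshold and field names here are hypothetical:

```python
# Hypothetical accuracy check: flag any numeric attribute of a report
# that changed sharply since the last accepted submission, so a human
# can verify it before the update is published.
CHANGE_THRESHOLD = 0.5  # assumed policy: flag swings greater than 50%


def flag_major_changes(previous: dict, incoming: dict) -> list[str]:
    """Return the fields whose relative change exceeds the threshold."""
    flagged = []
    for field, old in previous.items():
        new = incoming.get(field, old)  # unchanged if not resubmitted
        if old and abs(new - old) / abs(old) > CHANGE_THRESHOLD:
            flagged.append(field)
    return flagged


prev = {"balance": 1000, "open_accounts": 4}
new = {"balance": 5000, "open_accounts": 4}
print(flag_major_changes(prev, new))  # ['balance']
```

A fivefold jump in a reported balance gets held for human verification; routine, small movements flow through untouched, which is exactly the division of labor the paragraph above describes.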