5 Years of GDPR | The Right to Privacy in Credit Scoring
As technological advancements reshape the financial industry, the privacy issues they raise often come as an afterthought rather than forming an integral part of product development.
In the case of credit scoring, credit agencies and other credit scoring firms gather large amounts of data in order to produce creditworthiness scores. In this process, certain privacy risks are taken, often unknowingly. In this article, we highlight some of the risks credit agencies take when conducting credit risk assessments, the effects these risks have on the financial industry, and how new technology should be supported to ensure sustainable privacy.
The right to privacy is being challenged
Credit risk assessments play a significant role within the financial sector, and the technological advancements that make these assessments more efficient often overlook privacy issues. One way the credit risk assessment process has advanced is through so-called open banking. In short, open banking allows third-party financial service providers to access account information, transaction data, and other types of financial data from banks and other financial institutions. Apart from the data accessed through the open banking system, third-party companies may also gather publicly available information to evaluate a person’s creditworthiness. This information is then typically analysed by an algorithm which produces a score based on certain factors. How the data is weighted and which factors play a role in the calculation of the credit score is often not revealed, meaning there is a lack of transparency in the scoring process, i.e. a black box problem.
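To make the black box concern concrete, a creditworthiness score is typically some weighted combination of factors derived from account, transaction, and public data. The sketch below is a minimal, purely hypothetical model; the factor names, weights, and scale are our assumptions for illustration, not any agency’s actual method:

```python
# Purely hypothetical scoring model -- factor names and weights are
# illustrative assumptions, not any agency's disclosed method.

FACTOR_WEIGHTS = {
    "payment_history": 0.40,    # share of on-time payments, 0..1
    "debt_ratio": -0.30,        # outstanding debt relative to income, 0..1
    "account_age": 0.15,        # normalised age of oldest account, 0..1
    "recent_inquiries": -0.15,  # normalised count of recent applications, 0..1
}

def credit_score(factors: dict[str, float]) -> float:
    """Weighted sum of normalised factors, rescaled to a 0..1000 score."""
    raw = sum(weight * factors.get(name, 0.0)
              for name, weight in FACTOR_WEIGHTS.items())
    # raw lies in [-0.45, 0.55]; shift and scale it into [0, 1000].
    return (raw + 0.45) * 1000

print(credit_score({"payment_history": 0.95, "debt_ratio": 0.30,
                    "account_age": 0.60, "recent_inquiries": 0.10}))  # ≈ 815
```

Technically there is nothing remarkable here; the privacy problem is that data subjects typically never see the factor list or the weights, which is exactly the transparency gap at issue in the cases below.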
These new processes also raise an issue of accuracy in the scores being produced. Although credit risk assessments have become more efficient, there is a risk that the data being processed is incorrect, taken out of context, or biased. This, combined with the lack of transparency in how the data is weighted, could result in distrust among customers and in inaccurate decisions being made on the basis of the creditworthiness scores.
Recent cases illustrate the problem
Credit algorithms and the GDPR
Schufa Holding AG, a German credit agency, denied a data subject access to the specific information used in the calculation of their credit score. The credit score created by Schufa had been used by a financial institution as a basis for denying the data subject a loan. According to Schufa, information about the specific data or methods used in the scoring process could not be disclosed, as these were deemed trade secrets. The data subject consequently lodged a complaint with the Hessian Data Protection Authority (HBDI). However, the HBDI declined to take action against Schufa, holding that the company was compliant with the German Federal Data Protection Act. The data subject then filed a lawsuit with the Administrative Court of Wiesbaden, which led to a request to the Court of Justice of the European Union (CJEU) for a preliminary ruling on whether the automated creation of the credit score constitutes an automated decision within the meaning of the GDPR.
In the opinion of the Advocate General (AG), since Schufa’s credit score is calculated automatically using an algorithm, its creation constitutes automated decision-making, specifically in the form of profiling within the meaning of the GDPR. Although it was the financial institution that made the decision to reject the data subject’s loan application, it is important to keep in mind that in the area of consumer loans, credit scores play a decisive role in financial institutions’ decision-making. If the court were to share the AG’s view, credit scoring by Schufa, as well as by other credit agencies, would only be permissible where a legal basis under Art. 22 GDPR applied. This would also strengthen the right of access to information in these cases, meaning that Schufa would have to provide detailed, meaningful information on the scoring methods used.
noyb complaint against the credit ranking agency CRIF GmbH and the address publisher AZ Direct
It was revealed that the credit ranking agency CRIF GmbH had, in addition to open banking data, been collecting the addresses, dates of birth, and names of a majority of Austrian citizens in order to calculate creditworthiness. The data was collected through the address publisher AZ Direct, which is only allowed to pass on such data for marketing purposes, not for credit scoring. The privacy organisation None of Your Business (noyb) filed complaints against both CRIF and AZ Direct over the companies’ handling of the shared data and their lack of transparency. The data had been collected, and the credit scores subsequently calculated, without consent, with CRIF arguing that the assessment of creditworthiness falls within the area of activity of an address publisher, allowing it to transmit address data for credit scoring purposes. The use of the data had resulted in credit scores which had then been used as a basis for denying data subjects cell phone contracts and electricity contracts.
CRIF’s argument did not convince the Austrian Data Protection Authority (DSB), which concluded that the processing was non-compliant with the GDPR. However, the authority did not prohibit the processing in and of itself, instead referring to a general official prohibition procedure which is still pending.
Supporting new technology while maintaining privacy
We need to keep moving forward, developing new solutions to ensure progress in the credit area, and we need to do so while maintaining personal integrity. The credit risk assessment practices described above can entail high risks for the rights and freedoms of data subjects, so it is important to be aware of these risks. This caution applies both to credit agencies and other credit scoring firms, and to the financial institutions that use the credit scores when making decisions affecting individuals. Close attention should therefore be given to the privacy issues, and we need to support advancement by focusing on what to do, rather than only pointing out what not to do.
What key activities need to be addressed?
- Firstly, and maybe most importantly, we need to ensure transparency concerning the data processing. This includes providing data subjects with all the information they need in order to make informed decisions about the use of their personal data (see the sketch after this list).
- Secondly, we need to ensure data accuracy by controlling the data source and the logic behind any algorithm used. Established processes for carrying out Data Protection Impact Assessments (DPIAs) will support this work. Further, ensuring privacy by design and by default will mitigate the risk of data breaches.
- Finally, we need to conduct well-founded balancing-of-interests assessments to ensure that data subjects’ interest in privacy is duly taken into consideration when new technology is used for credit risk assessments.
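As a minimal illustration of the transparency point above, a scoring service could return, alongside each score, a machine-readable breakdown of which factors contributed and by how much. This sketch reuses the hypothetical weighted model from earlier; the payload fields are our assumptions, not a prescribed GDPR format:

```python
from dataclasses import dataclass

@dataclass
class ScoreExplanation:
    """Hypothetical explanation payload accompanying a credit score."""
    score: float
    contributions: dict[str, float]  # factor -> weighted contribution
    data_sources: list[str]          # where each input was obtained

def explain_score(factors: dict[str, float],
                  weights: dict[str, float],
                  sources: list[str]) -> ScoreExplanation:
    """Compute the score together with per-factor contributions, so a data
    subject can see what drove the result and where the data came from."""
    contributions = {name: weight * factors.get(name, 0.0)
                     for name, weight in weights.items()}
    # Same illustrative rescaling as in the earlier sketch.
    score = (sum(contributions.values()) + 0.45) * 1000
    return ScoreExplanation(score, contributions, sources)
```

Surfacing the contributions rather than only the final number is one way to provide the “meaningful information about the logic involved” that Art. 15(1)(h) GDPR refers to, without necessarily disclosing the full model as a trade secret.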
A long-term vision and strategy for personal data processing, combined with clear step-by-step processes to rely on, is what every organisation needs in order to establish GDPR-compliant and sustainable processing.
5 Years of GDPR
May 25th, 2023, marks the five-year anniversary of the GDPR becoming enforceable. This spring, we reflect on and review the first comprehensive privacy regulation in a series of publications and events. Stay tuned for insights and perspectives on expectations vs. realities of a sustainable privacy arena, the legal ecosystem of the GDPR, the future role of tech, and much more.