{"id":111576,"date":"2025-12-08T15:55:14","date_gmt":"2025-12-08T14:55:14","guid":{"rendered":"https:\/\/www.hiig.de\/?p=111576"},"modified":"2025-12-08T16:34:13","modified_gmt":"2025-12-08T15:34:13","slug":"automated-credit-lending","status":"publish","type":"post","link":"https:\/\/www.hiig.de\/en\/automated-credit-lending\/","title":{"rendered":"The Human in the Loop in automated credit lending \u2013 Human expertise for greater fairness"},"content":{"rendered":"\n<p><strong>Not every credit decision can be left to machines. Although banks use automated assessment systems to save time, real-life situations are often too complex for purely algorithmic models. This is precisely where human expertise becomes essential. In our report, &#8216;Human in the Loop in Credit Decision-Making&#8217;, we demonstrate how front-desk staff, risk analysts, and external agencies collaborate with automated models to evaluate creditworthiness. Our analysis highlights why human judgement is indispensable in credit lending, particularly in ambiguous cases. The question is: how can human and machine decision-making interact to make credit assessments fairer, more transparent and easier to understand?<\/strong><\/p>\n\n\n\n<p>In the digital age, automated processes that make decisions with or without human intervention are becoming increasingly common. These processes are based on algorithms, artificial intelligence (AI) or rule-based systems, and are used in areas such as healthcare and finance, including for loan approvals. The first case study of our <a href=\"https:\/\/www.hiig.de\/en\/project\/human-in-the-loop\/\" target=\"_blank\" rel=\"noreferrer noopener\">&#8216;Human in the Loop?&#8217; research project<\/a>, examines precisely this area. Autonomy and Automation in Socio-Technical Systems, examines precisely this area. How do automated processes and human actors collaborate in the context of lending decisions? Who is responsible for oversight? 
Who ensures the quality of decisions?<\/p>\n\n\n\n<p>The aim of lending is to evaluate applications efficiently and fairly. First, algorithms analyse data such as income, credit history, existing debts and repayment behaviour. This initial check is typically performed by third-party providers, such as credit bureaus. The resulting credit scores are then fed into the bank&#8217;s internal traffic-light model, which calculates the credit default risk. If the risk is within a predefined range, the application is either approved or rejected immediately. In unclear cases, risk analysts review the financial data and, if necessary, make a different decision.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Humans and machines working together<\/strong><\/h2>\n\n\n\n<p>This interaction between humans and machines can bring considerable advantages. Automated systems can process large amounts of data quickly and consistently, while human decision-making allows for flexibility and the consideration of individual circumstances. At the same time, automation reduces emotional or subjective influences, which can lead to more objective lending decisions. Humans, in turn, can correct misjudgements by considering aspects that algorithms overlook, such as sudden changes in income due to parental benefits, or alternative collateral, such as property. This prevents overly strict system rules from leading to unjustified rejections. However, for this interaction to lead to fair and well-founded decisions, the algorithms used must be critically examined, distortions must be identified, and human assessments must be used in a targeted manner. Only then can the system remain both economically efficient and socially just.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>The Human in the Loop project<\/strong><\/h2>\n\n\n\n<p>The key question of our project <a href=\"https:\/\/www.hiig.de\/en\/project\/human-in-the-loop\/\">&#8216;Human in the Loop? 
Autonomy and Automation in Socio-Technical Systems&#8217;<\/a>, based at the Alexander von Humboldt Institute for Internet and Society (HIIG), is: How can humans and machines collaborate effectively to exploit the advantages of automation without losing important human skills and values? Funded by the Mercator Foundation, we are examining this topic through various case studies. In this blog post, we present key findings from a <a href=\"https:\/\/www.hiig.de\/publication\/hilo-kreditvergabe-praxisbericht\/\" target=\"_blank\" rel=\"noreferrer noopener\">practical report on credit lending<\/a>, analysing how lending decisions are made in practice, from the initial consultation to risk assessment, and examining the importance of human expertise and automation in this environment.<\/p>\n\n\n\n<p>It is important to note that the procedures currently in widespread use are predominantly based on rigid if-then rule-based systems. Modern, adaptive AI solutions, on the other hand, are not yet widely used in creditworthiness assessments. Instead, banks use deterministic systems supplemented by human experience and expertise. The human contribution remains indispensable, especially in special cases where automated processes reach their limits.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Humans in the Loop: Front-desk staff and risk analysts<\/strong><\/h2>\n\n\n\n<p>The concept of a &#8216;human in the loop&#8217;, whereby a single person monitors and controls an automated system, does not reflect the reality of lending. Within a bank, several people are usually involved at different stages of the decision-making process. They often not only passively monitor the results of automated systems, but also actively intervene in the decision-making process. A key finding of the project is that human actors perform a variety of functions in lending; two stand out in particular:<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">1. 
Front-desk staff as the first point of contact<\/h3>\n\n\n\n<p>Front-desk staff are the first point of contact for customers. They accept loan applications and guide customers through the application process. Their responsibilities extend well beyond merely recording data. They provide advice and support to applicants, help them to avoid input errors and, if necessary, forward applications to risk analysts. In practice, creditworthiness checks often use a traffic light system: green means a positive credit decision and red means rejection. If the signal is yellow, indicating an unclear recommendation, the case is forwarded to the risk analysts. Front-desk employees have access to credit scores and other relevant data, but they do not have any decision-making authority in cases where the data is clear. Their role is therefore primarily advisory and coordinating, and they only make decisions in exceptional cases involving individual special solutions.<\/p>\n\n\n\n<p>The following example from our interviews illustrates this:<\/p>\n\n\n\n<div style=\"height:20px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p><em>\u201cI&#8217;ve had this happen: parents on parental leave. Looking back, I\u2019d never have thought that we could offer them a mortgage. But then I spoke to a caseworker who said, &#8216;That&#8217;s understandable. Just process it manually.\u2019 As long as a human factor is involved, decisions outside the standard are possible.\u201d<\/em><em><br><\/em> \u2014 Ren\u00e9 Stephan, Business Customer Advisor<\/p>\n<\/blockquote>\n\n\n\n<h3 class=\"wp-block-heading\">2. Risk analysts as experts in credit assessment<\/h3>\n\n\n\n<p>Risk analysts take over the review of loan applications whenever the automated system issues a yellow signal, i.e. an ambiguous recommendation. 
They review each case individually and make the final decision on whether to grant the loan. These experts have in-depth knowledge of financial data and often have many years of experience in evaluating loan applications. This enables them to identify and, if necessary, manually correct deviations from standardised assessments.<\/p>\n\n\n\n<div style=\"height:20px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p><em>\u201cOf course we use a rating system to assess creditworthiness. It\u2019s required by regulation. But at the end of the day, it\u2019s humans who make the decision.\u201d<\/em><em><br><\/em> \u2014 Credit Risk Management Expert<\/p>\n<\/blockquote>\n\n\n\n<p>Risk analysts play a key role in preventing bad decisions by ensuring that individual circumstances, which standardised systems cannot properly assess, are taken into account in the final decision.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Factors influencing human decision-making<\/strong><\/h2>\n\n\n\n<p>However, neither front-desk employees nor risk analysts operate in isolation when making decisions. Their assessments are influenced by various factors, ranging from economic conditions and the internal guidelines of credit institutions to individual experiences and judgements. We identified these factors, which shape the quality of decisions, through interviews with various stakeholders. 
These factors can be divided into three dimensions:<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"849\" height=\"1200\" src=\"https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/12\/HiLo-Graphik-Kreditvergabe_page-0001-849x1200.jpg\" alt=\"\" class=\"wp-image-111712\" srcset=\"https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/12\/HiLo-Graphik-Kreditvergabe_page-0001-849x1200.jpg 849w, https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/12\/HiLo-Graphik-Kreditvergabe_page-0001-566x800.jpg 566w, https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/12\/HiLo-Graphik-Kreditvergabe_page-0001-42x60.jpg 42w, https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/12\/HiLo-Graphik-Kreditvergabe_page-0001-768x1085.jpg 768w, https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/12\/HiLo-Graphik-Kreditvergabe_page-0001-127x180.jpg 127w, https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/12\/HiLo-Graphik-Kreditvergabe_page-0001-410x579.jpg 410w, https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/12\/HiLo-Graphik-Kreditvergabe_page-0001-35x50.jpg 35w, https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/12\/HiLo-Graphik-Kreditvergabe_page-0001-255x360.jpg 255w, https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/12\/HiLo-Graphik-Kreditvergabe_page-0001-1087x1536.jpg 1087w, https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/12\/HiLo-Graphik-Kreditvergabe_page-0001-1449x2048.jpg 1449w, https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/12\/HiLo-Graphik-Kreditvergabe_page-0001-1320x1866.jpg 1320w, https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/12\/HiLo-Graphik-Kreditvergabe_page-0001-scaled.jpg 1811w\" sizes=\"auto, (max-width: 849px) 100vw, 849px\" \/><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">1. External influencing factors<\/h3>\n\n\n\n<p>These include various social, economic and legal conditions. 
Shortages of skilled workers, banks&#8217; economic goals, and legal requirements such as the General Equal Treatment Act (AGG) influence the decision-making process, affecting the scope, transparency, fairness, and quality of decisions and processes. Data quality also plays a decisive role: the data sets used to train automated systems and supplied by external credit agencies, such as Schufa, must be complete and traceable.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">2. Influencing factors at the level of the actors<\/h3>\n\n\n\n<p>Several factors influence the quality of credit decisions at this level: the actors&#8217; understanding of their roles and professions, personal contact, and the scope available for decision-making. Prejudices against specific life circumstances or subjective assessments by individual employees can impair objective evaluation. At the same time, experts emphasise that, when it comes to lending, the human ability to understand special cases and individual life situations represents significant added value.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">3. Technical influencing factors<\/h3>\n\n\n\n<p>The available technology also influences how people make decisions. Decisive factors here include the underlying data, the transparency and traceability of the system, and the design of the user interface, i.e. whether it enables intuitive, efficient and aesthetically appealing interaction between humans and machines. A poorly designed interface can cause users to overlook important information or enter it incorrectly. Transparency of algorithms is also essential, so employees can understand how and why the system arrives at a particular recommendation and question it if necessary. 
Added to this is the error culture that a company cultivates when dealing with automated processes.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Challenges in the interplay between humans and machines<\/strong><\/h2>\n\n\n\n<p>But how can we ensure that automated processes remain fair and transparent? Where do risks arise, and what measures are necessary to minimise them? Our analysis in the practical report reveals several challenges that need to be addressed.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Communication problems and lack of overall process knowledge<\/h3>\n\n\n\n<p>One of the key challenges is that neither individuals nor institutions have a complete overview of the entire automated decision-making process. Insufficient understanding of the system architecture \u2014 i.e. how decisions are made in individual cases, which algorithms are used and how data flows \u2014 makes it difficult to recognise important connections. This becomes particularly problematic when human intervention is necessary to correct special cases.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Lack of transparency<\/h3>\n\n\n\n<p>Another point of criticism is the lack of transparency, both at credit institutions and at credit agencies such as Schufa and Creditreform. Consumers often have little idea why a credit decision has been made. This makes it difficult for them to realistically assess their own creditworthiness and make the necessary adjustments. At the same time, employees often lack sufficient information about how the automated system works, leading to uncertainty and potential misjudgements.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Discrimination and biases<\/h3>\n\n\n\n<p>Discrimination also plays an important role in lending. Current legislation, such as the General Equal Treatment Act (AGG), does not offer consumers adequate protection against unfair discrimination. Furthermore, there is a lack of effective mechanisms for investigating allegations of discrimination in court. 
There is a discrepancy between the theoretical neutrality of algorithms and the fact that their design and operation are influenced by human biases and experience.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Uncertainty in dealing with automation technologies<\/h3>\n\n\n\n<p>Many of the employees involved lack sufficient technical understanding of the systems used. Our interviews revealed that some employees are unclear as to whether rule-based systems or AI-based applications are used in lending, and what limitations these systems have. This lack of technical know-how carries the risk that the system&#8217;s recommendations will either be accepted uncritically or second-guessed excessively.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Recommendations for improving credit decision-making<\/strong><\/h2>\n\n\n\n<p>Based on these findings, the practical report sets out specific suggestions on how to strengthen the role of the &#8216;human in the loop&#8217; and improve the entire decision-making process.<\/p>\n\n\n\n<div style=\"height:20px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<div class=\"wp-block-group\"><div class=\"wp-block-group__inner-container is-layout-constrained wp-block-group-is-layout-constrained\">\n<div class=\"wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex\">\n<h3 class=\"wp-block-heading\">1. Expanding anti-discrimination law (AGG)<\/h3>\n\n\n\n<p>To combat discrimination in the lending sector more effectively, the legal framework should be expanded. Extending the AGG to consumer credit would make it easier for affected individuals to take legal action against unjustified rejections. This could be achieved by reversing the burden of proof and providing increased support from anti-discrimination associations. This would incentivise banks to implement discrimination-sensitive processes.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">2. 
Increasing transparency<\/h3>\n\n\n\n<p>Greater transparency on the part of credit institutions and credit agencies would be helpful. Consumers should be informed about the relevant decision-making factors in easily understandable language. From 2026, this will likely become legally binding due to the revision of the European Consumer Credit Directive. Additionally, banks could enhance internal communication to ensure that all stakeholders, from front-desk staff to risk analysts, clearly understand the decision-making logic and limitations of the system.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">3. Improving financial literacy<\/h3>\n\n\n\n<p>In addition to institutional measures, educating individual consumers also plays an important role. Targeted educational programmes can help applicants to assess their creditworthiness more realistically and to enter their financial data correctly during the application process. Interactive explanatory formats and targeted training courses are useful for explaining how to deal with credit decisions in an understandable, everyday context.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">4. Training and professional development<\/h3>\n\n\n\n<p>To increase the effectiveness of human involvement, front-desk employees and, in particular, risk analysts should receive regular training in technical and procedural issues. The aim is to impart a technical understanding of the systems used and to develop the ability to question their limitations critically.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">5. User-friendly application design<\/h3>\n\n\n\n<p>The increasing automation of the application process must not mean that the success or failure of an application depends on specific skills that consumers are expected to acquire and for which they are then held responsible. 
Intuitive, accessible application interfaces and personal contact options are important in ensuring that people without in-depth technical knowledge can also successfully apply for a loan. These two factors together would ensure that individual circumstances are adequately considered and that special cases are not overlooked due to rigid, automated processes.<\/p>\n<\/div>\n<\/div><\/div>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Outlook: Human expertise as a guarantee of fair decisions<\/strong><\/h2>\n\n\n\n<p>Automating lending processes can speed things up and reduce errors. However, human expertise remains indispensable, particularly in sensitive areas that directly impact people&#8217;s lives. Our case study clearly shows that the combination of human judgement and automated data processing creates real added value, provided the relevant influencing factors are understood and controlled effectively.<\/p>\n\n\n\n<p>Although automated systems based on rules are now standard, humans remain an integral part of the decision-making process. Employees&#8217; expertise and experience supply what algorithms cannot. In special cases in particular, this difficult-to-quantify information and the human factor can mean the difference between a mechanical and a responsible decision.<\/p>\n\n\n\n<p>At the same time, however, our study also reveals significant challenges. There is often a lack of comprehensive understanding of processes, both within banks and in communication with customers. Insufficient communication channels and technical knowledge can lead to automated recommendations being misinterpreted or implemented incorrectly. Furthermore, discrimination and biases pose risks that must be urgently addressed as automation increases.<\/p>\n\n\n\n<p>Automation should not be an end in itself; rather, it should be understood as a tool that, when used correctly, can complement and strengthen human judgement. 
Only in this way can banks and credit institutions continue to provide responsible, non-discriminatory lending services in future.<\/p>\n\n\n\n<div style=\"height:20px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<p><em>This article was first published in German on April 15, 2025, on the <a href=\"https:\/\/zevedi.de\/efinblog-automatisierte-kreditvergabe-menschliche-expertise-fur-grossere-fairness\/\">eFin blog<\/a> of the discourse project \u201cDemocracy issues of the digitalised financial sector\u201d (eFin &amp; Democracy) at the <a href=\"https:\/\/zevedi.de\/en\/\">Center Responsible Digitality (ZEVEDI)<\/a>.<\/em><\/p>\n\n\n\n<div style=\"height:20px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Full report<\/strong><\/h3>\n\n\n\n<p>Z\u00fcger, T., Mahlow, P., Mosene, K., &amp; Pothmann, D. (2025). <a href=\"https:\/\/www.hiig.de\/publication\/hilo-kreditvergabe-praxisbericht\/\" target=\"_blank\" rel=\"noreferrer noopener\"><em>Praxisbericht: Human in the Loop im Feld der Kreditvergabe<\/em><\/a> [Praxisbericht f\u00fcr den Sektor Finanzdienstleistung], Alexander von Humboldt Institut f\u00fcr Internet und Gesellschaft (HIIG).<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Further information<\/strong><\/h3>\n\n\n\n<p>on the website of our <a href=\"https:\/\/www.hiig.de\/en\/project\/human-in-the-loop\/\" target=\"_blank\" rel=\"noreferrer noopener\">research project <em>Human in the Loop?<\/em><\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>How fair is automated credit lending? Where is human expertise essential?<\/p>\n","protected":false},"author":313,"featured_media":111891,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1289,1577,227,1579,224],"tags":[],"class_list":["post-111576","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-artificial-intelligence","category-digital-so","category-everyday-life","category-ftif-plattformen-governance","category-policy-and-law"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v26.6 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>The Human in the Loop in automated credit lending &#8211; Digital Society Blog<\/title>\n<meta name=\"description\" content=\"How fair is automated credit lending? 
Our case study shows why not every decision can be left to machines and where human expertise is essential.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.hiig.de\/en\/automated-credit-lending\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"The Human in the Loop in automated credit lending &#8211; Digital Society Blog\" \/>\n<meta property=\"og:description\" content=\"How fair is automated credit lending? Our case study shows why not every decision can be left to machines and where human expertise is essential.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.hiig.de\/en\/automated-credit-lending\/\" \/>\n<meta property=\"og:site_name\" content=\"HIIG\" \/>\n<meta property=\"article:published_time\" content=\"2025-12-08T14:55:14+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-12-08T15:34:13+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/12\/Titelbild_Sonja-\u2013-12.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1144\" \/>\n\t<meta property=\"og:image:height\" content=\"643\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Digital Society Blog\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Digital Society Blog\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"13 minutes\" \/>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"The Human in the Loop in automated credit lending &#8211; Digital Society Blog","description":"How fair is automated credit lending? 
Our case study shows why not every decision can be left to machines and where human expertise is essential.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.hiig.de\/en\/automated-credit-lending\/","og_locale":"en_US","og_type":"article","og_title":"The Human in the Loop in automated credit lending &#8211; Digital Society Blog","og_description":"How fair is automated credit lending? Our case study shows why not every decision can be left to machines and where human expertise is essential.","og_url":"https:\/\/www.hiig.de\/en\/automated-credit-lending\/","og_site_name":"HIIG","article_published_time":"2025-12-08T14:55:14+00:00","article_modified_time":"2025-12-08T15:34:13+00:00","og_image":[{"width":1144,"height":643,"url":"https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/12\/Titelbild_Sonja-\u2013-12.png","type":"image\/png"}],"author":"Digital Society Blog","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Digital Society Blog","Est. 
reading time":"13 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.hiig.de\/en\/automated-credit-lending\/#article","isPartOf":{"@id":"https:\/\/www.hiig.de\/en\/automated-credit-lending\/"},"author":{"name":"Digital Society Blog","@id":"https:\/\/www.hiig.de\/#\/schema\/person\/a921ecfdfcb94cb9c718b90c3a5dedbd"},"headline":"The Human in the Loop in automated credit lending \u2013 Human expertise for greater fairness","datePublished":"2025-12-08T14:55:14+00:00","dateModified":"2025-12-08T15:34:13+00:00","mainEntityOfPage":{"@id":"https:\/\/www.hiig.de\/en\/automated-credit-lending\/"},"wordCount":2355,"publisher":{"@id":"https:\/\/www.hiig.de\/#organization"},"image":{"@id":"https:\/\/www.hiig.de\/en\/automated-credit-lending\/#primaryimage"},"thumbnailUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/12\/Titelbild_Sonja-\u2013-12.png","articleSection":["Artificial Intelligence","Digital Society Blog","Everyday Life","Ftif Platform governance","Policy and Law"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/www.hiig.de\/en\/automated-credit-lending\/","url":"https:\/\/www.hiig.de\/en\/automated-credit-lending\/","name":"The Human in the Loop in automated credit lending &#8211; Digital Society Blog","isPartOf":{"@id":"https:\/\/www.hiig.de\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.hiig.de\/en\/automated-credit-lending\/#primaryimage"},"image":{"@id":"https:\/\/www.hiig.de\/en\/automated-credit-lending\/#primaryimage"},"thumbnailUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/12\/Titelbild_Sonja-\u2013-12.png","datePublished":"2025-12-08T14:55:14+00:00","dateModified":"2025-12-08T15:34:13+00:00","description":"How fair is automated credit lending? 
Our case study shows why not every decision can be left to machines and where human expertise is essential.","breadcrumb":{"@id":"https:\/\/www.hiig.de\/en\/automated-credit-lending\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.hiig.de\/en\/automated-credit-lending\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.hiig.de\/en\/automated-credit-lending\/#primaryimage","url":"https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/12\/Titelbild_Sonja-\u2013-12.png","contentUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/12\/Titelbild_Sonja-\u2013-12.png","width":1144,"height":643,"caption":"How fair is automated credit lending? Our case study reveals why not every decision can be left to machines and where human expertise becomes essential."},{"@type":"BreadcrumbList","@id":"https:\/\/www.hiig.de\/en\/automated-credit-lending\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.hiig.de\/en\/"},{"@type":"ListItem","position":2,"name":"The Human in the Loop in automated credit lending \u2013 Human expertise for greater fairness"}]},{"@type":"WebSite","@id":"https:\/\/www.hiig.de\/#website","url":"https:\/\/www.hiig.de\/","name":"HIIG","description":"Alexander von Humboldt Institute for Internet and 
Society","publisher":{"@id":"https:\/\/www.hiig.de\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.hiig.de\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.hiig.de\/#organization","name":"HIIG","url":"https:\/\/www.hiig.de\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.hiig.de\/#\/schema\/logo\/image\/","url":"https:\/\/www.hiig.de\/wp-content\/uploads\/2019\/06\/hiig.png","contentUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2019\/06\/hiig.png","width":320,"height":80,"caption":"HIIG"},"image":{"@id":"https:\/\/www.hiig.de\/#\/schema\/logo\/image\/"}},{"@type":"Person","@id":"https:\/\/www.hiig.de\/#\/schema\/person\/a921ecfdfcb94cb9c718b90c3a5dedbd","name":"Digital Society Blog"}]}},"_links":{"self":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/111576","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/users\/313"}],"replies":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/comments?post=111576"}],"version-history":[{"count":21,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/111576\/revisions"}],"predecessor-version":[{"id":111942,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/111576\/revisions\/111942"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/media\/111891"}],"wp:attachment":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/media?parent=111576"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/categories?post=111576"},{"taxonom
y":"post_tag","embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/tags?post=111576"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}