
KDU Law Journal                                  Volume 04 Issue II
                                                               September, 2024
              quite easily, but it was the poor who were really losing out.
              It was therefore found that COMPAS is biased against black
              defendants. The reason is that the data entered into the AI tool
              dates back to the 1950s, 1960s and 1970s. The decisions were
              therefore labelled as biased against blacks.18
              The analysis revealed that black defendants were more likely to
              be tagged as repeat offenders than white defendants, indicating
              the likelihood that the algorithm may be biased. This is
              especially problematic given that the assessments were used by
              the courts to determine matters such as bail and release dates.
              Therefore, such AI-based systems and their coded algorithms must
              be examined carefully to ensure that bias is minimised and
              that, if bias does arise in some form, it is assessed formally.19
              However, if AI is to be implemented successfully, it will be
              important to analyse its challenges and potential dangers
              carefully. The challenge of Co-Robotics in the judiciary is
              facilitating functioning communication between humans and
              machines. There are two approaches to this challenge: in order
              to retain human control, one must either enable functioning
              communication between humans and machines (addressing the
              Co-Robotics problem) or strictly separate them from each other
              (avoiding the Co-Robotics problem).20 A further challenge is the
              lack of transparency in how these tools operate. To solve issues
              with data access, privacy, bias, and reluctance to change,
              policymakers and legal experts should collaborate.

              It has been noted that the use of Robotic Intelligence-based
              systems appears to eliminate human bias; for example, “In a
              high-crime city, a judge might start to hand out harsher
              sentences towards the upper end of the sentencing guidelines.
              In court, if a judge does

              18  ibid.
              19  Justice Shiranee Tilakawardane (Retired Judge of the Supreme Court of Sri Lanka),
              ‘Artificial Intelligence in the Legal System’ (Judges Journal Vol V) pp. 3-4.
              20  Neil M Richards and William D Smart, ‘How Should the Law Think About Robots?’ (Robot
              Law Publishing 2016) p. 13.
               law.faculty@kdu.ac.lk