Artificial Intelligence (AI) is revolutionizing industries across the globe, and the legal profession is no exception. With promises of unprecedented efficiency, improved access to justice, and cost-effectiveness, AI tools are quickly becoming integral to law practices. The growing use of AI in legal work, however, brings not only exciting potential but also significant ethical challenges that demand careful governance.

The Double-Edged Sword of AI in Law

AI offers powerful tools to enhance legal practices, but it also introduces risks that could undermine fairness and justice. On one hand, AI can automate mundane tasks like document review and legal research, allowing lawyers to focus on more complex and strategic work. It can also level the playing field for smaller law firms by providing access to advanced tools that were once reserved for large, well-funded firms. For clients, AI has the potential to democratize legal services, making high-quality legal assistance more affordable and accessible.

However, these advantages come with considerable ethical concerns, such as bias in AI algorithms, data privacy issues, and a lack of transparency in AI decision-making processes. These challenges can impact the integrity of legal work, client trust, and the overall credibility of the justice system.

Key Ethical Concerns in AI Use in Legal Practice

  1. Bias and Fairness: One of the most pressing risks in AI deployment is bias. AI systems learn from historical data, which may contain inherent biases. These biases can manifest in the form of discriminatory outcomes, such as suggesting harsher sentences based on biased historical legal data. Legal practitioners must be vigilant about the potential for AI to perpetuate systemic injustices, particularly when using predictive analytics in cases involving sentencing or employment discrimination.
  2. The Black Box Problem: Some AI systems, particularly those using deep learning, operate as “black boxes,” meaning they provide results without transparency into the reasoning behind those conclusions. This lack of clarity is problematic in law, where accountability and justification are paramount. Lawyers must ensure that AI-generated outputs are explainable, particularly when used in critical legal decisions.
  3. Data Security and Confidentiality: Legal professionals handle sensitive, confidential information. The use of AI tools in legal workflows raises concerns about data security and privacy breaches. Legal firms must ensure that AI tools comply with data protection regulations and that robust safeguards are in place to protect client information from unauthorized access.
  4. AI Hallucinations and Inaccuracy: AI systems can generate plausible yet incorrect information, known as “hallucinations.” For example, AI may produce fictitious case law citations that appear legitimate but are entirely fabricated. This poses a significant risk in legal practice, where inaccurate or misleading information can have severe consequences for clients, courts, and the integrity of legal proceedings.

Building a Governance Framework for AI in Legal Practice

Given these risks, it’s essential for law firms and legal practitioners to develop robust AI governance frameworks that ensure responsible and ethical AI usage. This includes:

  • Guiding Development and Deployment: AI tools used in legal practice must be designed with fairness, transparency, and accountability in mind. Developers should collaborate with legal professionals to create systems that meet ethical standards and regulatory requirements.
  • Responsible Use and Compliance: Lawyers must understand the limitations of AI and use it as an assistant rather than a replacement for human judgment. They must also ensure compliance with data protection laws, maintain client confidentiality, and safeguard sensitive data when using AI tools.
  • Regular Audits and Oversight: Continuous monitoring of AI usage in legal practices is crucial to identifying and mitigating risks. Regular audits help ensure that AI tools are used appropriately and in compliance with ethical and legal standards.
  • Training and Awareness: Legal practitioners should be well-trained in the capabilities and limitations of AI technologies. This ensures that AI tools are used responsibly and ethically, and that lawyers can confidently integrate them into their practice without compromising client trust or professional standards.

The Role of Legal Professionals in AI Governance

While AI tools can enhance the efficiency of legal services, they cannot replace the need for human oversight. Lawyers have a professional duty to ensure the accuracy and reliability of AI-generated outputs. This includes verifying AI-generated information, ensuring transparency in decision-making, and safeguarding the confidentiality of client data.

Legal ethics also play a crucial role in AI adoption. Lawyers are bound by principles such as competence, confidentiality, integrity, and transparency. They must ensure that AI tools align with these ethical standards, both in terms of their development and their use within the practice.

Conclusion: Embracing AI with Responsibility

AI is transforming the legal profession, offering new opportunities for efficiency, accessibility, and innovation. However, to harness its full potential, legal practitioners must carefully navigate the ethical considerations associated with AI use. By establishing clear governance frameworks, ensuring transparency, and adhering to ethical principles, lawyers can leverage AI to enhance their practices while safeguarding justice, fairness, and client trust.

For a deeper exploration of these ethical considerations and best practices in AI governance for law practices, you can read my full paper on AI Governance: Ethical Considerations in the Transformative Use of AI in Your Law Practice here. This paper was initially presented at the 2024 Annual Jamaica Bar Association Flagship Conference.

On 30th November 2021, the Government of Jamaica, through publication in the Jamaica Gazette, enacted sections 2, 4, 56, 57, 60, 66, 74 and 77, and the First Schedule, of the Data Protection Act 2020, with an operative date of 1st December 2021. A week later, local news outlets reported that the Governor General had also appointed an Information Commissioner, Ms. Celia Barclay, also with an effective date of 1st December 2021. These developments have the primary effect of:

  1. Establishing the Office of the Information Commissioner with certain powers, duties and responsibilities as conferred under the Act;
  2. Commencing the two year transitional period stipulated in section 76 of the Act; and
  3. Effecting immediate obligations & data standards for data that can’t be processed automatically, or that does not form a part of a structured filing system.

The Office of the Information Commissioner

The sections of the act brought into operation with the gazette notice, primarily apply to the establishment of the role and office of the Information Commissioner. With these enactments, the duties & responsibilities of the Commissioner are now operational. In particular, the Commissioner is to establish procedures and make regulations to give effect to the provisions of the act and create a data sharing code after consultation with industry stakeholders. Additionally, the published notice officially conferred to the Commissioner the duty to prepare reports & guidelines for parliament; to adhere to regulations for international co-operation; and to maintain confidentiality of information in her role. The newly appointed Information Commissioner, Ms. Celia Barclay brings to her role a wealth of legal & regulatory experience with over fourteen years at the bar and  over seven years in public service.

Commencement of the Transitional Period for Data Controllers

The Act directs controllers to take all necessary measures to ensure compliance with the provisions of the Act and the standards articulated therein for a period of two years after the earliest date of enactment. During this transitional period, no proceedings may be brought against a data controller for any processing done in good faith. Data controllers therefore have until 30th November 2023 to reform their data processing practices and ensure that they comply with the provisions of the Data Protection Act.

Immediately Effective Standards & Obligations

As of the earliest effective date of the Act, being December 1st 2021, any personal data that is held in a way that:

  1. does not allow the data to be processed automatically; or
  2. is not a part of a filing system where the information is structured (either by a reference to individuals or by reference to criteria relating to individuals) in a way that allows specific information relating to a particular individual to be readily accessible;

shall be subject to certain obligations under the Act. In particular, any such data must adhere to the following data standards in accordance with the Act:

  1. The personal data shall be processed fairly and lawfully;
  2. The personal data shall be obtained only for one or more specified and lawful purposes, and shall not be further processed in any manner incompatible with those purposes;
  3. The personal data shall be adequate, relevant, and limited to what is necessary for the purposes for which they are processed;
  4. The personal data processed for any purpose shall not be kept for longer than is necessary for that purpose;
  5. Appropriate technical and organisational measures shall be taken— (a) against unauthorised or unlawful processing of personal data and against accidental loss or destruction of, or damage to, personal data; and (b) to ensure that the Commissioner is notified, without any undue delay, of any breach of the data controller’s security measures which affect or may affect any personal data;
  6. The personal data shall not be transferred to a State or territory outside of Jamaica unless that State or territory ensures an adequate level of protection for the rights and freedoms of data subjects in relation to the processing of personal data; and
  7. The personal data shall be processed in accordance with the rights of data subjects conferred under the Act, with the exception of the right to access and the right to request rectification of inaccuracies.

In addition to this, Controllers processing the data falling within this category are required to:

  1. Obtain consent for any direct marketing in accordance with the Act;
  2. Adhere to written requests for the prevention or cessation of processing in accordance with the Act;
  3. Respect the rights conferred on data subjects with regard to automated decision making;
  4. Meet registration requirements with the Information Commissioner; and
  5. Where applicable, appoint a data protection officer.

Notwithstanding this enactment, without the establishment of a formal registration process within the Office of the Information Commissioner, it is unlikely that these provisions will be immediately enforced. Moreover, where a data controller can demonstrate that it has been processing data in good faith during this transitional period, no proceedings may be brought against it under the Act.


Background

Just over a week ago, the renowned online technology news site TechCrunch released a shocking article revealing a major security failure that resulted in the possible exposure of the private information of thousands of travellers to Jamaica. Within a week of the first vulnerability being exposed, TechCrunch uncovered not one but two additional security vulnerabilities, which led to the website finally being taken offline.

Based on the initial published report, the JamCovid App and website, which are used (i) to pre-approve travellers to the country, (ii) to facilitate self-reporting of COVID-19 symptoms and (iii) to aggregate and publish periodic COVID-19 statistics for the Ministry of Health, were built and developed by the Amber Group for the benefit of the Jamaican government.

The type of data collected by the JamCovid App appears to be:

  1. Names
  2. Emails
  3. Phone Number
  4. Addresses
  5. Passport Numbers
  6. Dates of Birth
  7. Nationality
  8. Name of Employers
  9. Job Title/Position
  10. Photographs
  11. Flight information
    • Airline
    • Date of arrival
    • Date of departure
    • Flight Number
    • Port of Disembarkment
  12. Cookies and Usage Data
  13. Health Information including temperature readings and symptoms submitted by travellers and self-reporters
  14. Travel Authorization Reference Numbers
  15. Geo-Location Information

And based on the TechCrunch article, also included:

  1. Images of travellers’ signatures;
  2. Lab results; and
  3. Quarantine orders.

Source of information: Screenshots from the JamCovid App and The JamCovid Privacy Policy

According to the first report, published by Zack Whittaker of TechCrunch, a storage server hosted on Amazon Web Services, which stored uploaded documents and information, was set to public.

The Vulnerability In Perspective- A Technical Summary

The “storage server” referenced by TechCrunch is an Amazon Simple Storage Service (Amazon S3) bucket. Amazon S3 is a cloud-based object storage service built for storing and retrieving any amount of data from anywhere over the internet. It can be used via a user-friendly web interface or a well-documented REST API.

Think of the Amazon S3 service as a suitcase that you pack before you travel. The suitcase is the “bucket” and each item you put in it is an “object”. When using Amazon S3, a bucket must first be created with specific permissions before you can start using it to store data in the form of objects. In this case the bucket was set to “public”, which means anyone in the world could access the data/objects stored in it. This major oversight is akin to travelling with a suitcase that has no zippers to secure its contents.
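To illustrate how little it takes to close the suitcase, modern S3 deployments can disable every form of public access with a single bucket-level setting. The sketch below uses the AWS SDK for Python (boto3); the bucket name is hypothetical and the code is only an illustration of the API, not a description of how the JamCovid bucket was actually configured.

```python
# Minimal sketch (hypothetical bucket name); assumes boto3 is installed
# and valid AWS credentials are available when actually run.

# The four settings that together block every form of public access.
PUBLIC_ACCESS_BLOCK = {
    "BlockPublicAcls": True,        # reject new public ACLs
    "IgnorePublicAcls": True,       # ignore any existing public ACLs
    "BlockPublicPolicy": True,      # reject public bucket policies
    "RestrictPublicBuckets": True,  # restrict access to authorized principals
}


def lock_down_bucket(bucket_name: str) -> None:
    """Apply the public-access block configuration to a bucket."""
    import boto3  # imported here so the module-level dict stays pure Python

    s3 = boto3.client("s3")
    s3.put_public_access_block(
        Bucket=bucket_name,
        PublicAccessBlockConfiguration=PUBLIC_ACCESS_BLOCK,
    )


# Example call (hypothetical bucket, not run here):
# lock_down_bucket("example-jamcovid-uploads")
```

With this configuration in place, an accidentally public ACL or bucket policy would be rejected or ignored rather than silently exposing every stored object.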

Web Interface Showing How Easy To Configure Permissions on AWS S3

Web Interface Showing Warnings When Configured as Public Access

The second reported security vulnerability revealed that private keys and passwords for the JamCovid App and website were exposed through a file that had been left open and accessible on the website. Again based on TechCrunch’s reporting, the third security lapse involved quarantine orders being publicly accessible from the JamCovid website, as they were also not protected with a password.

But …. Was There a Breach?

Based on all accounts, TechCrunch, through its very public incident report, merely brought the security vulnerabilities and possible data exposure to the attention of the government and its contractor. There is currently no indication that the data was downloaded, stored or processed by any unauthorized person. The Ministry of National Security also released a statement saying that it found no evidence that the vulnerabilities were exploited for malicious data extraction:

The issue is that finding no evidence of a breach does not prove that a breach did not in fact occur.

The Amazon S3 service allows developers to enable logging on any bucket they create, so the Amber Group and its developers would be able to see from the logs whether the exposed server was accessed, if logging was enabled.

See the guide from AWS to Enabling Amazon S3 server access logging at https://docs.aws.amazon.com/AmazonS3/latest/userguide/enable-server-access-logging.html

Settings under AWS S3 Bucket To Enable Logging

Having server access logging enabled could show how many times, and from which IP addresses, the data was accessed. Unfortunately, it is unclear whether the Amber Group had logging enabled. If it did not, it would be difficult, if not impossible, to determine conclusively whether a breach did in fact occur and how many times the data may have been accessed, downloaded or otherwise processed by unauthorized persons. The fact remains: the Amazon S3 bucket used to store the JamCovid data was configured as “public” and could have been accessed by nefarious individuals.
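For completeness, enabling server access logging is itself a one-call operation, but it must be in place before an incident for any access trail to exist. The sketch below, again using boto3 with hypothetical bucket names, shows the shape of that configuration; it is an illustration of the API, not of the JamCovid deployment.

```python
# Minimal sketch (hypothetical bucket names); assumes boto3 is installed
# and valid AWS credentials are available when actually run. Log records
# only exist for requests made *after* logging is enabled.

LOGGING_STATUS = {
    "LoggingEnabled": {
        "TargetBucket": "example-jamcovid-access-logs",  # hypothetical log bucket
        "TargetPrefix": "jamcovid-uploads/",             # hypothetical key prefix
    }
}


def enable_access_logging(bucket_name: str) -> None:
    """Turn on server access logging for a bucket, writing each request
    record (requester IP, time, operation, key) to the target bucket."""
    import boto3  # imported here so the module-level dict stays pure Python

    s3 = boto3.client("s3")
    s3.put_bucket_logging(
        Bucket=bucket_name,
        BucketLoggingStatus=LOGGING_STATUS,
    )


# Example call (hypothetical bucket, not run here):
# enable_access_logging("example-jamcovid-uploads")
```

Each delivered log record includes the requester’s IP address and the operation performed, which is exactly the evidence that would be needed to establish whether, and how often, the exposed data was actually accessed.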

What about the Travellers/Data Subjects?

Notwithstanding the absence of evidence of a breach, data subjects’ right to transparency should effectively grant them the right to be notified of any major operational failure that puts their personal information at risk of exploitation. Implicit in this right is the opportunity for data subjects to mitigate any loss or damage which may result from the failure of the Amber Group, and by extension the GoJ, to handle their data properly and securely.

While no data breach has been reported or confirmed, it is clear that personal data was being processed and, at the very least, the government should have adhered to its own data protection standards set out in the Data Protection Act 2020 (inspired by industry standards set out in the GDPR).

Global Implications?

The multiple security vulnerabilities reported by TechCrunch in the span of eight days may have major implications for Jamaica, our businesses and our technology sector, locally and internationally.

The inelegant technical management of the JamCovid App by the Amber Group, and the way in which this security incident is handled, may have long-term effects on the development of our digital economy. Under the GDPR, international data transfers are regulated such that, where a company in Jamaica seeks to do business with a company within the European Economic Area, any transfer of personal data from the EU to Jamaica is subject to one of three conditions, the broadest being that Jamaica ensures “an adequate level of protection for personal data as determined by the European Commission”. Needless to say, Jamaica presently does not meet that criterion and has not (to the best of my knowledge) been earmarked as one of the countries that ensures an adequate level of protection for personal data. Arguably, two weeks ago we were closer to reaching that mark: with last year’s passing of the Data Protection Act, we were closer to demonstrating adequacy through the implementation of data protection legislation and regulation that meet the GDPR’s standards. The ongoing security issue with the JamCovid App, however, may be seen as a setback. In a very public way, the Amber Group has demonstrated, through a lack of proper cybersecurity safeguards and a failure to implement proper data protection guidelines and best practices, that the rights of data subjects are not being prioritised or recognised.

To meet the GDPR's requirement for adequacy, local businesses and companies will therefore have to continue to build and maintain their own strict self-regulated safeguards that afford data subjects legal remedies, or fit within the very specific situations that satisfy the criteria for conducting trans-border data transfers with the EU.

The Amber Group

If the content of the TechCrunch articles is to be accepted as true, it is clear that the Amber Group has failed to perform and respond adequately on several fronts:

  1. No evidence that a thorough Data Privacy Impact Assessment was conducted prior to the deployment of the App or thereafter.
  2. No evidence that a thorough IT Security Assessment was conducted prior to the deployment of the App or thereafter.
  3. No clarity on the number of affected data subjects and lack of evidence on the proper notification of data subjects.
  4. An outdated privacy policy which shows the date of drafting as “2018”, two years prior to the development and deployment of the application and website (in case of removal, see the screenshot in the gallery above).
  5. No evidence that a Data Breach Incident Plan was in place.
  6. A reactionary approach to securing information and protecting personal data; as opposed to the proactive approach of engineering a program that incorporates privacy by design.

Conclusion

No system is 100% secure. Even investing in all the cyber-security and data privacy resources on earth cannot prevent an application or website from having vulnerabilities. What is important is ensuring that data controllers and data processors work proactively to embed security and privacy into every step of the design, engineering, development, deployment and operation of IT systems, networked infrastructures, and business practices.

Errors and oversights can happen; however, data controllers (and processors) must work actively to create a framework that fundamentally respects the rights of data subjects and effectively manages data privacy and security at all levels. This requires a framework that meets legal compliance requirements while meeting the expectations of business clients and customers, and simultaneously reducing the risk of security incidents and data breaches.

At all levels, staff and contractors need to be adequately informed of the organisation's security practices and privacy policies, with constant monitoring of activities to control, manage and report any risks and vulnerabilities associated with security and privacy management. Incident reports, such as those provided by TechCrunch, are not attacks but an opportunity to mitigate risk and build a more robust infrastructure and system. Having a clear plan to respond to unfortunate public events and incidents is core and critical to effective data privacy management.

“If you fail to plan, you are planning to fail”.

Benjamin Franklin

