Criminal Law and AI Ethics: What Should Boards Know?

The Intersection of Law, Ethics, and Innovation

In a recent feature article by Dr. Stefano Filletti—Managing Partner at Filletti & Filletti Advocates and Head of Criminal Law at the University of Malta—published in SiGMA Magazine Issue 34 and distributed at this month’s SiGMA Europe-Mediterranean Summit, the spotlight is on how AI ethics and criminal law converge at the iGaming boardroom level.

Dr. Filletti argues that directors must show continuous diligence to protect customers, safeguard their organizations, and ensure long-term growth. While AI now shapes customer journeys, risk scoring, and product development, courts continue to evaluate corporate responsibility based on tangible, documented oversight rather than subjective intent.

His message is unequivocal:

“Due diligence is not passive—it is an ongoing, proactive process.”

Structured records, model updates, and formal briefings are essential to demonstrate lawful governance in fast-paced, iterative environments.
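
One way to make that diligence tangible is an append-only record of model changes and board briefings. The following is a minimal sketch, assuming a simple JSON-lines log; the field names and file path are illustrative and not drawn from the article.

```python
# Minimal sketch of an append-only governance log (illustrative field names).
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class GovernanceRecord:
    event: str            # e.g. "model_update", "board_briefing"
    model_version: str    # version of the AI system being documented
    summary: str          # what changed and why
    approved_by: str      # accountable officer or committee
    timestamp: str = ""

def append_record(record: GovernanceRecord, path: str = "governance_log.jsonl") -> None:
    """Write one record per line so entries are added, never edited in place."""
    record.timestamp = datetime.now(timezone.utc).isoformat()
    with open(path, "a", encoding="utf-8") as log:
        log.write(json.dumps(asdict(record)) + "\n")

append_record(GovernanceRecord(
    event="model_update",
    model_version="risk-scoring-2.4",
    summary="Retrained deposit risk model; thresholds reviewed by compliance.",
    approved_by="AI Oversight Committee",
))
```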


Structured Oversight Under Criminal Law

Under Maltese law, companies can be held criminally liable for offenses committed in their interest, and executives may face vicarious liability. According to Article 121D of the Criminal Code, once a company is charged, the burden shifts to its officers, who must prove both that they were unaware of the offense and that they exercised comprehensive supervision.

This makes board-level oversight critical. Meeting minutes, audit trails, and independent verification become decisive in court. As Dr. Filletti stresses:

“Turning a blind eye is not a defense.”

Written workflows, incident logs, and real-time dashboards allow boards to pre-empt prosecutorial scrutiny with clear evidence of diligence.


AI Ethics in Practice

Governance must keep pace with agile development. Pre-release AI briefings should align with iteration cycles, with risks flagged early through registries reviewed by legal, compliance, and technical teams.
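
A risk registry of this kind could be as simple as a structured entry that is only treated as cleared once legal, compliance, and technical reviewers have all signed off. The sketch below is hypothetical; the roles, severity labels, and field names are assumptions for illustration.

```python
# Illustrative risk-register entry tied to a release cycle (assumed structure).
from dataclasses import dataclass, field

@dataclass
class RiskEntry:
    release: str                                   # iteration the risk was raised against
    description: str                               # plain-language statement of the risk
    severity: str                                  # e.g. "low" / "medium" / "high"
    sign_offs: dict = field(default_factory=dict)  # role -> reviewer name

    def is_cleared(self) -> bool:
        """A risk is cleared only when all three functions have signed off."""
        return all(role in self.sign_offs for role in ("legal", "compliance", "technical"))

entry = RiskEntry(
    release="sprint-42",
    description="New bonus-recommendation model may nudge at-risk players.",
    severity="high",
)
entry.sign_offs["compliance"] = "J. Borg"
print(entry.is_cleared())  # False until legal and technical also review
```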

Dr. Filletti maintains that “growth and safety are allies, not opponents.” He calls for real-time monitoring instead of quarterly audits, and board-level visibility into anomalies—such as unexplained deposits or sudden behavioral shifts among players.
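
As a rough illustration of the kind of anomaly flag such monitoring might surface, the sketch below marks a deposit as anomalous when it deviates sharply from a player's recent history. The z-score rule, minimum history, and threshold are assumptions, not part of Dr. Filletti's article.

```python
# Hedged sketch of a deposit-anomaly flag for a board-level dashboard.
from statistics import mean, stdev

def flag_deposit(history: list, new_deposit: float, z_threshold: float = 3.0) -> bool:
    """Return True when the new deposit is an outlier versus recent deposits."""
    if len(history) < 5:
        return False  # too little history to judge; route to manual review instead
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return new_deposit != mu
    return abs(new_deposit - mu) / sigma > z_threshold

recent = [50.0, 60.0, 55.0, 45.0, 65.0, 50.0]
print(flag_deposit(recent, 5000.0))  # True: unexplained spike escalated for review
print(flag_deposit(recent, 58.0))    # False: within normal behaviour
```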

Dynamic compliance, reinforced through scenario workshops and stress-testing, ensures preparedness beyond paper policies.


Criminal Risk and Technological Tools

Tools such as poker assistance software or advisory platforms heighten regulatory concerns, especially where money laundering or unfair advantage risks arise. Dr. Filletti advocates stronger legal frameworks but warns that regulation lags innovation, leaving boards responsible for risk mapping, data segregation, and full documentation of assessments.

Leadership, he argues, must review every system integration and invest in monitoring before facing prosecutorial pressure:

“Inaction is not an option.”


Evidence, Presumptions, and Prevention

Vicarious liability shifts courtroom dynamics—once misconduct is proven, directors must demonstrate both ignorance and sufficient oversight. Dr. Filletti sees this as consistent with fair trial principles, provided companies meet today’s higher standards:

  • Real-time dashboards for high-risk decisions
  • Regular workshops addressing emerging threats
  • Dedicated compliance staff overseeing algorithms, thresholds, and audit checklists accessible to the board

As he concludes:

“Compliance is a dynamic discipline. Ongoing oversight protects innovation more effectively than litigation after the fact.”
