0:00
Hi, my name is Amalia Barthel. I'm an advisor, consultant and educator in the areas of digital risk, digital data risk, privacy, compliance and governance. This talk is called "How Digital Risk Is Regulated and Applicable Laws." In this talk, the audience will make the connection between digital risk and the other types of organizational risk, and learn how cybersecurity, privacy and, very soon, artificial intelligence are regulated, and what laws apply in Europe as well as in the United States, Canada and Australia.
0:38
There are a number of laws and regulations emerging in the strongest markets. In no particular order, let's start with the European Union's Digital Services Act. By now, we assume you are familiar with the General Data Protection Regulation in the European Union. Well, the European Union, the European Commission and other bodies have been busy: they found it necessary to put some rules in place for the post-pandemic, highly connected world.
1:09
What is the Digital Services Act? It is an act meant to protect children and young people online. Why was such an act necessary? Before we answer, you need to know that there are networks of regulators who meet regularly, and there are other forums that discuss the global impact of many aspects of our world, including the Internet. At this moment there is a bill here in Canada, aimed at making the internet safer, that addresses the safety of children and youth online. But coming back to the Digital Services Act, or DSA, what is it trying to achieve?

One, identifying online risks for minors: harassment, bullying, false information, illegal content and fraud.

Two, asking those who provide products and services directed at this young segment of the population to conduct risk assessments and reduce those risks. Just as we have age ratings for films in the cinema, some online content and services are not appropriate for younger age groups. Platforms must therefore put measures in place to mitigate these risks, including, as appropriate for each platform, parental controls: settings that help parents and carers monitor or limit children's access to the Internet and protect them from online risks and inappropriate content. Another aspect is age verification. This has been a very thorny issue for privacy laws around the world. The United States and Canada list various ages at which a minor may take control of their personal information, and that creates a wide opportunity for ill-intentioned actors to target the gap between different laws. In addition, no method has been established and agreed upon for verifying age without infringing even more on children's safety and protection, that is, while keeping them anonymous. As such, a system to check the age of users before they access a service, for instance based on physical identifiers or other forms of identification, either doesn't currently exist or is not standardized.

Three, tools to help young people signal abuse or get support.

Four, child-friendly complaints and reporting systems. It is important that the platform can act on content that could affect people's rights, such as dignity, privacy and freedom of expression. All of these are not just privacy and self-determination risks but digital risks. The DSA, the Digital Services Act, wants it to be easy for its users, including minors, to report and complain when they discover illegal or other content that should not be online. Platforms should also act quickly when trusted flaggers report content they consider illegal or against the platform's terms and conditions.

Five, personal data privacy. We all have the right to privacy and to keep our personal information safe. The personal data we share must be protected: it cannot be manipulated or reshared without our knowledge, and people cannot spy on us. Additionally, according to the DSA, online platforms used by children should protect the privacy and security of their users, for example with special privacy and security settings by default. There is also child-friendly information: terms and conditions must be written and updated in a way that is easy to understand for everyone, including minors. Online services used by minors must make an extra effort to explain things clearly so young users can understand what they're agreeing to, and that is in Article 14.
The Digital Services Act, in Article 39, requires very large online platforms, which you will also see referred to by the acronym VLOP, to keep information safe and to guard against particular risks in the dissemination of illegal content and societal harms.
