Our digital world is creating more data than ever before.

And according to findings from Norton Rose Fulbright’s 2024 Annual Litigation Trends Survey, as data volumes grow exponentially, so do cybersecurity and data privacy concerns. 

From our survey:

  • 40% of organizations saw their cybersecurity dispute exposure grow in 2023 – more than any other area – as cyberattacks reached record levels and the patchwork of data protection regulations grew in complexity.
  • Cybersecurity, data protection and data privacy also top the list of litigation concerns in the year ahead amid the challenges posed by data management and AI.

In episode two of our miniseries, hosts Ted Brook, Partner, and Erin Brown, Senior Associate, are joined by the Canadian Co-Heads of Cybersecurity and Data Privacy at Norton Rose Fulbright: Imran Ahmad and John Cassell. Together, they dissect the nuances of cybersecurity, data privacy and emerging policies around AI as the dramatic growth of generative AI tools like ChatGPT outpaces existing legal frameworks. Now more than ever, organizations need to remain proactive in order to mitigate risks and take advantage of future opportunities.

Download the full 2024 Annual Litigation Trends Survey.

This episode is accredited 0.62 substantive hours in Ontario and 0.5 substantive hours in British Columbia.

Listen and subscribe to the Disputed podcast on:
   Spotify | Apple Podcasts


Transcript

[00:00:02] Ted Brook Welcome to Disputed, a Norton Rose Fulbright podcast. I'm Ted Brook.

 

[00:00:08] Erin Brown And I'm Erin Brown.

 

[00:00:09] Ted Brook We're excited to be hosting this special miniseries to unpack the findings of our firm's annual Litigation Trends report. For 19 years, our firm's research has tracked the changes and trends defining the litigation landscape, from dispute types and exposure to litigation preparedness and in-house legal staffing, by surveying legal professionals at organizations of all sizes across key commercial sectors in Canada and the US.

 

[00:00:37] Erin Brown In Disputed, we'll explore the emerging trends and insights concerning the litigation challenges industry leaders are facing in 2024. Hope you enjoy. Welcome to the podcast. We're so happy to have you today. So today we'll be speaking with John Cassell, who is a partner in our Calgary office, and Imran Ahmad, who is a partner in our Toronto and Montreal offices. John and Imran are the co-heads of information governance, privacy and cybersecurity at Norton Rose Fulbright, and Imran is the head of technology. Welcome.

 

[00:01:19] Imran Ahmad Thanks for having us.

 

[00:01:20] John Cassell Thank you.

 

[00:01:21] Ted Brook Yeah. So I guess with both of you here, we have one head of national information governance, privacy and cybersecurity. Right? The two of you together. It's like Transformers. You come together and create this ultimate head of the team.

 

[00:01:36] Imran Ahmad Although John is the vast majority of that. So he's the brains behind the whole practice. I'm a small minority shareholder.

 

[00:01:44] John Cassell No, you're too modest. Power in numbers. Yeah, absolutely. I think we're here multiplied in strength.

 

[00:01:49] Ted Brook And we need you to be, because we're talking about what I think is probably the highlight of the litigation trends survey. It's clear from the very first page that what's concerning in-house counsel in Canada and the US the most is an uptick in litigation relating to cybersecurity, data governance, information security.

 

[00:02:14] Erin Brown Before we get into too much, your first intro line has raised my first question. So we hear a lot on cybersecurity, cyber breaches, data privacy, data protection, AI, generative AI. Can you give us a quick sort of overview, a lay of the land? What does each of those concepts really mean, and how do those things relate to each other and overlap, or not overlap?

 

[00:02:40] Imran Ahmad It's a lot to unpack when we're talking to our clients. Really, it comes down to digital risk. Everybody has a digital footprint these days. Gone are the days, although it does happen once in a while, when clients are entirely paper based. Everything is digital these days. Just by way of background for the audience: I started out as a tech lawyer. I was not doing cybersecurity, I was not doing AI, I was not doing privacy either. But over the last 20 years or so, those areas have evolved and grown significantly. So how have they grown exactly? Tech always changes; there's always something new coming up. A few years ago it was blockchain, there was the metaverse, and now we're talking a lot about AI. But cyber is unique. I think it stands on its own because it is one of those major risks that companies are facing that has a reputational, operational, financial and, for us lawyers, legal and regulatory impact. And that can literally impact clients' operations short term, but longer term goodwill and reputation as well. It's our job either as breach coach to guide them, help them navigate and be their trusted advisor through that very difficult situation, or in a more ideal scenario, if you have not had a breach, talking to the board, talking to leadership and making sure that they understand how they can mitigate that risk on the front end. So cyber is pretty unique. Adjacent to that, but slightly different, we often have to advise on privacy-related issues, because you have employee information and consumer information and so on. And that's where it becomes really tricky, because the enforcement environment for privacy, not just here in Canada provincially, but globally as well, has increased significantly. I know we'll get into the legislative developments, because it's really a game changer from a Canadian perspective. But on AI, I'll just share a story with the audience. I think it's really worthwhile.
In the last 12 to 14 months, there's been this storyline. Generative AI has been around for a while, if people know about it; AI has been there for decades, with research going on over time. But in November of 2022, you had ChatGPT launch. Very few people even tracked it. And there was a first phase where everybody was super excited. It was going to change and revolutionize how we do things. And to a large extent, that was a lot of hype, but there was a lot of reality to it too. And then in the spring of last year, almost a year to the day, you had a few leading seminal pioneers in AI research who came out and said, hold on a second. This could turn out to be like the Terminator and Skynet, and the singularity will come out and destroy humanity. We need to put a pause on how quickly this is developing. And a lot of people freaked out. The average consumer or individual who's looking at AI is like, where are we going with this? And that prompted this third phase we're at right now, which is regulatory development. Let's build a box around this. Is there a framework? So we'll get into all three of these, but they're all interconnected, because the lifeline in all of this is the data: data in cybersecurity, data in privacy, and data in the AI component.

 

[00:05:36] Ted Brook Yeah. And when you say that, Imran, it's really interesting, and something I want to circle back on is what you said: AI is the real deal. And I want to talk to you a little bit about that. So you have cybersecurity, you have data protection, you have privacy. Is AI sort of an element of all three, amplifying some of the concerns and risks in each of those areas? Or do you see it as a standalone area now that clients are thinking about?

 

[00:06:02] John Cassell Ted, you know, I think it connects all of those various areas. AI will touch on privacy in terms of how it impacts personal information. Is AI using individuals' personal information to make automated decisions, as an example? Information governance often refers to data more broadly than personal information, all types of data, so those same sorts of questions apply to commercial data, intellectual property. And then of course it impacts cybersecurity as well. An example that Imran and I see is that the attacks are more devastating now. AI is now being used by threat actor groups to cause more damaging incidents and attacks. The phishing emails are more convincing.

 

[00:06:48] Ted Brook This reminds me of when I first started at Norton Rose. I got a phishing email that seemed to come from, I think it was Andrea Brewer, who is in the corporate department, not part of the team that I worked on, but she asked me if I could run a quick errand for her. And it was one of those really rudimentary phishing attacks where they ask you to do something, you say, "Yeah, sure, I can do that," and then it's, "Okay. Can you run to the store and buy me like 100 hundred-dollar gift cards?" Right. And then it's, well, that sounds odd. I called her, I said, "What's going on?" And she said, "I don't know, I don't know what you're talking about." But I feel like now, with AI being able to replicate people's voices, that sort of first defense against a targeted phishing scheme could ...

 

[00:07:29] Imran Ahmad Ted, it's not ...

 

[00:07:30] Ted Brook Just how things get more complicated.

 

[00:07:31] Imran Ahmad Hypothetical, what you're saying. It's happening already. If listeners are interested, they can go on the website of Quebec's securities regulator, the AMF, the Autorité des marchés financiers, and they actually put a bulletin out on the use of deepfakes. A deepfake could be a picture or it could be somebody's voice, but essentially what it does is capture your imprint. And the interesting part of generative AI is, when you're talking, for example, Ted, like right now, there's a spectrum in which your voice operates, a very unique digital footprint that you have. What generative AI does is try to take a little bit of that and create a whole conversation; your imprint can be a bit larger at that point, but to the average listener, you can't tell the difference. Remember a few months ago, you heard, oh, there's a new song that generative AI made that is in Drake's voice. What happens is, you know, what the AMF reported, and there are others as well, are reports about companies having had 2.5 million diverted, despite having put a lot of effort into exactly the protocol that you described. Pick up the phone, call someone and figure things out. They will send that phishing email, which we're all accustomed to. It will be a bit more sophisticated than the Nigerian prince type of email. And then once you look at it, you go, well, I know what the protocol is. I need to validate if this makes sense or not. Let me pick up the phone. But what they'll say in the email is, I'm traveling, I'll call you. And they'll call after hours, when they know the person is not going to pick up. And then they leave a message saying, hey, it's Jane Doe or John Doe authorizing the transaction in the email that was just sent to you a few hours ago. They'll give you the timestamp and the description so it looks really legit. And the average controller, CFO or financial person will say, well, look, I got the authorization.
I've got a written authorization that's coming from a legitimate-looking email. I've done what I had to do. And lo and behold, they're out of money in that situation. So this is happening now.

 

[00:09:17] Erin Brown Okay. So this is giving us, I think, a really good sense of what's going on in this complicated, evolving area. But let's bring it back to disputes for a minute. 40 percent of organizations surveyed in the Litigation Trends survey saw their cybersecurity dispute exposure grow in 2023, and are predicting that cybersecurity is going to be a huge factor moving forward into 2024. So bringing it back to sort of the legal landscape and the disputes that companies are facing as a result of these things. John, can you give us some examples of the types of disputes that you're seeing in the sort of cybersecurity data privacy sector?

 

[00:09:56] John Cassell Yeah, you know, unfortunately, due to the nature of cyber incidents, there are disputes arising out of breaches. And that's really the overarching goal of a lot of our clients: to avoid disputes and mitigate risk. That's the number one goal coming out of incidents. So some of the major disputes that come out of data breaches and cybersecurity issues are, potentially, class action risk, obviously more prevalent in the United States than in Canada. But as an example, where an organization has had a large-scale data breach that's resulted in the compromise of a lot of data, personal information or otherwise, plaintiff class members can band together and commence a claim against the organization. The basis for that may be various different torts, negligence, or new emerging torts in Canada like the tort of intrusion upon seclusion. So it'll be based on some legal basis, really with the claim that you have breached my personal information and I've suffered damages. That's always top of mind for clients as the most threatening and damaging type of dispute. There was a very important trilogy of cases released in Ontario this past summer that really reframed and narrowed the opportunity for successful class action plaintiff lawsuits arising out of a data breach. Those were really important cases. They really narrowed the application of this new tort of intrusion upon seclusion, when it can be used, and then, importantly, reaffirmed that plaintiffs, individuals who have had their data breached, really have to suffer damage.

 

[00:11:37] Ted Brook With this sort of data-compromise type of litigation, I can see how that is top of mind for companies. But I was wondering, even if the avenue for class actions in that space in Ontario is narrowing, are there other types of lawsuits that are becoming maybe more common, whether in the US or Canada? I'm thinking like failure to disclose a breach, or representations made regarding certain protections or infrastructure that was in place, that perhaps become litigated in the securities markets.

 

[00:12:14] John Cassell Great question. And you've hit on an emerging trend, particularly in Canada. Cyber to date has really been folded into ESG disclosure, reporting and financial statements. So companies are increasingly having to disclose their cybersecurity risks and how they mitigate them, and set that out in their public disclosures. We haven't yet seen that translate into lawsuits in Canada for misrepresenting your security posture in your financial statements or otherwise, but it's likely on the horizon. We're watching very closely whether that turns into representative actions, derivative actions or those types of disputes. I would suggest that's on the horizon. That's the next big type of cybersecurity-related dispute, and we're waiting to see if it manifests itself in Canada.

 

[00:13:14] Imran Ahmad I will say this much on the trends study. I know where most of the folks are coming from in the last 18 months: the B2B relationship is key. So if you're a company, Acme Co, and downstream you've got five or six different vendors, and one of those vendors is processing your data and they experience a breach that impacts your data, from a privacy standpoint, if you're talking about personal information, you have liability there. But that third-party vendor may not share as much information, or at the cadence or with the granularity that you want. And that's where we see a lot of challenges. So beyond the privacy litigation, privacy regulatory or other types of actions, there's the B2B one. Because there's a business relationship, it's not just a pure legal contractual dispute; there's a business relationship that, generally speaking, parties want to solve somehow. But they also need to get comfort: can we continue doing business with you, and are you going to make us whole as a result of the breach you suffered with our data? And that's where we see a lot of difficulties, because each client has a different threshold, each client has a different level of sophistication when it comes to cybersecurity, and sometimes it can be very challenging to manage those expectations.

 

[00:14:26] Ted Brook I'm really glad you brought that up, Imran. It struck me that third-party vendors are a bit of a boogeyman in the report. Like, there's a quote from a telecom AGC who says this is where the real cyber risks are for us, because we use a vendor who uses a vendor who uses some other vendor, and eventually someone down the chain isn't as sophisticated as us or the people in our immediate circle. And so what do you do, if you're an enterprise who's relying on outside vendors to store, transfer and analyze data, to protect yourself and your customers all the way down the line?

 

[00:15:08] John Cassell To answer your question, Ted, it's really: screen the risk at the outset and manage the risk throughout the relationship. And what I mean by that is, we help a lot of our clients set up a vendor onboarding process. When you're onboarding a vendor who would have access to all of your data or your customer data, as an example, that is a lot of risk, a significant amount of risk, to you and your customers. Are you screening that particular vendor for their cybersecurity safeguards at the outset? We highly recommend that they do that: set up detailed questionnaires and really do your due diligence on that vendor before you enter into any type of agreement with them. You get a score; you screen your vendor on the risk and how much data they have. Really, in a lot of cases, that flows back to the contract. Depending on the risk that you've assessed, does your contract adequately reflect the risk? Do you have contractual levers that you've built into your contracts with those vendors that allow you to, you know, impose security safeguards on them? In the event of a breach, what happens? So it's really a full risk matrix when looking at your vendors. But it's a big issue. A lot of our clients have very robust cybersecurity programs, but they also recognize that where they have a bit of a blind spot, or what can be difficult to manage, is third-party vendors. We're getting notices from our third-party vendors multiple times a week that they've had an incident. They're difficult to manage. How can we manage this risk? And it's not an easy answer.

 

[00:16:44] Erin Brown It seems to me, from what you're saying, John, that one of the main ways to minimize risk in this space is to be proactive and try to stay ahead of the curve, have good policies and procedures in place, and have people like you and Imran ready to call up if need be. Is that a fair assessment of sort of how to manage risks?

 

[00:17:03] John Cassell That's absolutely right. We always recommend being proactive. There's a number of things that organizations can do again to get ahead of those risks and be prepared.

 

[00:17:12] Imran Ahmad John, you're right on point there. And Erin, look, proactive preparation. What does that mean? It's not just a policy, or just a protocol, or just a list or a checklist that you want to have. You want to actually test it. The protocols and the plans are great; that's the baseline. But testing them out, stress testing them year over year, identifying the gaps, improving them. It's constantly updating the protocols and the policies.

 

[00:17:37] Ted Brook Yeah. Staying ahead. And the sort of expectation now is a customized plan. Like, that's the table stakes now. I want to just shift to a slightly different topic. But it was something that jumped out at me in the report: one of the respondents said, surprisingly, maybe. I don't know if you would disagree with this, but he says, "I don't consider the retention of data something that's actually going to cause litigation, but I do think it's going to make any litigation I have more painful and more expensive, because we have a bigger volume of data that we're going to have to sift through." So I'd be curious to get your thoughts, John or Imran, on that sentiment, both from the perspective of what massive data is doing to in-house counsel and their jobs, but also this thought that, as long as I have controls in place, I don't think that's going to cause litigation; I'm worried about how it impacts my other files.

 

[00:18:31] Erin Brown And to add to that, the first part of your question maybe shows us one area in which data and AI can really overlap, because one of the ways to manage having a lot of data in a file is to use AI to help you sort through it.

 

[00:18:47] Imran Ahmad I'm generally a very positive-outlook kind of person, but I don't think anyone can say, my house is so secure, nobody can ever break in. Never gonna happen.

 

[00:18:59] Erin Brown What did they say about the Titanic? Right?

 

[00:19:02] Imran Ahmad Right. Blanket comments are always dangerous. And I will say, shifting slightly to privacy. But first of all, a cyber attack is going to happen no matter what. There are opportunistic attackers, who will take advantage of the point of least resistance to get in. And then you've got motivated attackers. Often we use the term APTs, or advanced persistent threats. They could be government funded, they could be hacktivist groups; there's a variety of them out there. They say, you are my target, and they will find their way in. You only need to make one mistake; they'll get in. And it doesn't matter who you are. What John and I try to work on with our clients and our team is making sure they have a resilience where they can respond quickly. It's how you respond to a breach that you're evaluated on. Ted, you do class actions, you defend clients: it's not the fact that you had a breach that's going to be problematic, it's how you responded to it. Did you act in a methodical, thoughtful way, quickly, to meet your legal requirements or other requirements that would have been triggered at that point? That's the key. So from a purely cyber-risk perspective, the risk is always going to be there. I don't believe there's no risk, or that you can cover and control the data in a manner that will never be compromised. It's not true. But the second piece of the question is, you know, over-collection or data retention. If you look at some of the cases that are out there, Casino Rama being one, they were praised by the court in terms of how they responded. But when you look at the commissioner's report, there were questions as to why they were retaining certain pieces of information longer than what it was collected for in the first place. I think data retention is a key area.
We're seeing a lot of it with Law 25 in Quebec that came in, and the future C-27, whatever comes in. Clients are going to be focused much more on reducing their digital footprint.

 

[00:20:46] Ted Brook I know, Imran, I've gotten into some back and forth, some spirited debates, with privacy lawyers on your team who want to put in a policy to destroy data, where from my perspective, as a litigator, I want to keep relevant data because we might have production obligations down the line. Right? So we were balancing those two things.

 

[00:21:05] Imran Ahmad Well, you just said it. The key word in what you just said was relevant data. You didn't say all data. And a lot of times clients cannot discriminate; that's our tendency. I remember years ago I worked for this client, in the not-for-profit space, and it was a fascinating conversation. They said, look, we're collecting a ton of data. We want to follow the lifecycle from when the person starts volunteering with us, when they have no money because they're young kids or teenagers, to the time they graduate, when they have limited funds and are giving us 20 bucks or whatever, to their career pinnacle where they're making the most money, and then when they retire, maybe they want to give us a legacy donation down the road when they pass away. So we want to have the full lifecycle of that. Fact of the matter is, if you're not disclosing that collection to the person in the first place, are you able to do it? Is it appropriate? Just because you can, doesn't mean you should. And so those are the issues that come up between a privacy law compliance standpoint and what you can do technologically, which is quite significant.

 

[00:22:00] Ted Brook Interesting. John, what do you think about that? The sort of tension between keeping data for AI purposes or for others, versus the risk that it carries?

 

[00:22:09] John Cassell I agree with everything Imran said. And it has exponentially increased the risks associated with a data breach. So we see that very clearly when we're assisting clients with breaches. Now, the amount of data involved in breaches today, compared to even three or four years ago, is exponentially greater. And unfortunately, as Imran indicated, sometimes it's almost like skeleton-in-the-closet type of data. We see all types of data; they've kept everything. And unfortunately, that is the type of data that has the most exposure and could result in litigation: that old data that should have been purged, former employees' confidential information. That's the most difficult. And we see that in virtually every single breach we deal with now.

 

[00:22:57] Erin Brown It's interesting that this conversation is not at all specifically linked to companies. I'm literally thinking of myself and my photo album on my phone, and how I have 57 pictures of my daughter from, like, the skating rink, because she's adorable, and I have to pay for more and more cloud storage because I just can't find the manpower to go through and delete those photos.

 

[00:23:16] Imran Ahmad That's a great example, Erin. Anyone who's listening to this podcast right now, yourselves included, if you want to do a quick test, go into your personal email account. It could be Gmail, Hotmail, doesn't really matter. The human nature we have is, oh, I may need it at some point down the road, I'll keep it. And I have stuff where I'm like, why do I have this from 20 years ago? Do I need it or don't I? And often we forget the type of information we're sharing readily by email; it is quite significant. We'll take a picture with our phone and send it to our financial advisor or bank, because they need something to validate a KYC thing. Same thing for companies. They collect more than they think they do. John and I, when we're on calls, I kid you not, we're like, okay, you've had an email breach or other type of breach. What kind of information do you think is in there? Well, not super sensitive, maybe names, addresses and telephone numbers. We don't collect a lot. Okay, well, fair enough, that's your initial thought. Here's a document; why don't you and your team look at it and come back to us if anything seems responsive. And they come back like, oh yeah, Imran, we looked at it. We may have some social insurance numbers and some scans of, you know, passports and driver's licenses, and gosh, now we probably have to do a whole e-discovery. The problem with that, from a cyber perspective, going back to what I said earlier, is that you're judged on how you respond. You're in this conundrum now: do you respond quickly, or do you take longer? And here's the pro and con. If you respond quickly, you're going to blanket notify everyone, and the larger your footprint, the larger the population you have to notify. Fine, you get speed, but you increase your risk exposure. If you are more surgical, meaning you do a full doc review, that can take weeks if not months in some cases.
So now you have a group of people who are going to be very unhappy, saying you had this incident on January 1st and I'm only hearing about it in March or April, whatever the time frame. So it's really difficult. And cleaning up the house on the data side makes sense, but it may be difficult or time consuming or costly. But the cost down the road can be quite significant, from a reputation or just from a dollars-and-cents perspective.

 

[00:25:11] Ted Brook Yeah, and those document reviews, like time-sensitive document reviews, it's not like a Google search, typing in "important personal information" and seeing what comes up. They're painful and require man hours. Even with technology-assisted review, you still have to get eyeballs on a lot of those documents in order to train the tools that we have to look for more similar documents. It's a lot of work.

 

[00:25:36] Erin Brown I can understand from this conversation why cybersecurity and AI and data privacy are at the forefront of people's concerns, because there's been so much to unpack on this call in terms of different threads and risks and how they run together. I do want to come back: Imran, you mentioned earlier in our call some legislative updates. So I wanted to make sure we had an opportunity to let you run through any important legislative updates that you think are driving some of this litigation or that are important for people to understand.

 

[00:26:03] Imran Ahmad I'll kick it off with Law 25 and C-27, which are the two privacy-related developments. And then, John, you'll probably touch on C-26, which is on the critical infrastructure side of things for cybersecurity. And then obviously we'll talk about AI afterwards. But look, for the first time in 20 years, it's an exciting time where the entire privacy landscape in Canada has changed materially. So without getting into the specifics of it, in Canada there are broadly 24 different privacy and data protection laws, so it can be quite difficult to navigate. And rare are the instances where a client is operating in just one jurisdiction. So you can have situations where multiple pieces of legislation may apply at the same time to a single client with a single breach. Now, what has happened? If you go back a few years, Quebec adopted something called Bill 64, and that bill turned into Law 25, which came into force in three phases starting in 2022, then '23 and '24. The big zoom-out picture with Law 25 was that we have become much more European in our approach. Our prior privacy laws were very much built on an ombudsman model. This is where you come together, the privacy commissioner, the non-compliant entity and the complainant, and try to figure out a way to get to compliance. And by and large, it was not an enforcement-driven approach; it was an ombudsman model, and it lacked teeth. You fast forward from those initial 20 years, when we adopted this piece of legislation, to when the GDPR is adopted in Europe, where it's a heavy regulatory hammer. If you're not compliant, there are significant fines and you'll be investigated; you better report, so on and so forth. And Canada has something we call adequacy status, meaning you can transfer data back and forth between Europe and Canada without any limitation. In the classic sense, if you were to send data from Europe to the US, there are all kinds of requirements.
You have to enter into agreements and so on. Canada and the Europeans didn't have that, previous to the GDPR, because Canada was deemed to be adequately protecting the information of Europeans. So as a country based on commerce, we needed the adequacy status, which is reviewed every four years. The GDPR comes in, and then we're re-evaluated, and they kind of say, hold on a second.

 

[00:28:11] Ted Brook Oh, so we were at risk of losing that.

 

[00:28:14] Imran Ahmad You may lose adequacy; there has always been that risk of losing it. So in Quebec, Minister LeBel, who was the Minister of Justice at the time, introduced this Bill 64, which became Law 25, very much modelled on the European one. The federal government had initially introduced something called C-11, which died on the order paper with the previous election, but it was then reintroduced as C-27, which has three parts to it, very quickly. One, it would essentially replace PIPEDA. Two, and this is the litigation piece, it would create a private right of action with a specialized tribunal. So they moved away from the commissioner-issues-fines model that Quebec has adopted, toward something a bit more like a tribunal, modelled after the Competition Act regime, for example. A very different model, and a lot of questions about how you can harmonize that across the country with different regimes. And thirdly, they introduced the Artificial Intelligence and Data Act, making Canada one of the first few jurisdictions in the world to have one. That's currently at second reading; it's making its way through Parliament. There's broad agreement amongst practitioners and others I've talked to that it's going to be passed in 2024, probably coming into force in 2025 or later, but it will certainly get passed. So that's one. The other development, and John, maybe you can take this one, is C-26, which was introduced at the same time as C-27 but is quite significant from a cybersecurity standpoint.

 

[00:29:35] John Cassell C-26 is Canada's first real piece of cybersecurity legislation. It is focused on federally regulated critical infrastructure: your banks, airlines, telecoms and other federally regulated industries. It does two things. It amends our existing Telecommunications Act to take account of cybersecurity threats, giving regulators wide-ranging powers to address cybersecurity threats and risks in the telecommunications sector. And it enacts what's called the Critical Cyber Systems Protection Act, or the CCSPA, a new piece of legislation specifically targeted at federally regulated critical infrastructure with a whole host of requirements. A big one is that any entity caught by the scope of the act is required to implement a cybersecurity program within 90 days of enactment, which is very short, and must submit that plan to the regulator, along with any subsequent changes to the plan, every year. So there is significant regulatory oversight of your cybersecurity plan.

 

[00:30:48] Ted Brook And John, is that only if you have an incident, like you get put on probation? Or does it apply to all of those federally regulated entities?

 

[00:30:54] John Cassell All federally regulated entities, as it's currently drafted. So it's a very wide-reaching piece of legislation. Even more significantly, as currently drafted there is a requirement for any entity caught by the act to also mitigate the downstream impacts of cyber risks. So, to go back to our discussion about third-party vendors, the knock-on effects from this piece of legislation will flow through to the supply chains of all these federally regulated entities, and we expect it to have a very significant impact. The contractual provisions we discussed will also flow through, and there will potentially be liability flowing through, a lot of it because of this act, Bill C-26. So we expect it to change the cybersecurity landscape in Canada quite significantly.

 

[00:31:45] Ted Brook Well, look, this has been fascinating, if not a little bit doom and gloom. Is there a positive note we can end this chat on? Is there something you've seen, whether in industry, in the report or elsewhere, that gives you a little optimism about how in-house teams can be more proactive in responding to these risks?

 

[00:32:03] Imran Ahmad There is a positive note to this. We've talked about all the bad things that can happen, but I go back to what I was saying earlier: it's on how you respond that you're going to be judged. It may sound obvious, but the more you invest on the front end to prepare and test, the better the outcome is going to be. It's muscle memory. It's like riding a bike: if you do it every day, you'll know how to ride. But if you've never done it before, even though you have a bike sitting in the garage, you may struggle riding it for the first time. So the key here is to have that muscle memory and reaction, and to remain flexible, because there are no perfect answers. Right? At the end of the day, you're often faced as an organization with bad and worse options, not good and bad. Make sure you have canvassed the right decision points and identified the issues, so that if you do end up in a dispute down the road, litigation, a class action, a regulatory investigation, a B2B dispute or otherwise, you can say, look, there was a logic to what we did. Here's how we proceeded. It was methodical. It was thought through. It was thoughtful.

 

[00:33:01] Erin Brown From what I'm hearing from this discussion, the other good news for our listeners today, and for the general counsel who flagged this as a concern in the survey, is that there are people thinking about this stuff almost 24/7, staying up to date on regulatory changes, on what other companies are doing, on how to stay ahead of these things and how to deal with them when they happen. GCs don't have to face these things alone. There are teams like yours and John's who are there to help if they need it.

 

[00:33:29] Ted Brook Yeah, that's great. John, do you want the final word?

 

[00:33:30] John Cassell I do think the message is positive, to build on your point, Erin, and I think it's a really good one. There's a whole suite of professionals who are ready and willing to assist clients. A client is never alone when handling cybersecurity risk, and it's top of mind, as evidenced by this podcast. Most organizations are well aware of the risks; now it's just a question of being proactive and doing it. It's not an unknown risk, and there's a lot of great work being done in terms of preparing for these risks.

 

[00:34:00] Ted Brook Thank you, John. Thank you, Imran.

 

[00:34:02] Erin Brown Hopefully this episode will help people think through some of those risks and how they can stay on top of them, and not just make people feel more doom and gloom, like Ted was a few minutes ago. This has been really helpful, and we're feeling positive, I think. Thanks to you both.

 

[00:34:16] Imran Ahmad Thanks. Appreciate it.

 

[00:34:18] John Cassell Thank you.

 

[00:34:22] Erin Brown That was a really interesting discussion, Ted.

 

[00:34:24] Ted Brook Yeah, it was good. I still can't get my head around the fact that 40 percent of the organizations that responded to this survey experienced litigation in the cybersecurity, data protection and privacy space. That's up from 33 percent in the previous year. That's massive, right?

 

[00:34:42] Erin Brown It's not surprising, though, based on our conversation and on talking to John and Imran. There's so much happening. There are regulatory developments, and it's all evolving so quickly. Hackers are getting more sophisticated, and at the same time people don't quite know what it is, because it is evolving so quickly, so there's that fear of the unknown. But I think having fantastic people like John and Imran to call when things come up will, I hope, put some of our listeners at ease that there's help when these things happen.

 

[00:35:12] Ted Brook Yeah, it put me more at ease. Still, I feel like I'm a little uneasy about it. But no, I liked talking with them.

 

[00:35:20] Erin Brown I'm uneasy about being a podcast host now that John and Imran have told us that people can use our voices. So, listeners, please do not use our voices to try to scam Ted and me. Please.

 

[00:35:33] Ted Brook Don't turn us into deepfakes. I feel like that's what the producers are going to do, and then they can just have AI do the episodes instead of us.

 

[00:35:41] Erin Brown Yeah, maybe they won't even need us there. There will just be AI hosts. All right, listeners, I hope you stick with us, or with the AI robots that will take over as hosts soon.

 

[00:35:52] Ted Brook Yeah. Next episode. All right.

 

[00:35:55] Erin Brown Stay tuned for that.

 

[00:35:59] Erin Brown Thank you for listening to Undisputed. To learn more and download our 2024 Litigation Trends Survey, visit litigationtrends.com or follow the link in our show notes.

 

[00:36:13] Ted Brook And you can subscribe to Undisputed on Apple Podcasts, Spotify or wherever you get your podcasts, so that you won't miss any of our episodes.

Contacts

Partner
Senior Associate
Partner, Canadian Head of Technology and Canadian Co-Head of Cybersecurity and Data Privacy
Partner, Canadian Co-Head of Cybersecurity and Data Privacy