BLG cyber expert Eric Charleston discusses trends in online crime
Canadian organizations are facing the continued threat of widespread ransomware attacks, a broadening of the cyber-criminal landscape, and growing cybersecurity risks driven by artificial intelligence, says Eric Charleston, co-chair of the cybersecurity group at Borden Ladner Gervais LLP.
Charleston’s group works across Canada and its lawyers do cyber incident response, cybersecurity preparedness, and regulatory compliance.
In Charleston’s team’s cyber incident response practice, ransomware remains pervasive.
“It is everywhere, and it's affecting our clients terribly.”
He says these attacks are typically “double extortion,” involving both encryption of the client’s network and data theft. In the last year, the cyber incident response team has dealt with many third-party ransomware incidents, also known as vendor attacks, in which the attacker targets an entity to which the client has entrusted its data.
“Those third-party vendor breaches have been very prevalent over the last year to 18 months,” says Charleston. “And that's because the bad guys get a lot of bang for their buck when they hit a vendor that manages the data of lots of institutions. It's almost like 30 breaches in one, which may increase the odds of them getting a ransom payment for the decryption keys or a promise to delete the data.”
Charleston’s incident response team also deals with a lot of funds-diversion fraud. In this type of fraud, the hacker gets into an organization’s email system, impersonates a vendor who is owed money, and provides false bank information to the target. The target pays the bill, and when it discovers the fraud, the money is gone. “That's still very, very common,” he says.
In Charleston’s cybersecurity preparedness practice, many clients request help with vendor management, often asking to have their vendor agreements reviewed to ensure they impose the necessary cybersecurity obligations.
“Our clients are asking us: ‘What do we need in our contracts with other parties to make sure that they're being cyber secure, and they are protecting the data we trust them with?’”
For a few years, it has been common to include privacy requirements in vendor agreements that delineate how vendors may use and disclose data. Charleston says the new trend is the inclusion of “mandatory cybersecurity controls” in these agreements, requiring vendors to deploy specific technical components in their cybersecurity stack.
“That is relatively new and happening across all industries.”
He says that mandatory cybersecurity controls are a particular focus for industries that will fall under the scope of the proposed Bill C-26. The legislation, An Act respecting cyber security, amending the Telecommunications Act and making consequential amendments to other Acts, is currently before the Standing Committee on Public Safety and National Security in the House of Commons.
Bill C-26 would impose cybersecurity requirements, including vendor cybersecurity management, on companies involved in critical infrastructure, such as telecoms, airlines, and international pipelines.
“Organizations that find themselves potentially regulated by that law, should it pass, are starting to be proactive about managing cyber risk in their vendor relationships,” says Charleston.
The cybersecurity world, he says, is seeing a broadening of the “threat actor landscape.” In the last 18 months, syndicates of ransomware attackers have allowed outsiders to use their tools and branding to stage attacks, a model known as “ransomware as a service.” What this means for victims, says Charleston, is that they cannot always count on the reliability and professionalism of an established cyber attacker whose conduct throughout the crime will at least be predictable.
When large language models, such as ChatGPT, emerged, there was significant concern that these technologies would be used to rewrite ransomware at a “breakneck pace,” he says. Generative AI is also a “very powerful tool for persuasion.” Charleston has seen a marked increase in the persuasiveness, syntax, grammar, and believability of phishing communications and in the language used during ransomware negotiations.
AI also has the potential to drastically transform impersonation fraud, he says. It can replicate the voice or image of a CEO or CFO, which fraudsters can then use to approve massive payment diversions.
“That's a doomsday scenario where AI can enable this type of fraud, but it's the kind of impersonation fraud that you'll hear AI security researchers discuss as a potential.”