In the early days of Russia’s invasion of Ukraine, malicious online scammers leapt into action across social media, with numerous fake accounts suddenly emerging to capitalize on the ongoing tragedy. Many of these were impersonation accounts, taking on the identity of Ukrainian President Volodymyr Zelenskyy and others in a bid to scam unwitting donors out of money or to spread misinformation.
Eydle — a new startup supported through the USC Viterbi Startup Garage — aims to fight these types of malicious attacks head-on, harnessing a powerful AI platform to stamp out online impersonation, or phishing. Eydle is co-founded by information security expert Ashwini Rao and Birendra Jha, an assistant professor in the USC Viterbi Mork Family Department of Chemical Engineering and Materials Science with a background in AI research.
Phishing is an increasingly prevalent form of online attack in which a malicious actor impersonates a reputable source in order to gain access to information, harvest personal and financial details and defraud victims. Rao and Jha founded Eydle to tackle the growth of impersonation accounts across social media, where malicious users either duplicate official accounts — such as those of financial institutions and high-profile individuals — or take over existing users’ accounts and disguise them as official sources.
Rao and Jha said that their AI analysis had unearthed at least 14 impersonation accounts on Instagram, created between Feb. 25 and March 1, claiming to be the official or private accounts of Volodymyr Zelenskyy.
“All accounts used the picture, bio and posts that impersonated the President. The majority of these accounts had racked up between 500 and 3,500 followers, sometimes within a matter of hours,” Rao said, adding that even some legitimate Zelenskyy “fan” accounts had linked to fraudulent donation sites.
In the case of the Ukraine conflict, these types of attacks make it more difficult for reliable information and humanitarian efforts to break through the noise of false actors. However, Rao said these types of scams and attacks were doubling each year across the board on social media, affecting brands, companies and individuals.
“What generally happens is an attacker or a scammer will go on social media to create an account that looks very similar to, for example, Goldman Sachs,” Rao said. “They copy their logo and bio information and some of the recent posts.”
From there, scam accounts will often communicate with followers to try to extract personal or financial details. Scammers can also impersonate CEOs or other high-ranking individuals within a company, or subsidiary departments that may not have verified checkmarks.
Another tactic involves a scammer infiltrating an individual’s existing account by obtaining their login details. From there, the scammer can change the individual’s profile information to that of a company, such as a bank or financial institution. By harnessing the user’s existing follower base, the account can appear more legitimate, often fooling a social media platform’s own search algorithm, ranking higher in search results and further increasing its reach. Ordinary users may then unwittingly tag the impersonator account instead of the official one, because it is the account that appears in search results.
While these types of attacks typically target financial services companies, Jha and Rao have also been examining how attacks can impact small businesses that have a social media presence. Recently, there has been a proliferation of online attacks targeting psychics and astrologers — small businesses that rely on Instagram and other social media platforms to build their online customer base.
“Scammers will set up a very similar account to send direct messages to the followers of this actual account, offering to do a reading for them or offer online services,” Rao said. “I would say one of the reasons that scammers go after these types of businesses is that people who seek out psychics can often be quite vulnerable.”
For small businesses, trying to manually find and report an ever-growing number of phony accounts may seem like a game of whack-a-mole. Scammers can use software to more easily duplicate accounts, and as soon as fakes are uncovered, new ones can appear. To address this, Jha has harnessed his AI background in developing Eydle’s platform, which uses AI and deep learning to locate these accounts on a broader scale.
“Artificial intelligence has seen exponential growth in visual perception tasks — finding images and detecting images, whether it’s a person’s face or brand logo, you name it,” Jha said. “Human analysts can work on it, but you can’t scale it to 200 million accounts on Instagram. You can’t do it on a daily basis. It has to be automated.”
Eydle’s model runs continuously, seven days a week, analyzing social media accounts for visual similarity to a user’s logos, images and text in order to unearth impersonators quickly. It then offers users the opportunity to report these accounts for trademark or copyright infringement and monitors them until they’re taken down.
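To give a sense of how visual-similarity matching of this kind can work, the sketch below is a minimal, hypothetical illustration — not Eydle’s actual system, which has not been published. It assumes a pre-trained ResNet-50 from torchvision as a generic image-embedding backbone and compares a brand’s official logo against a suspect account’s profile picture by cosine similarity; the file names and the flagging threshold are assumptions for illustration only.

```python
# Hypothetical sketch: flag a profile picture that is visually near-identical
# to an official brand logo, using a generic pre-trained image embedding.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Load a pre-trained backbone and drop the classification head,
# leaving a 2048-dimensional embedding per image.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval()

preprocess = T.Compose([
    T.Resize(256),
    T.CenterCrop(224),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def embed(path: str) -> torch.Tensor:
    """Return a unit-length embedding vector for one image file."""
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        vec = backbone(img).squeeze(0)
    return vec / vec.norm()

# Cosine similarity between the official logo and a suspect profile picture.
# A score near 1.0 suggests a near-duplicate worth flagging for human review.
official = embed("official_logo.png")   # hypothetical file names
suspect = embed("suspect_profile.png")
similarity = torch.dot(official, suspect).item()
print(f"visual similarity: {similarity:.3f}")
if similarity > 0.9:                    # threshold chosen for illustration
    print("flag account for possible impersonation")
```

A production system of the kind the article describes would combine such image signals with text and metadata (bio wording, usernames, recent posts) and run continuously across millions of accounts, rather than comparing single image pairs on demand.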
“Given AI’s promise in image perception and visual perception tasks, I thought it was the perfect combination. We did further research on who else is doing it and what it can offer for possible solutions in this area, and this seemed like a golden opportunity,” Jha said.
Rao said other tools in this field take a more manual approach to social media impersonation, searching for keywords, which can return a series of results that are false positives.
“Just because the name of the business may be mentioned, doesn’t mean it’s an impersonation account. What we intend to do is really make it much more accurate through our visual analysis,” Rao said.
The Eydle team is currently working on sourcing organizations to partner with on their social media security, while also further researching how their platform could be expanded to small businesses and individual users.
“Eventually we want to be a social media security system for everyone, not just for enterprise,” Rao said.
Published on March 31st, 2022
Last updated on April 5th, 2022