DIGITAL LIFE
Exposing how automation apps can spy—and how to detect it
A team of University of Wisconsin-Madison engineers and computer scientists has identified vulnerabilities in popular automation apps that can make it easy for an abuser to stalk individuals, track their cellphone activity, or even control their devices with little risk of detection.
In recent years, tech companies have released an array of automation apps—including native apps like Apple's Shortcuts and Bixby Routines on Samsung phones as well as third-party apps like Tasker and IFTTT—to help simplify digital tasks. Using nontechnical menus, users can "program" the apps to do things like automatically turn down a phone's volume at school or work, sort photos into specific folders, set up routines for smart home devices like lights and thermostats, or launch a specific playlist when the user gets into their car.
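Conceptually, each of these routines is just a trigger paired with one or more actions. Purely as an illustration (the field names and trigger/action strings below are hypothetical, not any vendor's actual format), such a recipe can be modeled in a few lines of Python:

from dataclasses import dataclass, field

@dataclass
class Automation:
    # A hypothetical model of an automation recipe: one trigger, many actions.
    trigger: str                                      # e.g. "arrive_at:school"
    actions: list[str] = field(default_factory=list)  # e.g. ["set_volume:0"]

# Benign recipes of the kind users assemble through the apps' menus:
quiet_at_school = Automation(
    trigger="arrive_at:school",
    actions=["set_volume:0", "enable_do_not_disturb"],
)
car_playlist = Automation(
    trigger="connect_bluetooth:car",
    actions=["open_app:music", "play_playlist:commute"],
)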
On the flip side, an abuser who has access to another person's phone for even a few minutes can set up automation routines that share that person's location or text messages, overload or remotely control the phone, record unauthorized video, or impersonate the owner, among other activities.
Each of these automations acts like a mini-app. However, since they are housed within the larger automation app, phones and tablets do not treat them like individual apps.
In other words, unlike the sounds, badges or banners that alert users when, for example, they receive a text, automations don't trigger notifications when they've been activated or are running. That means malicious automations may go undetected.
To make device access even easier, abusers can find many of these malicious automation shortcuts on social media or other public platforms.
Chatterjee first posed the issue to students in his seminar course, CS 782: Advanced Computer Security and Privacy. Zhang and computer sciences student Jacob Vervelde took on the task, first finding out how perpetrators could exploit automation apps. After the class ended, Zhang continued the investigation as part of her graduate research in Fawaz's lab.
Zhang next surveyed public repositories of shared automations, finding 12,962 automated tasks of all sorts for Apple iOS alone. The researchers then developed an analysis system assisted by an AI large language model (LLM) to detect shortcuts with the potential for abuse. In the end, they found 1,014 combinations that, if placed on someone's device, could enable abusive behavior.
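The article doesn't spell out how the analysis system works internally, but its core screening step can be approximated with a crude pre-filter: flag any recipe that pairs a privacy-sensitive trigger with a data-leaking or device-controlling action. The category lists and function below are illustrative assumptions, a much simpler stand-in for the team's LLM-assisted classifier:

# Hypothetical categories; illustrative only, not the researchers' taxonomy.
SENSITIVE_TRIGGERS = {"receive_text", "location_change", "unlock_phone"}
EXFIL_ACTIONS = {"send_message", "upload_file", "share_location", "record_video"}

def flag_for_review(recipe: dict) -> bool:
    # Flag recipes pairing a privacy-sensitive trigger with an action that
    # could leak data or seize control; a real system would pass these
    # candidates to an LLM for closer analysis.
    trigger = recipe["trigger"].split(":")[0]
    actions = {a.split(":")[0] for a in recipe["actions"]}
    return trigger in SENSITIVE_TRIGGERS and bool(actions & EXFIL_ACTIONS)

# A recipe that silently forwards every incoming text is flagged ...
print(flag_for_review({"trigger": "receive_text",
                       "actions": ["send_message:third_party"]}))  # True
# ... while a benign commute playlist is not.
print(flag_for_review({"trigger": "connect_bluetooth:car",
                       "actions": ["play_playlist:commute"]}))     # False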
Next, they used test devices to confirm that it is indeed possible to use those 1,014 shortcuts to perform activities such as sending malicious emails from another person's account, overloading a phone so it is unusable, locking a user out, turning on airplane mode, and stealing photos—all without leaving obvious, detectable traces.
Zhang says the team notified tech companies about these issues. "One company told us that users are responsible for their own devices, and they should create strong passwords and make sure the devices aren't accessible to other people," she says. "But that doesn't reflect reality; that's not how things work in the abusive relationships we see."
The researchers' analysis also showed that conventional security and detection strategies were of little use: Permissions settings apply to apps and not individual automations, notifications can be easily turned off, and third-party malware detectors don't scan for malicious automations.
That's why the researchers decided to turn their LLM-based evaluation tool into its own app: an online service people can use to detect these malicious recipes, says Fawaz.
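The article doesn't describe the service's interface. Purely as a sketch of the shape such a scanner could take (Flask, the /scan endpoint, and the JSON format are all assumptions here, and flag_for_review from the earlier sketch stands in for the actual LLM-based evaluator):

from flask import Flask, jsonify, request

app = Flask(__name__)

@app.post("/scan")
def scan():
    # Accept an automation recipe as JSON and report whether it looks abusive.
    recipe = request.get_json()
    return jsonify({"trigger": recipe.get("trigger"),
                    "flagged": flag_for_review(recipe)})

if __name__ == "__main__":
    app.run()  # POST a JSON recipe to http://localhost:5000/scan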
While AI may be part of the current solution, the team is concerned that it will also enable even more digital abuse: Combining AI assistants and automation apps, for example, will likely make it even easier for abusers to cook up recipes for malicious digital tools.
In the meantime, Zhang, Fawaz, Chatterjee and their collaborators will continue to be on the lookout for emerging forms of digital abuse and for ways to mitigate them.
"This project is a strong example of the Wisconsin Idea and the 'circle of research' in action," says Chatterjee. "It began with a community-outreach initiative, grew through our course curriculum, and was brought to life by the Kassem's Wisconsin Privacy and Security research team. Ultimately, it will give back to the community by providing a tool designed to prevent the abuse of automation apps and help protect survivors."
Provided by University of Wisconsin-Madison