
Detecting 10 times more bots and stopping fraud with behavioral biometrics

Bots and fraud occur in every app and in every vertical, said Alon Dayan, founder and CEO of Unbotify. “In online games, this is a several-billion-dollar problem,” he said. “In our data we see that between 5% and 15% of the users of online games are actually using bots.”

Dayan joined Dean Takahashi, Lead Writer at GamesBeat, to discuss where bots come from, who the fraudsters behind them are, and what the evolution of bots means for the future of mobile games, in his session, “The Evolution of Bots and Future Predictions,” part of the two-day GamesBeat event, Into the Metaverse.

“When you have incentive, and when you have scale, then you will see bots,” he added. “And this is how they started to evolve from the web to smartphones to Smart TV, and any new platform, any new media.”

Bots interfere directly in the monetization of apps, disrupting in-app purchases and user engagement and creating bad user experiences that are a direct cause of churn in every category, including dating apps and social networks. And, obviously, bots are a big problem for the adtech industry, laying waste to marketing budgets with ad fraud and user acquisition fraud.

In gaming, there are bots that watch ads, generate impressions, click on ads, and install the game over and over again. Then there are the bots that actually play the game: players buy bot services for the games they’re playing to harvest resources, so they don’t have to buy them or invest the time in harvesting themselves.

It’s a growing problem, particularly over the past year of the pandemic, he said. More and more games are seeing traction, and it’s natural that if players like a game, they might want to skip the boring stuff and level up as fast as possible. They can buy cheap bots that do all the leveling for them. Fraudsters also use bots to play the game and then sell those aged accounts to players who want to start at an advanced level, skipping the earlier ones.

It means the developer is losing money, because without bots, the player would be making far more in-app purchases. Bots are bad for the community, too. Gamers who are playing fair get frustrated when players using bots appear and skew the game, and then they’ll churn. And these bots have traditionally been hard to detect, and hard to stop.

“As always in security, it’s a cat and mouse game, so when the cat becomes smarter, the mouse becomes smarter as well,” Dayan said. “Today, artificial intelligence and machine learning is part of the day-to-day business of the fraudsters. And this is also why you need a sophisticated solution in place, because bots are there, and they are sophisticated, and you need a premium solution to defend from them.”

Unbotify uses behavioral biometrics to do that. The solution taps the biometric sensors in smartphones to track how a user interacts with the app: the way they touch the screen, how hard they press, the speed of their finger, how the device moves (the alpha, beta, and gamma rotation readings), the light sensor, and the battery. With that data, machine learning models build a profile of how a human behaves when using the app. The solution then flags anomalies: any behavior that doesn’t match the human model is treated as a bot.
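Dayan doesn’t detail Unbotify’s models, but the idea can be sketched with a toy example: profile “human” readings from the sensors mentioned above, then flag sessions that deviate from that profile. Everything below (the feature choices, the threshold, and the simulated data) is an illustrative assumption, not Unbotify’s implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated human sessions: noisy touch pressure, swipe speed, and
# accelerometer jitter (a real hand is never perfectly steady).
human = np.column_stack([
    rng.normal(0.55, 0.12, 500),   # touch pressure (0..1)
    rng.normal(320, 80, 500),      # swipe speed (px/s)
    rng.normal(0.04, 0.015, 500),  # accelerometer jitter (g)
])

# Simulated bot sessions: scripted superhuman swipes, no device motion.
bots = np.column_stack([
    np.full(20, 0.50),             # perfectly constant pressure
    rng.normal(1000, 5, 20),       # scripted, superhuman swipe speed
    np.zeros(20),                  # emulator: the device never moves
])

# Fit a simple per-feature Gaussian profile on (assumed) human behavior.
mu, sigma = human.mean(axis=0), human.std(axis=0)

def is_bot(sessions, threshold=4.0):
    """Flag a session if any feature deviates > `threshold` std devs."""
    z = np.abs((sessions - mu) / sigma)
    return (z > threshold).any(axis=1)

flags = is_bot(bots)
print(f"flagged {flags.sum()} of {len(flags)} bot sessions")
```

A production system would use richer models than per-feature z-scores, but the principle is the same: train on human behavior, and anything sufficiently far from the human distribution gets flagged.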

“These [fraudsters] are making a business out of these bots, so they are not going to give up very easily — they get a lot of money,” Dayan said. They’ll try more sophisticated, human-like bots, but the machine learning models are still able to detect and ban them without retraining. After a couple of unsuccessful tries, these fraudsters give up and move on to another app or game, looking for low-hanging fruit without the same level of security.

“Today we are detecting more than 10 times more bots than any other vendor or in any in-house solution that we see,” he said. “And it’s not enough just to detect more bots. The idea is not to have false positives, and today we have less than one false positive in more than 200,000 sessions and users.”
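That false-positive claim is easier to appreciate at scale. A quick back-of-the-envelope comparison (the 10-million-sessions-per-day figure and the 0.1% baseline are hypothetical illustrations, not from the article):

```python
# How many real users get wrongly flagged per day at different FP rates?
daily_sessions = 10_000_000        # hypothetical app traffic

claimed_fp_rate = 1 / 200_000      # the claimed rate: < 1 FP per 200,000 sessions
baseline_fp_rate = 1 / 1_000       # a hypothetical 0.1% baseline for comparison

claimed_daily_fps = int(daily_sessions * claimed_fp_rate)
baseline_daily_fps = int(daily_sessions * baseline_fp_rate)

print(claimed_daily_fps)   # 50 legitimate users wrongly flagged per day
print(baseline_daily_fps)  # 10000 at the baseline rate
```

At scale, the difference between those two rates is the difference between a handful of support tickets and thousands of legitimate players banned daily.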

As for the metaverse, fraudsters are certain to launch bots there using more or less the same techniques, because the incentive and the scale are there. And in the beginning it will be easy for them, because sophisticated detection is unlikely to be in place during the early days.

“Once it becomes a burden on the vendors, they will start to develop in-house solutions or try to buy detection solutions, and then you will see the cat and mouse game also in the metaverse,” Dayan said. “So I guess you will see more or less the same process as we saw in the last 20 years in web, as we saw in mobile devices, in gaming — it will just be a copy/paste to the metaverse.”


VentureBeat
