In an attempt to make its platform more secure and uncover bugs that might otherwise go unnoticed, Facebook is building a hidden, bot-only version of its platform. According to a research paper by Facebook researchers, reported by The Verge, the company is looking to build a Westworld-esque fake platform populated entirely by bots. This platform will be used to run simulations and uncover hidden bugs in the “real” Facebook.
Company researchers are using a technology called “Web-Enabled Simulation” (WES) to achieve this. The bots can like, comment, share, and send friend requests, and, to take a darker turn, harass, abuse, and scam other bots. A Web-Enabled Simulation is a simulation of the behaviour of a community of users on a software platform: it uses the real platform infrastructure to simulate user interactions and social behaviour, isolated from production users.
Facebook describes building a scaled-down, walled-off simulation of its platform populated by fake users modeling different kinds of real behavior. For example, a “scammer” bot might be trained to connect with “target” bots that exhibit behaviors similar to real-life Facebook scam victims. Other bots might be trained to invade fake users’ privacy or seek out “bad” content that breaks Facebook’s rules.
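The paper does not publish an implementation, but the idea can be sketched in a few lines of Python. Everything below, including the SimPlatform, ScammerBot, and TargetBot names and the auto-accepting friend-request logic, is invented for illustration and is not Facebook's actual WES code.

```python
import random

# Illustrative sketch of a WES-style bot simulation. All class and method
# names here are assumptions made for the example, not Facebook's code.

class SimPlatform:
    """A walled-off, in-memory stand-in for the real platform."""
    def __init__(self):
        self.friendships = set()   # pairs of bot ids
        self.messages = []         # (sender_id, recipient_id, text)

    def send_friend_request(self, sender, recipient):
        # In this toy model, gullible targets may auto-accept strangers.
        if recipient.accepts_request(sender):
            self.friendships.add(frozenset((sender.bot_id, recipient.bot_id)))

    def send_message(self, sender, recipient, text):
        self.messages.append((sender.bot_id, recipient.bot_id, text))


class TargetBot:
    """Models a user who accepts requests from strangers too readily."""
    def __init__(self, bot_id, gullibility):
        self.bot_id = bot_id
        self.gullibility = gullibility  # probability of accepting a stranger

    def accepts_request(self, sender):
        return random.random() < self.gullibility


class ScammerBot:
    """Tries to befriend targets, then messages them a scam link."""
    def __init__(self, bot_id):
        self.bot_id = bot_id

    def act(self, platform, targets):
        for target in targets:
            platform.send_friend_request(self, target)
        for pair in platform.friendships:
            if self.bot_id in pair:
                other_id = (pair - {self.bot_id}).pop()
                other = next(t for t in targets if t.bot_id == other_id)
                platform.send_message(self, other, "click this totally safe link")


if __name__ == "__main__":
    platform = SimPlatform()
    targets = [TargetBot(f"t{i}", gullibility=0.5) for i in range(10)]
    ScammerBot("scammer-1").act(platform, targets)
    # Researchers could then measure how many targets were reached.
    print(f"{len(platform.messages)} scam messages delivered in simulation")
```

In this kind of setup, researchers would measure how far the scammer bot gets, then test whether a platform change makes it harder or easier for it to reach its targets.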
Software simulations are, of course, common, and Facebook is expanding on an earlier automated testing tool known as Sapienz. But it calls WES systems distinct because they turn large numbers of bots loose on something very close to the actual social media platform, rather than on a mockup that merely mimics its functions for bug detection.
Researchers can build WES users whose sole goal is stealing information from other bots, for example, and set them loose on the system. If they suddenly find ways to access more data after an update, that could indicate a vulnerability for human scammers to exploit, and no real users would have been affected.
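As a hedged illustration of that before-and-after check, the sketch below runs the same hypothetical scraper bot against a toy platform before and after a change and flags anything it could read only afterwards. The FakePlatform class and its methods are assumptions made for the example, not part of the paper.

```python
# Illustrative regression check: run the same information-stealing bot
# before and after a platform change, then compare what it could read.

class FakePlatform:
    """Toy stand-in exposing a per-user access-control table."""
    def __init__(self, acl):
        # acl maps (user, field) -> True if a stranger bot may read it
        self._acl = acl

    def users(self):
        return sorted({user for user, _ in self._acl})

    def can_read(self, bot_id, user, field):
        return self._acl.get((user, field), False)


def run_scraper_bot(platform, bot_id):
    """Return the set of (user, field) pairs the bot managed to read."""
    fields = ("email", "phone", "friend_list")
    return {(u, f) for u in platform.users()
            for f in fields if platform.can_read(bot_id, u, f)}


def check_for_regression(before_platform, after_platform, bot_id="scraper-1"):
    before = run_scraper_bot(before_platform, bot_id)
    after = run_scraper_bot(after_platform, bot_id)
    newly_exposed = after - before
    if newly_exposed:
        # Access gained only after the update is a red flag to investigate
        # before the change ever reaches real users.
        print(f"Possible vulnerability: {sorted(newly_exposed)}")
    return newly_exposed


if __name__ == "__main__":
    old = FakePlatform({("alice", "email"): False, ("alice", "phone"): False})
    new = FakePlatform({("alice", "email"): True, ("alice", "phone"): False})
    check_for_regression(old, new)
```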
Some bots could get read-only access to the “real” Facebook, as long as they weren’t accessing data that violated privacy rules; they could observe that data but never act on it. In other cases, however, Facebook wants to build up an entire parallel social graph. Within that large-scale fake network, it can deploy “fully isolated bots that can exhibit arbitrary actions and observations,” and it can model how users might respond to changes in the platform, something Facebook often does today by invisibly rolling out tests to small numbers of real people. In the future, Facebook could use this technology to build bots that each perform a selected type of function.
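The paper describes these two isolation levels only at a high level. One way to picture them is the sketch below, with a read-only observer bot that has no write methods at all and an isolated bot confined to a parallel, bot-only graph; the class names and the enforcement-by-construction approach are illustrative assumptions, not Facebook's actual mechanism.

```python
# Illustrative sketch of the two isolation modes described in the paper.

class ParallelGraph:
    """A bot-only social graph, entirely separate from real users."""
    def __init__(self):
        self.edges = set()

    def add_edge(self, a, b):
        self.edges.add(frozenset((a, b)))


class ReadOnlyBot:
    """Can observe a permitted data feed; no write methods exist at all."""
    def __init__(self, bot_id, feed):
        self.bot_id = bot_id
        self._feed = feed  # e.g. a list of already-permitted posts

    def observe(self):
        return list(self._feed)


class IsolatedBot:
    """Can take arbitrary actions, but only against the parallel graph."""
    def __init__(self, bot_id, graph):
        self.bot_id = bot_id
        self._graph = graph

    def befriend(self, other_bot_id):
        self._graph.add_edge(self.bot_id, other_bot_id)


if __name__ == "__main__":
    graph = ParallelGraph()
    a, b = IsolatedBot("bot-a", graph), IsolatedBot("bot-b", graph)
    a.befriend(b.bot_id)
    watcher = ReadOnlyBot("observer-1", feed=["permitted post #1"])
    print(graph.edges, watcher.observe())
```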