"Long before it's in the papers"
January 27, 2015


“Virtual schoolgirl” designed to trap online perverts

July 10, 2013
Courtesy of SINC
and World Science staff

A new computer program simulates a girl in order to catch pedophiles lurking online.

Spanish researchers devised the “bot” to pose as a 14-year-old girl in chats and social networks. The Basque Country police force has already shown interest in Negobot, as the fake victim is named, according to the researchers.

“Chatbots tend to be very predictable. Their behavior and interest in a conversation are flat, which is a problem,” said Carlos Laorden of the University of Deusto, one of the developers. Negobot instead “employs game theory to maintain a much more realistic conversation,” varying its behavior over the course of a chat.

Negobot, which currently speaks only Spanish, is a set of seven conversational agents in one, each with a different way of behaving.

It starts with a “neutral” stance—level 0, which it can maintain indefinitely, since cunning pedophiles can hide their intentions for days. But it can be readily drawn into six additional conversational “levels” corresponding to different behaviors. In levels 1 to 3, the human user is pushing the conversation into increasingly personal or explicit areas and asking for personal information. In levels minus 1 to minus 3, the human is progressively losing interest.
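The seven-level scheme can be pictured as a simple state variable that moves between minus 3 and plus 3 as the conversation unfolds. The sketch below is a hypothetical simplification: the keyword lists and scoring are invented stand-ins for the system's real message analysis, which the article does not detail.

```python
import re

# Invented cue lists; the real classifier is far more sophisticated.
EXPLICIT_CUES = {"age", "photo", "address", "alone", "secret"}
DISINTEREST_CUES = {"bye", "boring", "busy", "later"}

def score_message(text: str) -> int:
    """Return +1 for probing/explicit content, -1 for disinterest, 0 otherwise."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    if words & EXPLICIT_CUES:
        return 1
    if words & DISINTEREST_CUES:
        return -1
    return 0

def update_level(level: int, text: str) -> int:
    """Nudge the conversational level, clamped to the range -3..+3."""
    return max(-3, min(3, level + score_message(text)))
```

Starting from the neutral level 0, each incoming message would nudge the level up toward 3 (increasingly personal probing) or down toward minus 3 (fading interest), with a different conversational agent active at each level.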

If the level rises above zero, and the bot detects suspicious behavior, it starts trying to obtain personal data from the suspect. If the level drops below zero, the bot tries to regain his attention, resorting to feigned offense or pleas for sympathy if necessary.
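That branching amounts to a dispatch on the sign of the current level. The stance names below are placeholders, not the system's real internals:

```python
def choose_strategy(level: int) -> str:
    """Pick a conversational stance from the current level (hypothetical names)."""
    if level > 0:
        # Suspicious behavior detected: work toward identifying data.
        return "elicit_personal_data"
    if level < 0:
        # Interest is fading: feign offense or plead for sympathy.
        return "regain_attention"
    # Level 0: stay neutral, possibly indefinitely.
    return "neutral"
```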

“The most dangerous pedophiles are very careful about giving out information,” said Laorden. “These days it is sufficient to obtain a social network profile, mobile number or email address, information that is then provided to the authorities.”

Negobot does have limitations; for instance, although its conversational abilities are very wide-ranging, it can’t detect irony. “It can be seen as a filter that helps analysts from government safety bodies and forces,” Laorden said.

To avoid raising suspicion, Negobot keeps a record of information on each subject so that it can realistically conduct conversations separated in time. Other techniques include varying response times, sometimes taking the lead in the conversation, and using colloquial or even poorly written language.
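Two of those realism tricks, a per-subject memory and randomized response delays, can be illustrated with a toy sketch. Everything here (the names, the delay range, the flat list of messages) is an assumption for illustration only:

```python
import random
import time

# Per-subject chat history, kept so conversations separated in time stay consistent.
subject_memory: dict[str, list[str]] = {}

def reply(subject_id: str, incoming: str, response: str,
          delay_range: tuple[float, float] = (1.0, 5.0)) -> str:
    """Log both sides of the exchange, then answer after a variable, human-like delay."""
    history = subject_memory.setdefault(subject_id, [])
    history.append(incoming)
    time.sleep(random.uniform(*delay_range))  # varied pause, not a fixed bot-like beat
    history.append(response)
    return response
```

Randomizing the pause before each reply avoids the telltale instant responses of a machine, while the stored history lets a later session pick up where an earlier one left off.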

“Negobot has already been implemented and trialled actively on Google’s chat service and could also be translated into other languages,” Laorden explained. “We do not rule out the possibility of bringing it to new channels in the future, and we believe it could be a very useful tool for social networks to incorporate.”

The researchers describe their work in the Proceedings of the 5th International Conference on Computational Intelligence in Security for Information Systems.

