Researchers at the University of Colorado at Boulder have developed a new tool to combat bullying on social media platforms.

CU’s CyberSafety Research Center presented the computer program and app, unofficially designated the “cyberbullying detector,” in recently published research.

The program, which is both a desktop computer application and an Android mobile app called BullyAlert, scans and analyzes massive amounts of social media data for abusive language and incidents of bullying.

The issue of cyberbullying has resonance in Colorado. In 2017, the suicides of students at multiple Denver metro area schools were attributed to bullying on social media.

The cyberbullying detector is a functional prototype and is only compatible with Instagram for now, but there are plans to expand its capabilities to analyze other social media platforms such as Facebook and Snapchat. Developers see it as a tool for school districts and other large organizations that can use the speed and accuracy of its social media analysis to prevent bullying, as opposed to simply reacting to it.

“First, if you look at some studies that are done, more than 50 percent of U.S. teenagers are being cyberbullied,” said professor Shivakant Mishra, a computer scientist and co-author of the research. “In some sense, it’s a much more serious problem than, say, on the school playground. The reason for that is that it can be done 24/7. Social media allows bullies to hide themselves.”

The program monitors and analyzes large amounts of Instagram data only available through public accounts. Private Instagram accounts cannot be monitored by the program.

Administrators of schools or other organizations can identify and flag multiple accounts and monitor them for aggressive comments and recurring incidents that may indicate cyberbullying. If the program detects aggressive comments or characteristics of cyberbullying, it sends an alert to administrators, including the post and comment in question.
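The monitoring loop the article describes — scan comments on flagged accounts, detect aggressive language, and alert administrators with the post and comment in question — can be sketched roughly as follows. This is an illustrative toy, not the actual BullyAlert code: the word list, `Alert` structure, and `scan_account` function are all hypothetical, and a real system would use a trained classifier and track repetition over time rather than a fixed lexicon.

```python
from dataclasses import dataclass

# Toy lexicon; the real system uses a learned model, not a word list.
AGGRESSIVE_WORDS = {"stupid", "loser", "ugly", "hate"}

@dataclass
class Alert:
    account: str   # the flagged account being monitored
    post_id: str   # which post the comment appeared under
    comment: str   # the comment that triggered the alert

def scan_account(account: str, posts: dict) -> list:
    """Scan a flagged account's posts and return an alert for each
    comment containing aggressive language (hypothetical sketch)."""
    alerts = []
    for post_id, comments in posts.items():
        for comment in comments:
            words = set(comment.lower().split())
            if words & AGGRESSIVE_WORDS:
                alerts.append(Alert(account, post_id, comment))
    return alerts
```

An administrator-facing service would run this scan periodically over the flagged accounts and forward each `Alert` to the relevant staff.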

Similarly, the mobile app allows parental monitoring. Instead of overseeing multiple accounts, BullyAlert allows parents to supervise their children’s Instagram accounts specifically. BullyAlert ensures parents are aware of potential incidents without requiring them to follow each individual post or intrude in their kids’ social media presence.

Perhaps the greatest difficulty in designing the program was deciding how to define cyberbullying. After all, how can a computer program differentiate between abusive language and jokes or sarcasm on social media, where comments can lack tone or personal contextualization?

“Defining cyberbullying is the most difficult problem in designing a program like this,” Mishra said. “The definition of cyberbullying itself is very subjective and varies from person to person and culture to culture.”

To address the often ambiguous world of social media interactions, the research team designed the programs to learn and adapt to user input.

Initially the team employed humans to teach the program how to tell the difference between benign online comments and abusive language, and to identify patterns of repetition that may point to an incident of cyberbullying. It’s this foundation that every desktop or mobile application uses as a standard for comparison when first launched.
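The learn-and-adapt idea — a baseline built from human-labeled examples that each installation then refines with user input — might look like the toy classifier below. The class name, the simple word-count scoring, and the two-label scheme are all assumptions for illustration; CU’s actual model is not described at this level of detail.

```python
from collections import Counter

class AdaptiveBullyClassifier:
    """Hypothetical sketch: a word-count scorer seeded with
    human-labeled examples, updated as users provide feedback."""

    def __init__(self):
        # Per-label word frequencies; both labels start empty.
        self.counts = {"bully": Counter(), "benign": Counter()}

    def learn(self, text: str, label: str) -> None:
        """Add a labeled example (seed data or later user feedback)."""
        self.counts[label].update(text.lower().split())

    def predict(self, text: str) -> str:
        """Label new text by which class its words appear in more often."""
        words = text.lower().split()
        scores = {label: sum(c[w] for w in words)
                  for label, c in self.counts.items()}
        return max(scores, key=scores.get)
```

The same `learn` method serves both phases: the shipped baseline is built from the team’s human-labeled data, and each parent or administrator can keep calling it to correct the model when it misreads a joke between friends as an attack.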

A pilot test of the program detected cyberbullying with 70 percent accuracy, according to CU Boulder Today.

Nevertheless, there’s concern that artificial intelligence is not adequate to discern the intricacies of interpersonal communication. Justin Patchin, who teaches criminal justice at the University of Wisconsin at Eau Claire and is a co-founder and co-director of the Cyberbullying Research Center, sees programs such as BullyAlert as addressing just the tip of the iceberg.

“I’m skeptical, but hopeful. I think (artificial intelligence) has come a long way, but we still have a ways to go,” Patchin said. “Unless an effective response is taken, identifying incidents is only part of the equation.

“The thing about cyberbullying is that it’s contextualized. Machines can’t figure out the nature of your and my relationship, because they don’t know the context. I’m more optimistic about the capabilities of machine learning, but we’re not there yet.”

Matt Farber, an assistant professor at the University of Northern Colorado who focuses on technology and education, said technologies such as BullyAlert may be useful, but they should not be considered in isolation.

“It’s hard to tell if two kids are joking around with each other,” Farber said. “Typing something and putting an emoji after it — maybe you’re kidding, maybe you’re not — that is a tough thing. I think we need to practice good social and emotional learning along with digital citizenship. When we have a blended experience that combines digital experience and human mediation, along with programs like the one out of CU-Boulder, they can complement one another.”

The CU program comes at a time when cyberbullying is increasingly cited as a factor in teen suicide. The Centers for Disease Control and Prevention reports that the overall suicide rate between 2007 and 2015 increased 31 percent for males age 15-19 and doubled for females of the same age. Colorado saw suicides of teens ages 10-18 rise from 50 in 2014 to 68 in 2016.

In September, Littleton had two social media-related suicides, sparking the “Offline October” challenge at Heritage High School that spread to include 1,600 people at 200 schools in seven countries. Similar incidents also occurred in Aurora and Thornton the same year.

As a prototype, BullyAlert isn’t yet used by any school districts in the state. However, schools are using other technology in an effort to prevent bullying.

Jefferson County Public Schools uses Safe2Tell Colorado, a system that allows students, parents, and teachers to anonymously report information about any issues that may concern their safety or the safety of others, including cyberbullying.

“We’re always working to keep our kids safe. Anything that helps is always worth investigating,” said Diana Wilson, communications officer for the school district.

Safe2Tell users can use the Apple or Android app to upload photos or social media posts to help schools and local law enforcement conduct an investigation. They can report incidents online or by calling 1-877-542-7233 to speak with a Colorado State Patrol dispatcher.
