
Is Discord risky for minors?

Since the early days of the Internet in the 1980s, going online has meant participating in a community. In the beginning, dial-up chat servers, email lists, and text-based discussion groups were focused on specific interests.

Since the early 2000s, powerful social media platforms have folded these small spaces into larger ones, letting people find their own little corners of the Internet while remaining connected to everyone else.

This allows social media sites to suggest new spaces for users to join, be it a local neighborhood discussion or a group devoted to the same hobby, and to sell precisely targeted advertising. But the small, niche community is making a comeback, with adults as well as with kids and teens.

About Discord

When Discord, a free voice, video, and text chat service, first launched in 2015, many video games didn't let players talk to each other via live voice chat while playing, or forced them to pay premium prices to do so.

Discord was an app that enabled real-time voice and text chat, so friends could team up to conquer an obstacle or chat while exploring a game world. People still use Discord for that, but today most activity on the service is part of a larger community than just a couple of friends getting together to play games.

This analysis of Discord is part of an investigation into how academics, developers, and policymakers can design and maintain healthy online spaces, conducted by Brianna Dym, a Ph.D. candidate in Information Sciences at the University of Colorado Boulder, for The Conversation. We share it below in her voice.

A little old school

Discord first appeared on my radar in 2017 when an acquaintance asked me to join a writers' support group. Discord users can create their own communities, called servers, with shareable invite links and options to make a server public or private.

The writers' server looked like an old-school chat room, but with multiple channels segmenting the different conversations people were having. It reminded me of descriptions of early forum- and chat-based online communities that hosted long conversations among people from all over the world.

The folks at the writers’ server quickly realized that some of our community members were teenagers under 18. Although the server owner had kept the space invitation-only, he avoided saying “no” to anyone requesting access. After all, it was supposed to be a support community for people working on writing projects. Why would you want to exclude someone?

He didn't want to kick the teens out, but he was able to make some adjustments using Discord's server moderation system. Community members had to disclose their age, and those under 18 were assigned a special "role" that labeled them as minors. That role blocked them from accessing channels we marked as "not safe for work," or "NSFW." Some writers worked on graphic romance novels and didn't want input from teens on them. And sometimes, the adults just wanted a space of their own.
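The role-and-channel gating described above can be sketched in plain Python. This is a conceptual model under assumed names (`Member`, `Channel`, `can_view`), not Discord's actual API:

```python
# Conceptual sketch of role-gated channel access, loosely modeled on
# Discord's roles. Names and structure are illustrative, not Discord's API.
from dataclasses import dataclass, field


@dataclass
class Member:
    name: str
    roles: set = field(default_factory=set)


@dataclass
class Channel:
    name: str
    nsfw: bool = False


def can_view(member: Member, channel: Channel) -> bool:
    """Members tagged with the 'minor' role cannot see NSFW channels."""
    if channel.nsfw and "minor" in member.roles:
        return False
    return True


# Example: a teen and an adult checked against two channels.
teen = Member("alex", roles={"minor"})
adult = Member("sam")
general = Channel("general")
romance = Channel("graphic-romance", nsfw=True)

assert can_view(teen, general)       # open channel: visible to everyone
assert not can_view(teen, romance)   # NSFW channel hidden from minors
assert can_view(adult, romance)      # adults unaffected
```

The key design point is that the restriction lives on the role, not on individual accounts, so moderators only have to assign the role once and every NSFW-marked channel is covered automatically.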

Although we were careful to build a safe online space for teens, an app like Discord still carries dangers. The platform has been criticized for lacking parental controls. Its terms of service state that no one under the age of 13 should register, but many younger kids use the platform anyway.

People have used Discord to organize and promote hateful rhetoric, including neo-Nazi ideologies. Others have used the platform to traffic child pornography.

However, Discord maintains that such activities are illegal and unwelcome on its platform, and the company regularly bans servers and users that it says perpetuate harm.

Security options

Since then, every Discord server I've joined has had some protection keeping young people away from inappropriate content. Whether through age-restricted channels or by denying minors access to specific servers, the Discord communities I belong to share a real concern for the safety of young people online.

However, this does not mean that all Discord servers are always safe for their members. Parents should take the time to talk to their children about what they do in their online spaces. Even something as innocuous as the popular kids' gaming platform Roblox can turn sour in the wrong environment.

And while the servers I've been involved with have been carefully managed, not all Discord servers are regulated this way. In addition to servers not being uniformly moderated, account holders can lie about their age and identity when signing up. And users keep finding new ways to misbehave or harass others on Discord, like spamming loud and inappropriate audio.

But, as with other modern social media platforms, there are safeguards to help administrators keep online communities safe for young people if they so choose. Users can tag an entire server as “NSFW,” going beyond single-channel tags and blocking underage accounts from whole communities.

But if server owners don't, the company can apply the tag itself. And when Discord is accessed on an iOS device, NSFW servers are not visible to anyone, including accounts owned by adults. In addition, Discord runs a Moderator Academy to train volunteer moderators to handle a variety of situations appropriately.

Tighter controls

Unlike many other popular social media platforms, Discord servers typically run as closed communities that require invites to join. There are also large open servers with millions of users. But Discord's design integrates content moderation tools to keep things tidy.

For example, a server's creator has fine-grained control over who has access to what, and over which permissions each member has to send, delete, or manage messages. Additionally, Discord allows community members to add automated bots to a server that continuously monitor activity to enforce moderation guidelines.
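The kind of automated monitoring those bots perform can be illustrated with a minimal sketch. This is purely conceptual (a real Discord bot would use a library such as discord.py and its event hooks); the blocklist terms and the `moderate` function are invented for illustration:

```python
# Minimal sketch of automated moderation: scan each incoming message
# against a blocklist and decide whether to allow or delete it.
# Illustrative only; a real bot would react to Discord message events.

BLOCKLIST = {"spamword", "slur_example"}  # hypothetical banned terms


def moderate(message: str) -> str:
    """Return 'delete' if the message contains a blocked term, else 'allow'."""
    # Normalize: strip common punctuation and lowercase each word.
    words = {w.strip(".,!?").lower() for w in message.split()}
    if words & BLOCKLIST:
        return "delete"
    return "allow"


assert moderate("Hello everyone!") == "allow"
assert moderate("Buy now, SPAMWORD inside") == "delete"
```

In practice such bots also rate-limit messages, quarantine links, and escalate repeat offenders to human moderators; the point is that the rules run continuously without a person watching every channel.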

With these protections, people use servers to form tight-knit, closed spaces, shielded from chaotic public squares like Twitter and less visible to the rest of the online world. That can be a good thing, keeping these spaces safer from stalkers, trolls, and misinformation spreaders.

In my research, young people have cited their Discord servers as the safest and most private spaces they have online, in contrast to the cluttered public platforms.


However, moving online activity into more private spaces also means that those healthy, well-regulated communities are less discoverable to the vulnerable groups that might need them.

For example, new parents looking for social support are sometimes more inclined to seek it through open subreddits than through Facebook groups.

Discord servers are not the first closed communities on the Internet. They are essentially the same as old-school chat rooms, private blogs, and curated mailing lists. They will have the same problems and opportunities as the previous online communities.

Self-protection debate

In my opinion, the solution to this problem does not necessarily lie in banning certain practices or regulating Internet companies. Research on youth online safety shows that government regulation designed to protect minors on social media rarely has the desired result, and instead often disempowers and isolates young people.

Just as parents and other caring adults talk with children about recognizing dangerous situations in the physical world, talking about healthy online interactions can help young people protect themselves on the Internet. Many youth-focused organizations and Internet companies offer Internet safety information for children of all ages.

Every time young people jump on the next tech fad, there is a panic about whether adults, businesses, and society can keep them safe. The essential thing in these situations is to remember that talking with young people about how they use these technologies, and about what to do in difficult situations, can be an effective way to help them avoid serious harm on the Internet.
