When you think of a cybercriminal, you probably imagine a sleazy-looking person in a dark room wearing a black hoodie, right? They are likely browsing the dark web, surrounded by empty energy drink cans. However, that’s not how most cybercriminals look. Most look just like normal people, and they are hiding in plain sight.
They are organized and function like any profitable business, with hierarchies, employees, and even a business plan.
Criminals can easily create Facebook groups or YouTube channels and then start participating in cybercrimes, which include buying and selling stolen credit card information, spamming and phishing tools, or even login credentials. Some of these groups have thousands of members.
This might not sound like a lot, but it is. You also have to consider that Facebook has approximately 2 billion people logging into the site every month. With that many people, it can be difficult for the company to deal with this type of infiltration.
Facebook does remove these cybercriminals, but the fact that they keep appearing tells us that the mega corporation is having a difficult time keeping bad behavior at bay. It is a game of whack-a-mole: the groups keep popping up like weeds. These groups also share false information and hate speech, and some incite violence. It also shows how this behavior is amplified by Facebook’s and YouTube’s algorithms.
Finding these groups or channels is easy. All you have to do is search for “spam,” “CVV,” “dumps,” or “skimming,” or search a variety of “white supremacy” terms, and then join. Once you join these groups, the algorithms come into play and suggest even more groups that are similar. The truth is, these sites don’t have a good way to catch these criminals, and they rely on user reports to police the bad behavior.
Since this is the case, tech companies have a long way to go before they can stop relying on user reports. There is also the fact that these reports are often not taken seriously, so even valid ones can fall through the cracks.
One example of this is the terrorist attack in Christchurch, New Zealand in 2019. The gunman actually streamed the attack on Facebook Live. Though Facebook eventually took the video down, thousands of people were able to see it before it disappeared. Facebook claims that there were no reports of the video, which is why it took so long to remove…though that is difficult to believe.
The one bright spot is that Facebook has admitted there is an issue and acknowledged that these groups were in violation of its own policies. The company also said that it knows more vigilance is required and that it is working on investigating more of the criminal activities that are reported.
Written by Robert Siciliano, CEO of Credit Parent, Head of Training & Security Awareness Expert at Protect Now, #1 Best Selling Amazon author, Media Personality & Architect of CSI Protection Certification.