
Use Facebook or YouTube if You Want to be a Criminal

When you think of a cybercriminal, you probably imagine a sleazy-looking person in a dark room wearing a black hoodie, right? They are likely browsing the dark web, maybe surrounded by empty energy drink cans. However, that's not how most cybercriminals look. Most look just like anyone else, and they are hiding in plain sight.

They are organized and function like any profitable business, with hierarchies, employees, and even a business plan.

Criminals can easily create Facebook groups or YouTube channels and use them to commit cybercrimes, which include buying and selling stolen credit card information, trading spamming and phishing tools, and even selling logins and passwords. Some of these groups have thousands of members.

That might not sound like much, but consider that Facebook has approximately 2 billion people logging into the site every month. With that many users, it is difficult for the company to deal with this type of infiltration.

Facebook does remove these cybercriminals, but the fact that they keep reappearing tells us that the mega-corporation is having a difficult time keeping bad behavior at bay. It is a game of whack-a-mole: the groups keep popping up like mushrooms or weeds. Many of these groups also share false information, spread hate speech, and incite violence, and all of this behavior is amplified by Facebook's and YouTube's algorithms.

Finding these groups or channels is easy. All you have to do is search for terms like "spam," "CVV," "dumps," or "skimming," or any of a variety of white-supremacy terms, and then join. Once you join one of these groups, the algorithms come into play and suggest even more groups that are similar. The truth is, these sites don't have a good way to catch these criminals, and they rely on user reports to police the bad behavior.

Since this is the case, tech companies have a long way to go before they can stop relying on user reports. There is also the fact that these reports are often not taken seriously, so even valid ones can fall through the cracks.

One example of this is the terrorist attack in Christchurch, New Zealand in 2019. The gunman actually streamed the attack on Facebook Live. Facebook eventually took the video down, but thousands of people were able to see it before it disappeared. Facebook claims that there were no reports of the video, which is why it took so long to take it down…though that is difficult to believe.

The one bright spot is that Facebook has admitted that there is an issue and acknowledged that these groups were in violation of its own policies. The company also said that it knows more vigilance is required and that it is investigating the criminal activity that gets reported.

Written by Robert Siciliano, CEO of Credit Parent, Head of Training & Security Awareness Expert at Protect Now, #1 Best Selling Amazon author, Media Personality & Architect of CSI Protection Certification.

YouTube’s Spoon-Feeding Pedophiles Kids’ Home Videos

YouTube uses a recommendation algorithm to help people find things they’d like to see. Recently, the algorithm seemingly encouraged pedophiles (though YouTube would have no way of knowing this) to watch videos of children playing at home, videos that family members had uploaded.

A report from the New York Times detailed how YouTube had been exploiting minors through its automated recommendation system. According to the report, researchers at the Berkman Klein Center for Internet and Society at Harvard were studying YouTube's influence in Brazil when they noticed the alarming issue. The experiment used a server that followed YouTube recommendations a thousand or more times, building a map of sorts in the process. The map shows how YouTube guides its users toward what they may want to watch next.
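To make that methodology concrete, here is a minimal sketch of this kind of recommendation crawl. The get_recommendations function is a hypothetical stand-in, since the article doesn't describe the researchers' actual tooling; a real crawler would scrape YouTube's watch pages or use its Data API.

```python
import random
from collections import defaultdict

def get_recommendations(video_id):
    """Hypothetical stand-in: return the list of video IDs that
    YouTube recommends alongside `video_id`. A real crawler would
    scrape the watch page or call the YouTube Data API here."""
    raise NotImplementedError

def map_recommendations(seed_video, steps=1000):
    """Follow recommendations from a seed video for `steps` hops,
    recording every video -> recommended-video edge. The resulting
    directed graph is the 'map' of where the algorithm steers viewers."""
    graph = defaultdict(set)
    current = seed_video
    for _ in range(steps):
        recs = get_recommendations(current)
        if not recs:
            break
        graph[current].update(recs)
        # Hop to one of the top recommendations, as a real viewer might.
        current = random.choice(recs[:3])
    return graph
```

Run from different seed videos, a map like this reveals which content the algorithm funnels viewers toward over many hops.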

During the experiment, recommendations stemming from sexually themed videos grew more extreme or bizarre, placing more emphasis on youth. In some cases, a video of women discussing sex led to videos of women breastfeeding or wearing just underwear. Many times, the women mentioned their ages, which ranged from 16 to 19 years old.

Deeper into the experiment, YouTube started recommending videos where adults wore children’s clothing or solicited payment from ‘sugar daddies.’

With such softcore fetish recommendations already being shown, YouTube moved on to videos of children who weren't fully clothed, many of them in Latin America or Eastern Europe.

These were usually home videos uploaded by the children's parents. Often, parents just want an easy way to share videos and pictures of their children with family and friends. However, YouTube's algorithm can learn that people who view sexualized content involving children also watch these family videos, and it may then recommend them to those viewers without the parents' knowledge.
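YouTube's actual recommender is proprietary and far more sophisticated, but the underlying co-viewing effect can be illustrated with a toy item-based collaborative filter. All names and watch histories below are invented for illustration.

```python
from collections import defaultdict
from itertools import combinations

# Toy watch histories: each set is what one account watched.
histories = [
    {"niche_video_1", "family_pool_video"},
    {"niche_video_1", "niche_video_2", "family_pool_video"},
    {"cooking_show", "diy_tutorial"},
]

# Count how often each pair of videos is watched by the same account.
co_watch = defaultdict(int)
for history in histories:
    for a, b in combinations(sorted(history), 2):
        co_watch[(a, b)] += 1

def recommend(video, k=3):
    """Return the videos most often co-watched with `video`."""
    scores = {}
    for (a, b), count in co_watch.items():
        if a == video:
            scores[b] = scores.get(b, 0) + count
        elif b == video:
            scores[a] = scores.get(a, 0) + count
    return sorted(scores, key=scores.get, reverse=True)[:k]

# The innocent family video surfaces for a niche-content viewer
# purely because other accounts watched both.
print(recommend("niche_video_2"))  # includes 'family_pool_video'
```

The point is that no one has to intend this outcome; the pattern emerges from aggregate viewing behavior alone.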

One mother, Christine C., was interviewed by the Times about her 10-year-old daughter. The girl had uploaded a harmless video of herself and a friend playing in a pool, and it was viewed over 400,000 times in just a few days. The mother said her daughter was excited about the view count, which alerted Christine that something was amiss.

This is just one of many incidents that unfolded after YouTube publicly confronted its issues with pedophilia earlier in 2019. Back in February, YouTube had to disable comments on videos of minors because pedophiles were reportedly commenting on them in ways that signaled other predators.

Studies have shown that YouTube's recommendation system can create a rabbit-hole effect, in which the algorithm recommends more and more extreme content as time goes on. The company has denied or skirted this; in May, Neal Mohan, YouTube's chief product officer, claimed that extreme content doesn't drive more engagement or watch time than other content.

YouTube has said little about the recommendation system or the rabbit-hole effect it creates. Instead, journalists and reporters are referred to a blog post explaining that the company focuses on protecting minors and asserting that the videos in question don't violate any policies and were posted innocently.

The announcement also highlights recent steps taken by YouTube to disable comments on videos that feature or are uploaded by minors. Minors will also be restricted from live-streaming unless a parent appears on the video. Along with this, the company plans to stop recommending videos that depict minors in risky situations.

Researchers believe it would be best to exclude videos depicting children from the recommendation system entirely. However, YouTube told the Times that it doesn't plan to do that, because the automated system is one of the site's largest traffic drivers and such a change could harm creators.

ROBERT SICILIANO, CSP, is a #1 Best Selling Amazon author, CEO of CreditParent.com, and the architect of the CSI Protection certification, a Cyber, Social and Identity Protection security awareness training program.

If You Care About Privacy, Don’t Do These 8 Things

I don't care as much about privacy as some people do. My concern is personal security. If I were worried about people knowing "me" stuff, you wouldn't be reading this, because I'd live in a cave in Wyoming with no Internet and blow glass all day. But personal security is something I deeply care about. The following are privacy issues, with a little personal security mixed in too.

Don't throw away anything that can be used against you. For privacy and security reasons, consider how someone could use something in your trash against you. I never toss anything with a name or account number on it, and I'm careful not to toss DNA-related stuff either. I know people are saying that's crazy, but if it can be planted at a crime scene, it's flushed.

Don't publish your phone number. Many data aggregators use phone company records to index you. Without a published phone number, they have a harder time linking your name to an address. My home phone number is under a pseudonym, and it's also under a business name.

Don't allow your name to be searchable on Facebook, or don't be on Facebook at all. I broke that rule. You can change this in your privacy settings while logged into Facebook.

Don't broadcast your location. Location-based services (LBS) are information and entertainment services, accessed through a mobile device over the mobile network, that make use of the device's geographical position. Twitter, Facebook, and others are getting in the game with LBS. Carnegie Mellon University compiled a list of more than 80 location services that either lack privacy policies or collect and save all data indefinitely. I see this more as a personal security issue.

Don't post videos on YouTube that reveal your personal life. I have a business YouTube page and a personal one. The iPhone has a direct connection to YouTube, and it's a blast to take video and quickly upload it. However, my personal page is under another name, and all of its videos are private. The only way to see them is to log in.

Don't forget to read privacy policies. I don't like reading them because they are long-winded and confusing, but not knowing what companies may do with your data is worse.

Don't use your real name as a username. I broke this rule a few hundred times. It's a privacy issue when you don't shield your name, and it's a personal security issue not to grab your name as a username, because that allows someone else to get it and use it against you. Grab all of them at Knowem.com.

Don’t put your name on your mailbox or on a plaque on your home. All the postal carrier needs is a street number. There’s no reason to plaster your last name on your home either. I see this more as a personal security issue. But there are certainly privacy concerns here too.

Robert Siciliano, personal security expert to Home Security Source, discussing Location Services on The CBS Early Show.