How do you envision a better, more responsible social media experience? This is not really just about Facebook Inc. and its suite of apps: 97% of Fortune 500 companies use social media as a primary means of business marketing and advertising. There is a very interesting discussion around structured data sets in the United States Senate's review of the Facebook platform algorithm.
- October 2021: Facebook’s Platform and Public Product Challenges
- Facebook’s response to 60 Minutes’ report, “The Facebook Whistleblower”
- Facebook whistleblower says company incentivizes “angry, polarizing, divisive content”
- Watch Live: Facebook whistleblower Frances Haugen testifies before Senate committee
Facebook hides behind walls that keep researchers and regulators from understanding the truth. The public cannot validate that truth: the company hides its own data, even when asked directly how its products impact the health and safety of our children. Facebook has not earned our blind faith. ~ Frances Haugen, Facebook testimony before the US Senate
Which United States Senate subcommittee is leading this review?
The US Senate Consumer Protection, Product Safety, and Data Security subcommittee exists to address issues of consumer affairs and consumer product safety; product liability; property and casualty insurance; sports-related matters; consumer privacy and data security protection; and international data transfer issues.
The subcommittee conducts related legal oversight of:
- Federal Trade Commission (FTC), which in 2020 sued Facebook for illegal monopolization
- Consumer Product Safety Commission (CPSC); on 10/05/2021 the subcommittee held its "Protecting Kids Online" Senate hearing on Facebook
- Federal Trade Commission, which administers the Children's Online Privacy Protection Act (COPPA)
Social Media Platform Owners Need to Protect Consumer Safety
The challenge with social media arises when Facebook and other outlets choose profit over protecting people's safety. We need an algorithmic protection act that better manages these adverse impacts.
We need to expand the Children's Online Privacy Protection Act to stop platforms from preying on children, much as the Children's Television Act of 1990 mandates educational television for children on commercial broadcast stations, reestablishes commercial time restrictions, and bans host selling. We cannot change the "feedback" loop, but we should not be using targeted ad selling to youth (similar to the restrictions of the Television Act).
Facebook AI Research is not the issue; the platform's algorithm is, and it cannot run unattended effectively. The issues are complex and nuanced. Choosing profit over public safety and privacy is why we must acknowledge the actual damage that has happened. This is a case of Facebook choosing to become a trillion-dollar company that profits at the expense of consumer and user safety.
How do we balance public safety policy with profit? Facebook's leadership knows its products harm children, create division, and weaken our democracy. Yesterday we saw Facebook down for roughly five hours, during which millions of small businesses could not reach their customers. We can have global social media that connects us without tearing us apart.
Facebook has repeated conflicts between profit and safety, and the answer is always profit: Facebook chose to grow at all costs. It has repeatedly misled the public about the safety of children and the efficacy of its artificial intelligence. Facebook wants to trick you into believing that privacy protections alone are sufficient. We can afford nothing less than full transparency and public accountability.
What are the implications for data science careers? The business model challenges in this very situation pose a very interesting problem in social media platform dynamics (across all social media, not just Facebook). Whistleblower Frances Haugen's last data scientist role was on Facebook's Threat Intelligence Counter-Espionage team.
Facebook and Technical Platform Issues
Facebook made a statement that they do not have to tell the truth about their algorithm. ~ Frances Haugen at the Senate subcommittee hearing, October 5, 2021
The issues introduced by the algorithm change of 2018 can absolutely be addressed by adjusting the technology with appropriate oversight; however, unmanaged artificial intelligence is creating problems that harm vulnerable personas on Facebook. Humans take interest in building solutions without fully appreciating their impact. It is in the "continuous improvement" cycle that we need specialists who address issues of impact.
MSI stands for Meaningful Social Interaction. Facebook has a flat structure with shared metrics that people work to move, and MSI is a metric that tends to reward content that puts people at risk; a lack of leadership amplified the problem. Statistically, as social media time rises, so does the risk of depression, escalating negative body image, and suicide among vulnerable groups.
TechCrunch shared the options for viewing the difference between the algorithmic feed and the no-algorithm (chronological) feed. Taking "likes" off will not solve the problem: if you want more likes and more shares, you produce more content. Most people do not realize how the algorithms push dangerous content. Facebook likely did not intend this, but an unintended result is that the features promoting engagement also increase the risk of social comparison.
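To make the mechanics concrete, here is a minimal sketch of what engagement-based ranking looks like in principle. The field names and weights are illustrative assumptions, not Facebook's actual MSI formula; public reporting has only described that "active" interactions such as comments and reshares are weighted far more heavily than likes.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    likes: int
    comments: int
    reshares: int
    age_hours: float

# Hypothetical engagement weights: comments and reshares count far more
# than likes, mirroring public descriptions of MSI-style ranking.
WEIGHTS = {"likes": 1.0, "comments": 15.0, "reshares": 30.0}

def engagement_score(post: Post) -> float:
    """Score a post by weighted engagement, decayed by age."""
    raw = (WEIGHTS["likes"] * post.likes
           + WEIGHTS["comments"] * post.comments
           + WEIGHTS["reshares"] * post.reshares)
    return raw / (1.0 + post.age_hours)  # newer posts rank higher

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order a feed by descending engagement score (algorithmic feed)."""
    return sorted(posts, key=engagement_score, reverse=True)
```

Because reshares dominate the score in a scheme like this, a provocative post that gets reshared widely can outrank calmer content with many more likes, which is the dynamic the testimony describes.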
Facebook's own research undermines the claimed value of its features. We need to write new rules that better balance the organization and its metrics. Facebook needs to measure what matters, establishing technical platform rules around misinformation and addressing its underinvestment in languages other than English.
- New York Times reported that Facebook was using its platform to promote misinformation about themselves
- Efforts against COVID vaccine misinformation on social media are limited by an overreliance on artificial intelligence
- Teenagers coach each other not to post too often, and report that they cannot be authentic
- NPR reported that Facebook knew the potential for explosive violence was very real [on Jan 6]
Social and Family in Vulnerable Groups
In today's Senate testimony, Frances Haugen testified that Facebook generates self-harm and self-hate, particularly among teenage children, and especially young girls around the age of 13. The most vulnerable users are those who are young, very sick, or recently widowed. My son had shared this observation, along with several Russell Brand thought pieces on the topic, a few years ago.
Facebook has had very important insight into these adverse, consumer-oriented social impacts since 2019. Teens are constantly being told they are not good enough, to encourage them to buy products. Facebook's own studies found that it worsens body image issues for 1 in 3 teenage girls. Among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced the desire to kill themselves to Instagram.
"Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse. Comparisons on Instagram can change how young women view and describe themselves." ~ Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show – WSJ
Online cues in social media have normalized an insensitivity generated from "likes" and "follows", and teenagers are subject to a new set of challenges in their interactions. Most parents never experienced social media as teenagers themselves, which puts extra stress on children, as well as their parents, to figure out how to manage it.
Facebook knows Instagram is toxic for teenagers, teenage girls in particular. The company has in-depth knowledge of the problem, yet downplays the issue in public. Facebook's internal research shows that under engagement-based ranking, a user persona that starts with an interest in healthy recipes is pushed by the algorithm toward extreme dieting content. This is a problem when that user is a teenager with body image issues (body dysmorphia or anorexia), who is very quickly fed information that can fuel an eating disorder, for example.
The Psychology of Social Sharing
82% of the United States population has a social media profile. The experience becomes psychological rather than social when users feel they cannot control how they are informed. Social media has been used to expand profiles into fictitious identity personas, and we measure ourselves not against our own efforts but against the responses of peers whose lives appear so great.
- Adolescent Loneliness Is Skyrocketing – Is Tech the Culprit?
- Take a break to focus on yourself, or help your child be mindful about becoming their best self
- Remind yourself that you are only seeing highlights and others' exaggerated successes
- Teaching Kids to Be Smart About Social Media (for Parents) – Nemours Kidshealth
- Is Instagram Toxic for Young Girls’ Mental Health?
- ‘My Brain Has Too Many Tabs Open’ launches – Digital Detox – Time to Log Off
- How to Kick a Mindless Scrolling Habit
- Social Media Addiction: What It Is and What to Do About It