
Senate Hearing on Facebook Transparency and Accountability

How do you envision a better, more responsible social media experience? This is not really just about Facebook Inc. and its suite of products: 97% of Fortune 500 companies use social media as a primary means of business marketing and advertising. The United States Senate’s review of structured data sets and the Facebook platform algorithm makes for a very interesting discussion.

There are two sides to every story, and this discussion brings both into view. Social media usage is being analyzed. This is not just a social media issue; it is an important consideration for social media platforms and new media news outlets whose engagement-driven profitability can threaten public safety, and it touches global privacy policy, accountable age-appropriate marketing decisions, and new policy and process. Business process engineers, data scientists, and artificial intelligence can make a difference in tackling this challenge!

Facebook hides behind walls that keep researchers and regulators from understanding the truth. The public cannot validate the truth. The company hides its own data, even when asked directly how it impacts the health and safety of our children. Facebook has not earned our blind faith.

~ Frances Haugen: Facebook Testimony before US Senate

What is the United States Senate subcommittee leading this review?

The US Senate Subcommittee on Consumer Protection, Product Safety, and Data Security exists to address issues of consumer affairs and consumer product safety; product liability; property and casualty insurance; sports-related matters; and consumer privacy, data security protection, and international data transfer issues.

The subcommittee conducts related legal oversight of these areas.

Social Media Platform Owners Need to Protect Consumer Safety

The challenge with social media arises when Facebook and other social media outlets choose profit over protecting people’s safety. We need an algorithmic protection act that better manages these adverse impacts.

We need to expand the Children’s Online Privacy Protection Act to stop platforms from preying on children, much as the 1990 Children’s Television Act mandates educational television for children on commercial broadcast stations, reestablishes commercial time restrictions, and bans host selling. We cannot change the “feedback” loop, but we should not be using targeted ad selling to youth (similar to the protections in the television act).

Facebook AI Research is not the issue; the platform’s algorithm is, because it cannot run unattended effectively. The issues are complex and nuanced. Public safety, privacy, and profit over safety are why we must recognize the actual damage that has been done. This is a case of Facebook choosing to become a trillion-dollar company that profits at the expense of consumer and user safety.

How do we balance public safety policy with profit? Facebook’s company leadership knows its products harm children, create division, and weaken our democracy. Yesterday we saw Facebook down for roughly five hours, and millions of small businesses could not reach their customers. We can have global social media that connects us without tearing us apart.

Facebook has repeated conflicts between profit and safety, and the answer is always profit: Facebook chooses to grow at all costs. Facebook has repeatedly misled the public about the safety of children and the efficacy of its artificial intelligence. Facebook wants to trick you into believing that privacy protections alone are sufficient. We can accept nothing less than full transparency and public accountability.

What are the implications for data science careers? Business model challenges and this very situation pose a very interesting challenge in social media platform dynamics (across all social media, not just Facebook). Whistleblower Frances Haugen’s last data scientist role was on Facebook’s Threat Intelligence Counter-Espionage team.

Facebook and Technical Platform Issues

Facebook made a statement that they do not have to tell the truth about their algorithm

~ Frances Haugen at Senate Subcommittee Hearing October 5, 2021

The algorithm change of 2018 introduced issues that can absolutely be corrected by adjusting the algorithm, given appropriate technical oversight; however, unmanaged artificial intelligence is creating problems that harm the most vulnerable personas on Facebook. Humans take interest in building solutions without appreciating their impact; it is in the “continuous improvement” cycle that we need specialists who address issues of impact.

MSI stands for meaningful social interaction. Facebook has a flat structure, with shared metrics that people work to move. MSI is a metric that tends to reward content that puts people at risk, and a lack of leadership amplified the problem. Statistically, as time on social media rises, so does the risk of depression, escalating negative body image, and suicide among vulnerable groups.

TechCrunch shared the options for viewing the difference between the algorithmic feed and the non-algorithmic (chronological) feed. Taking “likes” away will not solve the problem: if you want more likes and more shares, you produce more content. Most people do not realize how the algorithms push dangerous content. Facebook likely did not intend this, but an unintended result is that the features that promote engagement also increase the risk of social comparison.
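To make the distinction concrete, here is a minimal sketch of a chronological feed versus an engagement-weighted feed. The post fields, weights, and scoring function are illustrative assumptions only; Facebook’s actual MSI formula has not been published.

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class Post:
    author: str
    created_at: datetime
    likes: int
    comments: int
    reshares: int


def chronological_feed(posts):
    """A 'no algorithm' feed: newest posts first, ignoring engagement."""
    return sorted(posts, key=lambda p: p.created_at, reverse=True)


def engagement_score(post):
    """Toy MSI-style score: interactions that provoke responses (comments,
    reshares) are weighted more heavily than passive likes. The weights
    are hypothetical, not Facebook's."""
    return 1.0 * post.likes + 5.0 * post.comments + 15.0 * post.reshares


def engagement_ranked_feed(posts):
    """An engagement-based feed: highest-scoring posts first, regardless
    of when they were published or how inflammatory they are."""
    return sorted(posts, key=engagement_score, reverse=True)
```

Under the second ordering, a provocative post that keeps drawing comments and reshares outranks newer, calmer posts indefinitely, which is the reward dynamic the testimony describes.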

Facebook’s own research has called the value of its features into question. We need to write new rules that better balance the organization and its metrics. Facebook needs to measure what matters and establish technical platform rules around misinformation and its underinvestment in languages other than English.

Social and Family in Vulnerable Groups

In today’s Senate Facebook testimony, Frances Haugen testified that Facebook generates self-harm and self-hate, particularly among teenage children, and especially among young girls around the age of 13. The most vulnerable users are those who are young, very sick, or recently widowed. My son had shared this observation, and several Russell Brand thought pieces on the topic, a few years ago.

Facebook has had this very important insight into its adverse, consumer-oriented social impact since 2019. Teens are constantly told they are not good enough, to encourage them to buy products. Facebook’s own studies found that Instagram makes body image issues worse for 1 in 3 teenage girls. Among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced the desire to kill themselves to Instagram.

“Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse. Comparisons on Instagram can change how young women view and describe themselves.”

Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show – WSJ

Online cues in social media have normalized an insensitivity generated by “likes” and “follows”. Teenagers are subject to a new set of challenges in their interactions. Most parents were never teenagers in the era of social media themselves. This tends to put extra stress on children, as well as their parents, to figure out how to manage social media.

Facebook knows Instagram is toxic for teenagers, and for teenage girls in particular. They have in-depth knowledge of the problem, yet downplay the issue in public. Facebook’s internal research shows that under engagement-based ranking, a user persona who starts with an interest in healthy recipes is pushed toward extreme dieting by the algorithm. The problem is that a teenage user with body image issues (body dysmorphia or anorexia) is very quickly fed content that can fuel an eating disorder, for example.
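As an illustration of that drift, here is a small, hypothetical simulation of an engagement-driven feedback loop. The catalog, the “extremity” scale, the comfort window, and the interest update rule are all invented for this sketch; they are not drawn from Facebook’s systems.

```python
def recommend_next(catalog, interest, window=0.2):
    """Toy engagement-based pick: among items close enough to the user's
    current interest to feel relevant, choose the one assumed to drive
    the most engagement (here, simply the most extreme). Illustrative only."""
    nearby = [item for item in catalog if abs(item - interest) <= window]
    return max(nearby)


def simulate_drift(steps=12):
    """Show how a mild interest (0.1 ~ 'healthy recipes') can drift toward
    extreme content (1.0 ~ 'crash dieting') when the user keeps engaging
    with whatever the feed surfaces."""
    catalog = [round(x / 20, 2) for x in range(21)]  # content extremity from 0.0 to 1.0
    interest = 0.1
    for step in range(steps):
        shown = recommend_next(catalog, interest)
        interest = 0.8 * interest + 0.2 * shown  # interest shifts toward what was shown
        print(f"step {step:2d}: shown={shown:.2f}  interest={interest:.2f}")


if __name__ == "__main__":
    simulate_drift()
```

Each round the feed surfaces the most engaging item the user will still accept, and the user’s baseline shifts toward it, so the content served grows steadily more extreme without any single step looking dramatic.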

The Psychology of Social Sharing

82% of the United States has a social media profile. Sharing stops being social, and becomes psychological, when users feel they cannot control how they are informed. Social media has also been used to inflate profiles through fictitious identity personas. We are reminded not of our own efforts, but of the responses of our peers, whose lives appear so great.

Dawn Christine Simmons | dawncsimmons.com | LinkedIn | Twitter
