
Introduction

How do we create a functional and ethical system design that uses current technology to develop better systems of information dissemination? To build a digital design system that works in symbiosis with the current technological and social order, we need to better understand how users interact with the information they encounter on social media.


This research was conducted over two phases:


  1. Post-graduation Thesis (research conducted for the post-graduate Interactive Media Management program at George Brown College (GBC); presented as a video presentation in August 2020)

  2. Post-post-graduation Research (research including the above, continued until February 2022, and presented as this document. This document also contains research material submitted to media watchdog organizations in the USA and Canada)


This research also covers observations made during the 2021 United States Capitol attack and the 2022 Canada convoy protest. Social media interactions were observed and analyzed in parallel with the events on the ground leading up to them. Some of that research was conducted for media watchdog organizations.

Misinformation

noun

 

false information that spreads regardless of any intent to mislead

Disinformation

noun

deliberately misleading or biased information; manipulated narrative or facts; propaganda

Truth

noun

 

that which is accurate or in accordance with fact or reality

 

Background

This thesis is a story of my journey through lies to grasp the importance of the truth.

 

I came from a Media & Animation background into the world of UX Research & Design at the Interactive Media Management post-graduate program at George Brown College. For the final semester's thesis topic, I chose 'Fake News & Digital Media: How the tech that is responsible for the debacle can be used to fight against it.' This thesis was a solo project at George Brown College, presented in a video presentation format, and I was extremely fortunate to have the mentorship of Xavier Masse to guide me along.

As the sole team member, all responsibilities were on my shoulders. I was the leader and a follower at the same time. Once a week, Xavier would meet up with me virtually to discuss my progress and help chart out the plan for the following week.

At the project's outset, I knew that this was not a problem that an app could fix. Moreover, this was not just a problem of information; it was deeply entangled with human psychology.


Barkun, M. (2015). Conspiracy theories as stigmatized knowledge. Diogenes, 62(3–4), 114–120. https://doi.org/10.1177/0392192116669288

A conspiracy theory is not simply a conspiracy. Barkun writes that conspiracies are "actual covert plots planned and/or carried out by two or more persons". A conspiracy theory, on the other hand, is "an intellectual construct", a "template imposed upon the world to give the appearance of order to events". Positing that "some small and hidden group" has manipulated events, a conspiracy theory can be local or international, focused on single events or covering multiple incidents and entire countries, regions and periods of history. Conspiracy theorists see themselves as having privileged access to special knowledge or a special mode of thought that separates them from the masses who believe the official account.


I needed a straightforward, to-the-point objective.

Objective:

Design a solution that uses technology and media to fight against 'fake news' by equipping the user with skills and tools to detect and avoid unsubstantiated information while emphasizing values of empathy, truth, fun, integration, and collaboration.


"Give a man a fish, and you feed him for a day. Teach a man to fish, and you feed him for a lifetime." — proverb, often misattributed to the Bible.

 

Context

'Fake News' was already a significant problem worldwide by November 2019. I had been watching it happen over the years on the internet, but now it had started to stream into mainstream print and cable news. What once were meticulously confusing conspiracy-theory plots had been reduced to insane and incredible news stories meant to distract from reality. Slowly the gatekeepers of information were discredited, and the new 'merchants of doubt' crashed the gates, using the internet as their primary weapon. Facebook, Instagram, Twitter, YouTube, 4chan and many others are all used as weapons in this information warfare.

As much as one could make inferences about where the information-sharing culture was headed by observing these platforms, there are so many platforms and so much data to track that it can seem daunting. However, over the years, many have taken up this arduous task to catalogue, study, and analyze, and thanks to their work, we can see the picture more clearly today. With that clearer picture, we are also starting to see serious repercussions of that cultural shift in how information gets shared.

 

Timeline (GBC Thesis)

  • Total Timeline: 8 months (4 months part-time + 4 months full-time)

  • January-April 2020 Part-time | May-August 2020 Full-time

  • January-April 2020 – Secondary Research

 

May 2020

  • Join Facebook Groups

  • Recruit research subjects and filter the focus group

  • Collect data on types of misinformation and track their origins

  • Measure the time it takes for a false story to go from deep dark places on the internet to the Facebook groups

 

June 2020

  • Analysis of the Research data

  • Develop Insights

  • Brainstorm Solutions

  • Develop Strategies

 

July 2020

  • Systems Map for Consumption of Information through Digital Media

  • Journey of Fake News Map

  • User Personas

  • User Journey Map

 

August 2020

  • GBC Thesis Presentation


Research Statement and Goals

 

Research Question

How do we create a functional and ethical system design to utilize the current technology to develop better systems of Information Dissemination & Implementation?


Research Statement

I want to understand better how users interact with the information they come across on social media, in order to create a digital design system that works in symbiosis with the current technological and social order.


Research Goals

  • Discover people's behaviour concerning information consumption/sharing in a group setting

  • Learn about people's current pain points, frustrations, and barriers to accessing verified and trustworthy information

  • Uncover and chart out the current approaches people use to propagate misinformation

  • Chart out the methods in which influencers monetize these Facebook groups

  • Understand what truth means to people, how they define it and why it is essential to them.

  • Evaluate how people use Facebook groups to spread propaganda, misinformation, hate, socialize with like-minded people, and where their digital journey takes them from here.

JANUARY 6th 2021

I submitted my post-grad thesis in August 2020, but I continued my research for media watchdogs in Canada & the USA. This phase yielded better research insights, partly due to more time invested and partly due to the professional setting.

The research conducted after August 2020 was very American-centric, as 2020 was a presidential election year, and we all knew Trump would go all out with a torrent of misinformation. Still, no one could have been ready for what was to come.


I woke up on 6th January 2021 with full knowledge that it would be a busy workday. I had all my devices ready to record the contents of the day. It was just 7 am. There was some time before work began, so I went to Twitter to take the temperature. American lawmakers seemed pretty oblivious to the monster they had helped create, so I told them how wrong they were.


Notice the time of every tweet below. These are my tweets on the morning of 6th January 2021. This thesis will explain how many researchers saw it coming, yet the law enforcement authorities didn’t.

Screenshot_20210106-181657_Twitter.jpg
Screenshot_20210106-181409_Twitter.jpg
Screenshot_20210106-181331_Twitter.jpg
Screenshot_20210106-181240_Twitter.jpg
Screenshot_20210106-181048_Twitter.jpg
Screenshot_20210106-180946_Twitter.jpg

With the insurrection underway, the afternoon passed and I got too busy to be on Twitter. But I did go back to make one final statement, because the Trump Twitter ban that researchers had been begging for would come any time now. It was already late, though. Very late.

Screenshot_20210106-180816_Twitter.jpg

Let's rewind from January 2021 to the GBC Thesis Research in January 2020 

Secondary Research

Depth of Field

There was already ample research material in the ether for studying conspiracy theories and disinformation. So this seemed like the right place to start: it gave me a better sense of how to chart a plan for my primary research to fill the gaps I came across in the secondary research.

Considering the wide range of people affected by fake news, I wanted to delve into various mediums and sources with a cache of research and analysis. This study would range from academic papers and books to movies and podcasts.

 

Notable Secondary Research material


Academic papers

 

Books

 

Documentaries

 

YouTube

 

Movies

 

Podcasts

 

Internet Forum

  • QanonCasualties – Reddit forum featuring experiences of people who have family or friends lost to conspiracy theories, specifically QAnon-related conspiracy theories. This forum contains thousands of stories with a lot of human behavioural overlaps. https://www.reddit.com/r/QanonCasualties/

 

Web Articles studied for Secondary Research (Click on this link to view the articles list)

Primary Research

Defining Choices

I had to scope out the project from the get-go because, in the vast network of internet information pipelines, it would have been foolish to target all platforms. However, after extensive research of the flow of information on different social media platforms, I concluded that Facebook, Twitter, and YouTube were the most significant transgressors in spreading misinformation & disinformation. Therefore, I decided to track these platforms for my primary research while focusing the most on Facebook Groups.

During Secondary Research, I identified the most significant research gap that I would want to fill. I needed direct observational data, and I needed to collect it by joining the in-groups. I chose to conduct my primary research on Facebook because it is the social media platform with the least anonymity and the most information available for observing and profiling research subjects. Unlike Twitter, YouTube, and the like, Facebook can chart the subject's likes and dislikes. Watching a person change and devolve on Facebook has been the most painful experience during this research.

To begin my primary research, I joined a Facebook group called 'Truth Seekers Canada' because I live in Canada. What drew me to observe this group was that it billed itself as a group of Canadian truth seekers, yet the group's cover page was a menacing-looking photograph of now-disgraced American President Donald J. Trump.

Trump - Truth seekers canada.jpg

The screenshots below are from the same group, which had a rule of 'No Negativity or Wild Speculation'. The group is full of just that: wild speculation. These posts bring engagement, the only metric Facebook cares about.


As observed, Jean left that conversation believing an absolutely bonkers claim made by Jesse.

IDRIS ALBA - JUSTIN TRUDEAU-split.jpg
IDRIS ALBA - JUSTIN TRUDEAU-split (1).jpg
IDRIS ALBA - JUSTIN TRUDEAU_3_jesse.jpg

I knew right off the bat that this would be a world of misinformation I had to study. While I was waiting for approval from the group's admins, the Facebook algorithm suggested more groups to join. All these groups had an aura of misinformation, bigotry, or resentment: I was recommended a 'yellow vest' group, an anti-vax group, an Illuminati group, and the like. I didn't join any of those groups, for the sake of my mental health. As I was sifting through the rabbit-hole groups, Facebook notified me that my approval had come through. Now I could scroll back in time to the first post ever made in this group, which meant I had access to the five thousand Facebook profiles in it. I did not have all the keys to their profiles, but I had more than I needed.

 

Considering the research and data collection involved, I decided to combine the following research methods to gather data and information to triangulate my findings for a much more precise analysis.

 

Research Methods & Approaches

  • Disguised Naturalistic Observation
    (Naturalistic observation is an observational method that involves observing people's behaviour in the environment in which it typically occurs) [CITATION]

  • Disguised Participant Observation
    (In disguised participant observation, the researchers pretend to be members of the social group they are observing and conceal their identity as researchers) [CITATION]

  • Mixed Research (Qualitative + Quantitative) approach for collecting and analyzing the data.

 

Choosing a Research subject

With roughly 2.85 billion monthly active users as of the first quarter of 2021, Facebook is the largest social network worldwide. In the third quarter of 2012, the number of active Facebook users surpassed one billion, making it the first social network ever to do so. [CITATION] Those who have logged in to Facebook during the past 30 days are considered active users.

Choosing subjects from a pool of almost 3 billion Facebook profiles can seem easy, but how does one find the kind of subjects I needed for my research? Facebook Groups came in very handy. I wanted to study the group and its behaviour under the influence of 'Influencers'. I wanted to chart out how misinformation gets disseminated, absorbed, and redistributed again and again in those circles, while a superficial feeling of community emanates from the users as they participate in this conjunct activity. The Facebook group 'Truth Seekers Canada' was already an established group with over five thousand users. Most of them were there because the Facebook algorithm had profiled them and suggested that they join this group of like-minded folks.


I was also looking to join a group in its nascent stages so that I could observe the beginnings of how a Facebook group gets co-opted by bad actors to monetize lies, fear and hate by exploiting ignorance and resentment. It was February 2020, and a novel coronavirus had set off the Covid-19 pandemic, throwing the world into a chaotic frenzy. The pandemic is how I came to join a group Facebook recommended, called #BackToWork, which had started right after the first Covid lockdown in Canada. Studies have confirmed that social media usage rose drastically because of all the time spent at home, as people joined various 'internet communities' to tackle the loneliness that the Covid-19 precautions had brought on. A tsunami of misinformation and disinformation was coming our way; I could already see it. It was only going to grow into bigger problems, and I knew that my scope of research would widen.

BACKTOWORK.jpg

I concluded that we could not design a product/system for every individual. I had to come to terms with the reality that certain personalities are non-persuadable. Even though this seemed to reduce the scope of the research, the depth of the study increased many-fold. The percentage of 'persuadable' people is much higher than that of 'non-persuadable' people, but that could change over time as information pipelines get more polluted. The intervention has to come as early as possible to have a higher chance of success.

Research in Numbers (Quantitative)


Research Subject Categories


Three Primary Categories of subjects

  1. Influencers

  2. Wannabe Influencers

  3. Influenced

 

Two Secondary categories for the Research subjects

  1. Persuadable

  2. Non-persuadable

 

Overview for each Primary & Secondary category

  • Influencers = mostly non-persuadable

  • Wannabe Influencers = 50/50 of Persuadable and Non-persuadable

  • Influenced = mostly persuadable

 

Facebook Groups

 

‘Truth Seekers Canada’: 5000+ members & ‘#BackToWork’: 3000+ members

 

From a pool of 8,000+ members, total members selected for Round 1 = 150 (10 Influencers + 50 Wannabe Influencers + 90 Influenced)

 

Process:

  • Facebook Group admin/s chosen by default

  • Identified the most active members of the group and profiled them

  • Four critical factors for choosing research subjects for profiling
    (ONLY access and catalogue publicly available information on the Facebook Profile for research)

    1. The subject should be from one of the two Facebook groups I joined.

    2. The subject should be an outspoken personality who engages in posting and commenting in these groups and status updates on their own public Facebook profiles

    3. The subject should have enough publicly available information on their Facebook profile to verify their identity and confirm their social status

    4. The subject should have a Facebook profile that has been active for many years and should have Facebook statuses dating back at least two years

 

Note: Subjects were cross-checked across their multiple social media platforms to measure their reach. Once verified, only the Facebook profile was tracked.
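The four screening criteria above can be expressed as a simple predicate. This is an illustrative sketch, not tooling used in the study; the field names and the activity threshold are my assumptions:

```python
from dataclasses import dataclass

# The two groups studied in this research
STUDIED_GROUPS = {"Truth Seekers Canada", "#BackToWork"}

@dataclass
class SubjectProfile:
    group: str                      # Facebook group the subject belongs to
    posts_and_comments: int         # activity in the group and on their own wall
    has_public_identity_info: bool  # enough public info to verify identity/status
    years_of_statuses: float        # how far back their public statuses go

def is_viable_subject(p: SubjectProfile, min_activity: int = 10) -> bool:
    """Apply the four screening criteria (threshold value assumed)."""
    return (
        p.group in STUDIED_GROUPS                  # 1. member of a studied group
        and p.posts_and_comments >= min_activity   # 2. outspoken, actively posting
        and p.has_public_identity_info             # 3. verifiable public profile
        and p.years_of_statuses >= 2.0             # 4. statuses going back 2+ years
    )
```

A profile failing any one criterion is screened out, which is how the pool of 8,000+ members narrows to the 150 Round 1 subjects.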

The subject referenced below has 2000+ Facebook connections and has more than 2500 followers on her Instagram. Sadly, she is one of the many victims of misinformation while simultaneously being its aggressive purveyor.

dia1.jpg
dia2.jpg

The 150 subjects were observed for four weeks while accessing their past activity in their Facebook statuses. In Round 2, I wanted to filter out the 30 most viable subjects for the study. Again, the intention was to create multiple focus groups to concentrate the research on.
The 30 split as 5 Influencers + 10 Wannabe Influencers + 15 Influenced

 

Research in User Interaction and User Experience (Qualitative)

Qualitative research was the backbone of the primary research because of the vast number of potential target users for the hypothetical product/design system I was going to design. Therefore, it was imperative to have a quality focus group to study. It was equally essential to triangulate any analysis with random A/B testing against non-focus-group subjects. The non-focus group consisted of the 120 subjects from the initial 150 who did not make it into the final focus group of 30 (called Focus 30).

Qualitative research analysis was primarily driven by observing the group’s activity and social interaction within the Facebook groups. Initially, I was a passive observer of the group, tracking the specific topics that generated the most engagement while keeping track of the group’s most influential members.

 

Elements Tracked within the groups

  • Conspiracy theories posted and their format (meme, long winding post, uploaded video, link to a video, screenshot of another source)

  • Track the conspiracy theory as far back to the source as possible

  • Track members who post more frequently and newer conspiracy theories

  • Identify and analyze the profiles of people that engage the most when a new conspiracy appears

  • Keep track of the change in the activity of conspiracy theories concerning actual events in the outside world

  • Analyze the comment section to note how one member of the group convinces many others to believe an utterly implausible idea

  • Note the nature of the content (web articles, videos, memes and long-form posts) that gets passed around as evidence of a conspiracy
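Taken together, the tracked elements amount to a small catalogue record per post. A hypothetical schema (all field names are mine, not from the study):

```python
from dataclasses import dataclass, field
from typing import Optional

# Post formats observed in the groups, per the list above
FORMATS = {"meme", "long_post", "uploaded_video", "video_link", "screenshot"}

@dataclass
class TrackedPost:
    theory: str                          # conspiracy theory the post promotes
    fmt: str                             # one of FORMATS
    poster_id: str                       # who posted it (pseudonymized)
    traced_source: Optional[str] = None  # furthest-back origin found, if any
    engaged_profiles: list = field(default_factory=list)  # who engaged with it
    real_world_event: Optional[str] = None  # outside event an activity spike tracks

    def __post_init__(self):
        # Reject formats outside the observed set, so the catalogue stays clean
        if self.fmt not in FORMATS:
            raise ValueError(f"unknown format: {self.fmt}")
```

Cataloguing posts as uniform records like this is what makes it possible to later correlate theory activity with real-world events and to spot the most frequent posters.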

 

Characteristics to note on the Facebook Profile of the individual subjects (Data Points to collect)

  • Note the Bio, Profile picture and cover picture. At least one of those elements will hint at their indulgence in conspiracy theory thinking and propagation.

  • The most recent five posts you see will at the very least allude to a conspiracy theory, if not outright promote it [Seema]

  • The subject will post very frequently but in many short bursts.

  • The subject will often engage in arguments with their friends and family on their Facebook profiles, often leading to bitter endings. In addition, they will usually resort to using aggressive language before the friend or family member excuses themselves from the discussion or, in some cases, from their lives.


How do we find out what they are looking at in their Facebook feed?

Due to the highly customizable nature of the Facebook algorithm, we have to make calculated assumptions based on the available data. We find out what subjects are looking at in their Facebook feed by checking their 'About' section for their interests and likes, looking for the data points from the list below. The list focuses on American, Canadian and British personalities/entities, as they tend to influence across borders thanks to the absence of linguistic barriers. By no means is this an exhaustive list, but it does cover the most popular and wide-reaching transgressors that lead folks down the rabbit hole of conspiracy thinking. They all use the same tactic of JAQing off.


JAQing off
The act of asking leading questions to influence your audience, then hiding behind the defence that they’re “Just Asking Questions,” even when the underlying assumptions are entirely insane.


It is imperative to understand that these personalities do not stick to a specific topic within disinformation circles. Most of them, if not all, keep tabs on information trends and choose the most divisive issues, usually leaning towards 'Culture Wars'.


The Disinformation Dozen are twelve anti-vaxxers who play leading roles in spreading digital misinformation about Covid vaccines. The "disinformation dozen" network has a combined following of 59 million people across multiple social media platforms. They are listed as follows:

Joseph Mercola, Robert F. Kennedy, Jr., Ty and Charlene Bollinger, Sherri Tenpenny, Rizza Islam, Rashid Buttar, Erin Elizabeth, Sayer Ji, Kelly Brogan, Christiane Northrup, Ben Tapper, Kevin Jenkins

Source: The Disinformation Dozen | Center for Countering Digital Hate (counterhate.com)

 

Internet Personalities: American

Alex Jones, Ben Shapiro, Joe Rogan, Steven Crowder, Dave Rubin, Bret Weinstein, Tim Pool, Kelly Brogan, Candace Owens, Mike Cernovich, Jack Posobiec, Andy Ngo, Bari Weiss, Dinesh D’Souza, Liz Wheeler, Benny Johnson, Ron Watkins, Jim Watkins, Donald Trump Jr.

 

Internet Personalities: Canadian

Gavin McInnes, Stefan Molyneux, Ezra Levant, Faith Goldy, Lauren Chen, aka Roaming Millennial, Karen Straughan, Gad Saad, Lauren Southern, Jordan B Peterson


Internet Personalities: British/European

Paul Joseph Watson, Nigel Farage, Katie Hopkins, Russell Brand, Carl Benjamin aka Sargon of Akkad, Milo Yiannopoulos, Stephen Yaxley-Lennon aka Tommy Robinson

Media Channels

Infowars, Breitbart News, Fox News, Zerohedge, Daily Caller, Drudge Report, Rebel News, Druthers, Red State

 

Fox News Personalities

Tucker Carlson, Sean Hannity, Greg Gutfeld, Laura Ingraham, Jesse Watters, Dan Bongino, Mark Levin

 

Politicians

Vladimir Putin, Donald J Trump, Jim Jordan, Matt Gaetz, Louie Gohmert, Ron DeSantis, Marjorie Taylor Greene

 

We learn the interests of the subject from these likes/data points. Then, based on that information combined with astute observation of the subject's Facebook activity, we can make a calculated assumption about the kind of content the Facebook algorithm curates for the user.
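The matching step described above, comparing a profile's public likes against the watchlist of personalities and outlets, boils down to a set intersection. A minimal sketch, with an abbreviated, illustrative watchlist (the full lists are given above):

```python
# Abbreviated sample of the watchlist entries given in the lists above
WATCHLIST = {
    "Alex Jones", "Infowars", "Breitbart News", "Fox News",
    "Gavin McInnes", "Rebel News", "Paul Joseph Watson",
}

def watchlist_overlap(public_likes: set) -> tuple:
    """Count how many of a profile's public likes appear on the watchlist."""
    hits = public_likes & WATCHLIST  # set intersection
    return len(hits), hits

# Example: a profile that likes two watchlist entries and one unrelated page
count, hits = watchlist_overlap({"Infowars", "Fox News", "Gardening Tips"})
# count == 2; a higher count supports the calculated assumption that the
# feed algorithm is curating similar content for this user.
```

The overlap count is only evidence for an assumption, not proof; as the text notes, the feed itself is not observable, so inference from likes is the best available proxy.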

 

 

What is the algorithm doing?

A unique aspect of every social media service is the personalized, customized algorithm. The made-to-order product gets delivered multiple times every day, every time a user interacts with the service. That is to say, Facebook is not a product that a user consumes when they create a Facebook account. Instead, Facebook is the medium that delivers the product.

 

If Facebook is not the product, then what is? The user-generated content, the interactions and engagement between users, and the advertisements that tag along are the product. Every time a user interacts with the Facebook app on their mobile device, they provide new information to the algorithm. This latest information adds to the existing profile to create a new product delivered to the user every day. This customization is core to all social media services, as their revenue depends on serving the most relevant ads to their users. The only way a user will see an ad is if the user is active on the app, so keeping the user engaged is the only profitable path for any 'free-to-use' social media service. Different services use different approaches to entice their users and keep them engaged. For example, the YouTube algorithm will keep auto-playing content irrespective of quality as long as the user keeps ingesting it. The YouTube algorithm is not trying to get the user engaged in the comment section; if that happens, it's a bonus.
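This engagement loop can be illustrated with a toy ranker. This is not Facebook's actual, proprietary algorithm, just a sketch of the self-reinforcing dynamic described above: items are scored by prior engagement weighted by the user's topic affinity, and every interaction raises that affinity, so similar content ranks higher next time.

```python
from dataclasses import dataclass

@dataclass
class FeedItem:
    item_id: str
    topic: str
    base_engagement: float  # likes/comments/shares the item already has

def rank_feed(items, user_topic_affinity):
    """Order items by predicted engagement: prior engagement weighted by
    how much this user has interacted with the topic before."""
    def score(item):
        return item.base_engagement * (1.0 + user_topic_affinity.get(item.topic, 0.0))
    return sorted(items, key=score, reverse=True)

def record_interaction(user_topic_affinity, item):
    """Every interaction nudges the topic affinity up, closing the loop:
    what you engaged with yesterday shapes what you are shown today."""
    user_topic_affinity[item.topic] = user_topic_affinity.get(item.topic, 0.0) + 0.5
```

Run the loop a few times and a topic the user engages with climbs above objectively more popular items, which is the rabbit-hole-by-design effect Chaslot describes.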


In this interview, AI expert Guillaume Chaslot, who helped write YouTube's recommendation engine, explains how those algorithmic priorities spin up outrage, conspiracy theories and extremism. Guillaume says, "It was always giving you the same kind of content that you've already watched. It couldn't get away from that. So you couldn't discover new things, you couldn't expand your brain, you couldn't see other point of view. You were only going to go down a rabbit hole by design." [CITATION]

 

On the other hand, the Facebook algorithm keeps the user engaged with their Facebook friends and family and with their Facebook groups. The information hellscape we live in is the culmination of multiple social media companies moving fast and breaking things. The algorithm that YouTube started with years ago has evolved, and today everyone from Instagram to TikTok designs similar tactics and strategies into their algorithms. They have broken lifelong relationships, broken trust in institutions, and eventually that may break democracy.


“An informed citizenry is at the heart of a dynamic democracy.” ― attributed to Thomas Jefferson

Tammy Case Study (May 2020)

Tammy2.jpg
Tammy3.jpg

Tammy’s case is typical of disinformation and conspiracy theory influencer behaviour. She was one of the group admins for the group Truth Seekers Canada. This group was a QAnon group at its core, but it disguised itself as a Pastel QAnon group. It eventually got banned in October 2020 during the broad sweep of QAnon accounts across Facebook, although by July 2020 Tammy was already very busy across multiple social media platforms recruiting people into the QAnon conspiracy theory.


With approximately 7,000 followers on Twitter and 1,200 followers on Facebook, Tammy fed many vulnerable people a lot of disinformation, negatively impacting the lives of those unsuspecting folks. She was also a frequent poster in the Truth Seekers Canada group.

Tammy1.jpg
Tammy twitter.jpg

Tammy would frequently make posts like the ones below in the Truth Seekers Canada group. Being an admin meant that every time she posted, most group members either received a notification or, at the very least, the algorithm would push the post higher up in their feeds. Tammy’s posts always got good engagement, and with every new post she was successfully mainstreaming fringe ideas one post at a time, just as a steadfast Digital Soldier would.

Tammy5.jpg

In typical Pastel QAnon fashion, Tammy would regularly post #savethechildren content, as those posts tend to pull in a larger audience — after all, who doesn’t want to save the children? The only problem is that they are not saving the children, as this reporting attests, along with many others.


Tammy4.jpg
Tammy6.jpg

Her Twitter account got suspended during Twitter’s QAnon ban in late July 2020.

Tammy7.jpg

Diane Case Study (May 2020)

Diane (last name redacted) is a woman in her sixties and a mother. She lives in Brampton, Ontario. The following images are just a tiny fraction of people’s experiences on Facebook and of how Facebook facilitates the infection of the hearts and minds of naïve folks. In this case, Diane is an ‘Influenced’ persona; had she not ended up in this group, she could have remained a ‘persuadable’, but I fear that the brainwashing through love-bombing has converted her into a ‘non-persuadable.’


In this one single conversation thread, Diane goes from being heartbroken to updating her post an hour later, thanking the group for the support she received. The support she received is typical cult behaviour. Even though these Facebook groups are too large to be labelled as cults, it is almost impossible to ignore how closely their functioning parallels one.


I cannot stress enough how much people avoid posting their personal relationship losses as statuses on their own Facebook profiles. However, the comment sections in these private Facebook groups are where their confessions and feelings get freely expressed. Most of the research and analysis about social behaviour in Facebook groups happened in the comment section, because that is the place where filters don’t exist, as the following images attest.


There is a significant loss in relationships for many victims of misinformation, but the pain gets momentarily masked by the superficial support received through these Facebook groups. It is important to note that the folks showing support to Diane do not even live in the same city as her; most live in a different province in Canada and some outside of Canada. So, even though it might seem like Diane can make 5000 new friends, she has not made a real connection through this group while also losing the relevant and close-knit bonds she had with her children, family and friends.

diane1.jpg
diane2.jpg
diane3.jpg
diane4.jpg
diane5.jpg

The image below shines a light on many of the keywords this research targets: Truthers, Light Workers, Digital Soldiers and Awakened Souls. Based on the research, this group member is an influencer in his own mind but should be categorized as a ‘Wannabe Influencer.’ He tends to go on these nonsensical rants in many other comment sections, yet he never gets called out, because one thing these kinds of Facebook groups have in common is confirmation bias and a lack of intellectual opponents. Anyone who dares question the false and nonsensical narratives pushed in these silos gets ridiculed, insulted and eventually blocked from the group. By keeping the fact-checkers out, they create fertile ground to indoctrinate unsuspecting, vulnerable people.


It is easy to dismiss this as a negligible problem confined to the dark corners of the internet, but it is not in the dark corners. It is on Facebook, and everyone is vulnerable.

diane6.jpg
diane7.jpg

The above image shows how supported Diane felt in those moments.

 

Unfortunately, Diane has continued on the same path of contrarian and conspiratorial thinking she was on during 2020, because a timely intervention was not available to her, and she still has not found an exit from the bubble.

diane8.jpg

DOCUMENT UNDER CONSTRUCTION

PLEASE CHECK BACK IN FUTURE FOR MORE

Coming Up...​

  • More Case Studies

  • Analysis and synthesis

  • Assumptions and biases

  • Impact

  • Reflections

  • Next steps and recommendations
