What Tech Companies Can Learn From Facebook’s Whistleblower Incident

Facebook, the social media giant and arguably the biggest Internet platform, is regularly in the news for its shady business practices. But the Frances Haugen whistleblower incident takes the cake.

In case you needed a recap, Frances Haugen, a former Facebook product manager, aired the company’s affairs on live TV before Congress. Her testimony confirmed what many people suspected: Facebook does not care about its users, only about their data.

Haugen, a well-known name in Silicon Valley who has held senior positions at Yelp, Pinterest, and Google, alleged that Facebook systematically exploited its users’ emotional insecurities for profit. She worked at Facebook for two years before resigning in April, having grown increasingly skeptical of the company’s slippery ways. Haugen made it clear that the social network’s algorithm is designed to incite discord and build an environment of toxicity.

The whistleblower backed these claims with leaked internal documents. The documents show Facebook’s own engineers raising concerns that the algorithm was producing “unhealthy side effects on important slices of public content, such as politics and news.”

Haugen is also concerned that Facebook reverted the changes instead of opting for unbiased, organic engagement on its platform. The company’s algorithm promotes content that is most likely to provoke a reaction from its users, which means the most engaging content is often contentious, upsetting, and even false or misleading.

But what does it all mean for other tech companies and the future of technology?

There are four big takeaways for tech companies:

  1. Customers matter more than shareholders
  2. Employees have the power to inform proper authorities when they feel a company is misleading its customers
  3. A company’s reputation can change overnight
  4. The right and left can come together to govern major tech monopolies

Let us see how each of these takeaways can shape the future of tech and what other companies can learn from Facebook’s fiasco.

Customers matter more than shareholders

According to Haugen, Facebook’s own research revealed that 13.5 percent of teenage girls in the U.K. said their suicidal thoughts increased after they began using Instagram.

According to another leaked survey, 17 percent of teenage girls reported that their eating disorders worsened after they started using Instagram.

According to Facebook’s research, first reported by The Wall Street Journal, roughly 32 percent of teenage girls said that when they felt bad about their bodies, Instagram made them feel worse.

Haugen’s revelations made it clear that Facebook cared very little about its users and often chose profits over privacy.

Michigan State University researchers discovered a substantial link between social media use and addictive behavior.

Excessive Facebook use is linked to addictive behavior comparable to that seen in heroin and cocaine abusers, according to a study published in the Journal of Behavioral Addictions titled “Excessive social media users display poor decision making in the Iowa Gambling Task.”

The worst part is that Facebook knows how addictive its platform is. It is designed that way on purpose. Facebook has been abusing its users for the better part of the last decade; just look at the Cambridge Analytica scandal of 2018.

Other tech companies need to be aware that they are all easily replaceable, no matter how big they are or how many users visit their platforms daily. A business is only as good as its customers, and loyalty can only go so far.

Many Facebook users have started to reevaluate their dependency on social media platforms. One report shows that as many as 40 percent of U.S. users have taken a break from checking the app for several weeks on end. To make matters worse for Facebook, 44 percent of its younger users in the United States have deleted the app from their phones entirely.

Other tech companies must realize that competition is tough. The whole operation grinds to a halt if users start leaving in droves.

Employees can and will inform the proper authorities

Many governments have passed laws protecting the anonymity and job security of whistleblowers. They consider whistleblowing a cornerstone of a democratic state: a fair and objective safeguard that a just society needs.

But employees, especially former employees, can abuse this system of trust. It is the reason why protecting corporate data when an employee resigns is so crucial.

They can blow the whistle unfairly or needlessly as a way to get even, deliberately leaking an organization’s private workings or its intellectual property.

This can devastate a business’s internal operations, cause unnecessary financial losses, and burden shareholders. So businesses need to be careful about what information they share with their employees.

Employers also need to be mindful of how much access each employee has to the company’s internal data. It can spell the end for a business if such information falls into the wrong hands.

XNSPY is a remote monitoring app built with these employer concerns in mind. It can help businesses that suspect employees of stealing confidential information or wasting valuable company resources.

Employers may use it to monitor their employees to ensure they are not engaging in criminal activities or sharing private, personal information with others without authorization.

XNSPY’s email monitoring also helps ensure that workers use their email only for work-related purposes and are not idly browsing the web. Employers can also go through emails to see whether workers are selling intellectual property to competitors.

XNSPY comes with a slew of features: it can monitor SMS messages, phone calls, GPS location, screen recording, social media, network traffic, and multimedia on a device. XNSPY can also control the device remotely, turning on the microphone or the smartphone’s camera, removing or installing applications, and locking the device.

Screen recording is a neat feature not implemented in most remote monitoring apps. Employers can use it to capture live screenshots of their employees’ phones for future reference. It is a relief to know that workers are not wasting valuable company resources or sharing private company information with third parties. Screen recording also gives employers a way to track the websites their employees visit.

Employers also benefit from not being spotted, since the app works in stealth mode. That can be quite advantageous when subtlety is needed: with XNSPY, employers can check the smartphone of an employee they suspect of selling intellectual property to a competitor without fear of being caught.

This is also why many remote-tracking apps have a built-in stealth feature. The option is necessary when employers want to track which websites their employees visit.

Intellectual property theft is unlawful, and it can hurt a company’s reputation and workflow. It can also significantly impact a company’s market share.

A company’s reputation can change overnight

The whistleblower incident was a long time coming for Facebook, a company marred by controversies. But the incident caused enough chaos that many users started deleting the app entirely. It goes to show how fickle the tech world is.

Companies need to be aware that their reputation can change overnight, for better or worse. That is why major players like Tesla, Google, and Apple keep teams of PR representatives to hold down media pressure.

They are the front line, holding a company’s hand during tough times. Companies need to remind themselves that sometimes no news is good news.

The left and right can come together to govern major tech monopolies

Usually, Democrats and Republicans cannot agree on anything, but this time things are different. Both wings of the U.S. political sphere have agreed that Facebook violates the moral and ethical code by which the Internet is supposed to be governed.

Democrats and Republicans are particularly looking to overturn a decades-old law known as Section 230 of the Communications Decency Act, which protects social media businesses from being sued over what their users publish.

Social media companies will now have to be wary of what kind of content gets posted on their platforms. That content will face more scrutiny from the public and from government officials, which means these companies will need more censorship and regulation on their sites.

They will need to hire dedicated moderation teams and set up guidelines to decide which content stays up and which gets removed. That translates to more expenses and more resources put to the test.

Leave it to Facebook to unite the left and right.
