Data privacy is inseparable from your personal privacy. We live in a world of surveillance capitalism, where interested app owners can easily collect and monetize online data, and those data sets can be colossal.
Yet users are misled in the name of security. In reality, there is little to stop prying into your personally identifiable information (PII).
Recently, Roe v. Wade was overturned by the US Supreme Court. The ruling underscores the pressing need for comprehensive federal privacy legislation. With the right law in place, the hacking and manipulation of individuals’ sensitive information could fairly be put to an end.
In the wake of Dobbs v. Jackson Women’s Health Organization, people seeking abortions are exposed to risk because of existing, unprotected data collection practices. Multiple apps monitor menstrual cycles and reproductive health, and the back-office teams behind them collect data that can reveal a user’s pregnancy status. The app Flo had to settle with the Federal Trade Commission (FTC) for sharing users’ sensitive details with companies like Facebook and Google. App owners like these remain free to sell data to third parties for targeted advertising.
But the use of that data can go far beyond the apps themselves. Brokers sell people’s location data, text messages, and online activity gathered through reproductive health apps. SafeGraph, for instance, sold location data on visitors to some 600 Planned Parenthood clinics, and Facebook has been involved in collecting data on people visiting the websites of crisis pregnancy centers.
These and many other documented cases show that data is not safe in the hands of app owners.
Some people do not publicly disclose their gender or sexuality because of the risks involved. Yet data collection companies’ back-office teams still track and sell that information. Recently, a Catholic news outlet obtained Grindr’s geolocation data and used it to monitor the phone of a closeted Catholic priest, who later resigned his position. The same app, Grindr, sold users’ data to third parties and shared users’ HIV status with two external companies. Technologies such as dockless bikes and scooters also put LGBTQ+ data at risk.
Parental surveillance tools can likewise endanger members of this community, because they can infer gender and sexuality from online searches and activity.
Facebook’s (FB’s) ad systems can also absorb historical data that reflect systemic discrimination, so FB’s algorithms end up deciding who gets to see certain products and services.
On the basis of that data, for example, FB’s targeted-ad options allowed companies such as Uber to show job ads to young men while excluding female, non-binary, and older male job seekers.
The same discrimination is visible in housing and other services. In 2019, the US Department of Housing & Urban Development (HUD) brought a case against FB (now Meta) for withholding housing adverts from people based on protected characteristics, including race.
Muslim Americans’ privacy is no better protected. The prayer app Muslim Pro, the dating app Muslim Mingle, and many others made money by selling their users’ personal location data to US military and defense contractors. Unfortunately, only 5 out of 50 such apps encrypted that information in any way; the rest simply profited from selling it. The NYPD has tracked users’ locations and names to decide whether they showed any signs of “radicalization.”
It is crystal clear that the online surveillance of religious minorities goes almost entirely unchecked, and it is clear abuse. Their personal and intent data circulate freely across the public and private sectors, where back-office analysts can draw insights that are misused against them in today’s digital economy.
Law enforcement officials also collect social media and location data on activist groups. It is a clear attack on civil rights, and it undermines the effectiveness of those activists and protesters.
The killing of George Floyd, which sparked outrage among Black communities, is a telling example. At the time, the FBI used a geofence warrant that returned the location data of every Android user who passed by the headquarters of the Seattle Police Officers Guild (SPOG).
In another incident of the same kind, The Intercept traced surveillance documents on Black Lives Matter activists dating back to the 2014 protests.
All of these incidents clearly signal that, with privacy laws this inadequate, almost any organization can easily obtain online data. That gap has greatly increased the need for proactive legislation; only then can app data be put to appropriate use. Otherwise, the practice will keep threatening civil rights and society.
At present, data privacy policies and laws regulate how companies handle data, but unregulated surveillance programs can still severely abuse civil rights.
America’s law enforcement facial recognition networks cover more than 117 million adults, and roughly one in four state and local law enforcement agencies falls within that radius. Private companies like Clearview AI, a leading commercial provider of facial recognition technology, freely scrape publicly available pictures from websites and have easy access to visual data from niche commercial companies and various data brokers.
Unfortunately, such data can prove disastrous when merged with data obtained from other surveillance tools. Analysis of that combined data can threaten innocent civilians’ lives, especially Black and Hispanic people.
All of these cases and examples make clear why you should consider data privacy when outsourcing back-office virtual assistance. Ensure that your data is encrypted while in transit, and that the virtual back-office assistant will not use it once your tenure is over. This is the only way to protect your privacy and your users’ personally identifiable information.
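As a rough illustration of what “encrypted in transit” and “no reuse after the engagement” can look like in practice, here is a minimal Python sketch. It assumes the third-party cryptography and requests packages, and the provider endpoint shown is purely hypothetical; the point is that the back-office provider only ever handles ciphertext you can revoke access to, and the upload itself insists on a verified TLS connection.

```python
import requests
from cryptography.fernet import Fernet

# Generate and store this key yourself; never hand it to the provider.
key = Fernet.generate_key()
cipher = Fernet(key)

pii_record = b'{"name": "Jane Doe", "email": "jane@example.com"}'
encrypted_record = cipher.encrypt(pii_record)  # the provider only ever sees ciphertext

try:
    # Hypothetical provider endpoint; verify=True rejects invalid TLS certificates,
    # so the payload also stays encrypted in transit.
    response = requests.post(
        "https://backoffice.example.com/upload",
        data=encrypted_record,
        timeout=10,
        verify=True,
    )
    response.raise_for_status()
except requests.RequestException as exc:
    print(f"Upload skipped or failed: {exc}")

# After the engagement ends, only the key holder can still read the data.
print(cipher.decrypt(encrypted_record).decode())
```

The design choice to encrypt before upload, rather than trusting the provider’s own storage, is what keeps the data useless to anyone who obtains it once your contract is over.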
Data privacy during back-office assistance is therefore a matter of great concern. Lax handling can lead to the misuse of your users’ personally identifiable information, and a third party’s commercial purposes can defame your company, because your users shared their information in trust.