Ethics as a science is as ancient as our world.
And as tedious – also like our world :)
But the curious thing about ethical concepts and principles is their flexibility, their ability to adapt to real-life conditions.
Every time a new field or sphere appears on the horizon, ethics, like a many-headed hydra, grows a new head. The same story has repeated itself with weapons, nuclear science, genetics, biochemistry, etc. Right now that powerful wave has reached the IT sphere.
IT ethics is a whole complicated field that deserves a separate article.
But today we’re gonna talk about ethics in data scraping.
Big data is the new black!
Everyone on the market scrapes, buys, and sells consumer data, sometimes even without consumers’ permission (Cambridge Analytica is a vivid example).
So it’s not surprising that users want to cut themselves off from social media and other annoying platforms.
The concept of data scraping ethics appeared a few years ago – at the peak of the Big Data trend – and has already received many reviews, both positive and negative.
In today’s world, every company that strives to be successful needs to buy and use customer data, AI, and marketing strategies at the same time.
What should users do to protect themselves from aggressive data scraping?
The Internet is full of instructions on how to properly cut yourself out of Facebook (it’s not as easy as it may seem) or how to use the Internet while avoiding Google services and not letting them collect data about you. In an attempt to get rid of their digital shadow, people even move to new places, switch jobs, and use several different phones and laptops. This exciting game of cat and mouse with corporations attracts many enthusiasts around the world who, following Edward Snowden’s example, unscrew the cameras and microphones from their smartphones.
However, given the current scale of data gathering and the approaching era of the Internet of Things, when even the laces on your shoes are connected to the network, it will simply not be possible to remove yourself completely from the Eye of Sauron: only the dead or hermits leave no digital footprint.
Now everything depends on whether states and corporations can agree among themselves on how our data should be treated.
Why it is dangerous to leave traces on the Internet
Dissidents, spies, criminals, and journalists – all of them are interested in keeping their digital fingerprints clean and their actions on the Internet anonymous, primarily out of concern for their own safety. For most ordinary users, a digital footprint does not carry such an obvious danger, and yet hundreds of thousands of people around the world are concerned with erasing their “digital shadow”.
- First, a thoughtlessly left digital footprint compromises the security of even law-abiding users. Using the digital trail, attackers can hack personal accounts and gain access to bank accounts, private correspondence, and work data. Online harassment, doxing, stalking – all of these dangerous practices are largely possible because of the victim’s “digital shadow”.
- Second, the “digital shadow” forms a reality tunnel around the user that can restrict, dull, and radicalize them.
- And third, the “digital shadow” is the main source of information for corporations like Facebook and Google, allowing them to turn users into an expensive product and to manage, with incredible accuracy, the attention of millions of people in the interests of third parties.
In the digital world, we are becoming increasingly aware that every action becomes part of the global community we call the Internet – a place where algorithms combine different behavioral patterns into one common picture of humanity. In other words, algorithms learn to recognize, classify, and identify things that matter to all people only because we behave in certain ways or prefer certain things.
When making decisions, the first advisers we consult are search engines, applications, and digital assistants. And they depend heavily on the “digital shadow” of our past choices.
In this sense, we certainly bear global responsibility at the individual level, even if our personal contribution is negligible. To deny this would be to deny the new transparent reality.
The harder it becomes to prevent big data from collecting absolutely all the information about our digital selves, the more obvious it becomes that we are what we do. As more information about our choices becomes available in the ocean of big data, our awareness of responsibility for our contribution should grow.
Complete power to define ourselves and all of humanity through a collective contribution to big data lies in our hands. Of course, this is a very disturbing kind of freedom. But the growing anxiety in the face of this responsibility is not so negative at all, considering how high the stakes are.
As they say, with great power comes great responsibility.
And popular bloggers would love to add: with great responsibility comes great existential anxiety.
In conclusion, it is always better to work with companies that take ethical data scraping seriously – and a bright example of that is our dearly beloved IT company LaSoft.
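As a closing illustration, here is a minimal sketch of one widely accepted practice that ethics-minded scrapers follow: honoring a site’s robots.txt rules before fetching any page. This is only an example of the principle, not LaSoft’s actual implementation; the function name, bot name, and URLs below are hypothetical.

```python
# Sketch: check whether robots.txt rules allow a bot to fetch a given URL.
# Uses only the Python standard library; names and URLs are illustrative.
from urllib.robotparser import RobotFileParser

def is_allowed(robots_txt: str, user_agent: str, url: str) -> bool:
    """Return True if the given robots.txt rules permit this user agent to fetch the URL."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())  # parse rules from text; no network access needed
    return parser.can_fetch(user_agent, url)

# Example rules: everyone may crawl, except the /private/ section.
rules = "User-agent: *\nDisallow: /private/\n"
print(is_allowed(rules, "PoliteBot", "https://example.com/articles"))      # True
print(is_allowed(rules, "PoliteBot", "https://example.com/private/data"))  # False
```

A real scraper would fetch robots.txt from the target site and combine this check with rate limiting and a clear user-agent string, but the idea stays the same: ask before you take.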