
Nearly twenty years ago, as the Internet was moving towards a prosperous future, Amazon was criticized for charging existing customers higher prices. Exposure of that incident did not make this kind of marketing fade away. Instead, as society has become digitalised, it has become easier for companies to practise price discrimination through ever more precise algorithms, sometimes without disclosing anything to their customers. Given that big data analysis is now applied, to varying degrees, on a global scale, data-based discrimination has been raised and discussed in many contexts in recent years. Business decision-makers may regard this behaviour as a legitimate marketing tactic so long as it does not incur mass accusation, yet a real concern remains for today and the coming years: ever more accurate targeting may leave people feeling overwhelmed by fear and inequality. Accordingly, this essay will argue that price discrimination, such as "big data killing" and group-based bias, can erode trust within the consumption system and harm society as a whole.

Understanding Big Data Killing
Big data killing is a form of price discrimination that has reached a new level in the realm of digital technology. It refers to regular customers being quoted higher prices, or offered fewer coupons, than new ones, especially in online shopping. From a marketing perspective, such price differences can be called personalized pricing: customers receive different prices for the same product or service, at the same place and time, according to their type. Although it is seen as a more serious problem in the present age, this method of marketing is not novel; it has long existed in offline shops in everyday life. "Three for two pounds" is a typical example: consumers buy more to save money while retailers gain higher profits. With that in mind, it is not difficult to see big data killing as a variant of personalized pricing enabled by big data. Unlike the price labels of bricks-and-mortar stores, online quotes are presented one-to-one, providing a breeding ground for price discrimination that is largely invisible to users. As online businesses lead shopping habits, the prices of the online travel industry, such as airfares and hotel rates, have become a fine experimental field for data-based dynamic pricing. The growing volume of consumption data on these websites allows retailers to predict how much a user is willing to pay; they call it "willingness to pay" rather than overcharging.
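The mechanism described above can be sketched in a few lines. This is a purely hypothetical illustration of a personalized-pricing rule; the base price, markup rate, and cap are invented for the example and do not reflect any retailer's actual algorithm.

```python
# Hypothetical sketch of "big data killing": a naive personalized-pricing
# rule that quotes returning customers a higher price than new ones.
# All numbers here are illustrative assumptions.

BASE_PRICE = 100.0

def quote(purchase_history_count: int) -> float:
    """Return a price based on how often the user has bought before.

    A frequent buyer is assumed to have a higher willingness to pay,
    so the quote rises with purchase count (capped at +20%).
    """
    markup = min(0.02 * purchase_history_count, 0.20)
    return round(BASE_PRICE * (1 + markup), 2)

print(quote(0))   # new customer: 100.0
print(quote(15))  # loyal customer pays 120.0 for the same item
```

The point of the sketch is that nothing in the quote depends on the product itself, only on the buyer's history, which is exactly what makes the one-to-one online quote invisible to the person receiving it.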


Data-Based Bias towards Some Groups
Another kind of discrimination, racist, sexist, and biased against the poor, has in fact existed and been condemned for centuries. Carried into today's data collection, the tags that mark the identity of various human groups become accomplices that quietly amplify bias. People used to learn the ethnicity of others through appearance or actual contact, but now online service providers automatically collect large amounts of so-called basic information. Data collection and use grow as the business develops; that is to say, economic giants attract more attention when they are seen to be misusing data. Facebook responded this August to the accusation that it gave advertisers the means to exclude minority groups. Yet before such complaints, companies including Facebook had long been using data analysis to drive differentiation, even prejudice, intentionally or unintentionally. According to Reuters, an Amazon hiring programme showed bias against women. Since 2014 the team had been developing an automated tool that used artificial intelligence to screen resumes, and it later found gender inequality in its algorithm. This digital bias stemmed from the preference for men in recruitment history: the recruiting engine was trained on the previous ten years of applications, which reflected a male-dominated field. Given Amazon's dominance as a retailer, no one can be sure it does not apply a similarly flawed model that casts a shadow over its female users. Humans try to suppress subjective judgement; nevertheless, it is worth noting that the data itself may be biased to begin with.
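How a model trained on skewed history reproduces that history can be shown with a toy example. The resumes, keywords, and scoring rule below are entirely invented for illustration; they are not Amazon's data or method, only a minimal demonstration of the self-sustaining bias the paragraph describes.

```python
# Hypothetical illustration of historical bias leaking into a model.
# We "train" a trivial keyword scorer on past hiring outcomes that were
# male-dominated: words seen on rejected resumes get negative weight.

from collections import defaultdict

# (resume keywords, hired?) -- invented, skewed historical outcomes
history = [
    (["chess", "club"], True),
    (["chess", "captain"], True),
    (["womens", "chess", "club"], False),
    (["womens", "college"], False),
]

weights = defaultdict(float)
for words, hired in history:
    for w in words:
        weights[w] += 1.0 if hired else -1.0

def score(words):
    """Sum the learned per-word weights for a new resume."""
    return sum(weights[w] for w in words)

# The learned weights penalize the word "womens" itself, so an otherwise
# identical resume scores lower purely for mentioning it.
print(score(["chess", "club"]))            # 1.0
print(score(["womens", "chess", "club"]))  # -1.0
```

No one programmed a rule against women here; the penalty emerges automatically from the training data, which is the essay's point about bias that exists "to begin with".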

Consequences of Discrimination
One undesirable consequence is that digital operators may lose their expected gains, including greater profits. Public opinion about this cheating model, data-based discrimination, can exert great pressure once incidents are exposed. The most direct resistance appears as a drop in purchases, which in turn means lower sales for businesses. If companies quietly charge specific customers more in order to boost revenue, they may end up accomplishing the very opposite. Apart from the financial issue, future development may also suffer, since price and group discrimination, once perceived, damage brand image. Consumers in public use online resources to protest against businesses hiding in the dark. Van Dijck describes a transparency imbalance between the two sides: platforms demand verifiable and provable personal information, while registered users have little reliable idea of how their data will be used. The algorithm that produces these differences is a black box, but vivid facts about discrimination are the consumers' best defence, because online protest can cause an uproar that changes a brand's image among its original clients. Data-based prejudice may thus generate a chronic decline in customer loyalty, an outcome no business wants.

The second outcome may be the depletion of consumer trust in the interaction between businesses and customers. For sensitive consumers, the feeling of being targeted is both flattering and frightening. For instance, the retailer Target ran into trouble in 2012 when it sent coupons for baby products to a pregnant teenager, whose father assumed the company had made an absurd mistake since his daughter was still in high school. It later turned out that Target was right: the father discovered his daughter was indeed expecting. This was exactly what data analysis could do, and Target had used its "pregnancy prediction score" to target customers precisely. Years later, unlike the overt targeting and aggressive marketing of the past, targeted discrimination now happens in the dark. In the case of paying more for the same products, old customers are the target, and the bias is built on the ever-growing data of e-commerce users. Online retailers' ability to predict spending capacity leaves buyers feeling well looked after and watched at the same time. In addition, the man-made distinction among customers shows not only in prices but also in advertising: platforms tend to show more promotions to upgraded users or members than to new ones, holding back with newcomers as if starting a new relationship. Thus customers walk into a cycle of searching and buying more, only to pay more. They consider themselves loyal clients, yet they are not treated as such because of businesses' tricks. Some consumers see these platforms as useful tools and contribute a great deal to them; nevertheless, businesses quietly set unfair prices for these loyal users after learning their consumption habits through big data, which destroys trust. Big data analysis today is not merely a powerful tool but a tool wielded by people with various purposes, including those who ignore mutual trust in economic behaviour.

In terms of the technology itself, data discrimination requires everyone to rethink security, because bias stems from unexpected information collection. Initially, data analysis and mining in the economic field were meant to improve every aspect of work and create an efficient atmosphere. However, the misuse of unauthorized data ends up being condemned as a violation. Such practices may stay unknown to anyone except the technical operators for a long period, yet the truth eventually comes out. A Federal Trade Commission report, Big Data: A Tool for Inclusion or Exclusion?, claims that algorithms turn a deaf ear to consumer choice, meaning that users who do not want to share their information are still tracked to feed big data analysis. Even data protected by privacy and cookie policies can leak, to say nothing of the data users refused to provide. Setting human interference aside, data that is biased to begin with adds instability and potential safety problems. In a journal article on data-based algorithmic discrimination, Williams et al. point out that data analysis has been shown to produce self-sustaining bias in a broader context than most can imagine. This reminds users of digital technologies to refine their practice, rather than applying new achievements to undesirable ends.

In the long run, damage to social fairness and values could be the key problem of data discrimination, given the detrimental impacts above. Firstly, the bias does not lead to a single result; it influences other aspects across a broader range. In a research article from the University of Melbourne, scholars focusing on social equality argue that the use of data should involve social consideration: decisions are not made by data alone but, more importantly, by their diverse social, economic, and political outcomes. People feel unfairness when they are singled out in a digital context, which is essentially no different from face-to-face prejudice. Opaque algorithms seem to conceal discrimination, yet the emergence of the truth behind them causes a fairness crisis just the same. Furthermore, online marketers, acting in multiple identities, have a responsibility to take moral standards into account; otherwise they risk provoking a debate about social values. Take ride-hailing as an example: the public scolded Uber for an unusual price surge in response to a London terrorist attack last year. It is normal for prices to rise with demand, but when data analysis cannot detect the real situation, the incident slides into price discrimination that challenges ethical cognition. Beyond pricing based on common demand, personalized pricing poses an even more complex challenge, since every trip for every person is unique and uncertain, making it difficult for passengers to perceive price differentiation or to tell whether they are being treated differently. The only ruler actually lies within the platform itself. If a product or service platform is malicious, it can easily slip into exploiting its users, both passengers and drivers in the ride-hailing case.
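The Uber incident illustrates a blind spot that is easy to see in code. The rule below is a generic, invented surge-pricing sketch, not Uber's actual formula: it scales price with the ratio of requests to available drivers and has no notion of why demand spiked, so an emergency triggers the same markup as rush hour.

```python
# Hypothetical surge-pricing rule: the multiplier tracks the ratio of
# ride requests to available drivers, capped at 3x. The rule cannot
# distinguish rush hour from an emergency evacuation -- the ethical
# gap the essay describes. All numbers are illustrative.

def surge_multiplier(requests: int, drivers: int, cap: float = 3.0) -> float:
    """Return a demand-based price multiplier in [1.0, cap]."""
    if drivers == 0:
        return cap  # no supply at all: worst-case markup
    return min(max(requests / drivers, 1.0), cap)

print(surge_multiplier(50, 50))   # normal demand -> 1.0
print(surge_multiplier(300, 60))  # any demand spike -> 3.0, cause unknown
```

Because the only input is the demand ratio, correcting the behaviour requires information from outside the pricing model, which is why the essay argues the platform itself holds the only ruler.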

Data-based discrimination is a key concern now and for the years ahead. It can damage both sides: data providers and data users. This essay has argued that the former stop trusting familiar businesses as they once did, while the latter fail to achieve the commercial goals they originally estimated. From a digital perspective, the application of big data also remains a security problem. Finally, all of these consequences may grow into a challenge to social values. Van Dijck stated in 2014 that social behaviour can now be understood, to some degree, through data analysis. It is not desirable, however, that a growing understanding of behaviour should translate into more bias. Therefore, as one data-discrimination blog suggests, marketers should be cautious lest data-based decisions raise issues of exclusion or discrimination. Data users should consider the previous cases that triggered business crises and take the whole of society into account when using data in the digital age.
