Kate Crawford, Australian writer
With big data comes big responsibilities.
If you're not thinking about the way systemic bias can be propagated through the criminal justice system or predictive policing, then it's very likely that, if you're designing a system based on historical data, you're going to be perpetuating those biases.
Sexism, racism, and other forms of discrimination are being built into the machine-learning algorithms that underlie the technology behind many 'intelligent' systems that shape how we are categorized and advertised to.
There's been the emergence of a philosophy that big data is all you need. We would suggest that, actually, numbers don't speak for themselves.
As AI becomes the new infrastructure, flowing invisibly through our daily lives like the water in our faucets, we must understand its short- and long-term effects and know that it is safe for all to use.
Hidden biases in both the collection and analysis stages present considerable risks and are as important to the big-data equation as the numbers themselves.
We should always be suspicious when machine-learning systems are described as free from bias if they have been trained on human-generated data. Our biases are built into that training data.
Biases and blind spots exist in big data as much as they do in individual perceptions and experiences. Yet there is a problematic belief that bigger data is always better data and that correlation is as good as causation.
Rather than assuming Terms of Service are equivalent to informed consent, platforms should offer opt-in settings where users can choose to join experimental panels. If they don't opt in, they aren't forced to participate.
We should have equivalent due-process protections for algorithmic decisions as for human decisions.
Data and data sets are not objective; they are creations of human design. We give numbers their voice, draw inferences from them, and define their meaning through our interpretations.
When dealing with data, scientists have often struggled to account for the risks and harms using it might inflict. One primary concern has been privacy - the disclosure of sensitive data about individuals, either directly to the public or indirectly from anonymised data sets through computational processes of re-identification.