Google and the Digital Divide: The Bias of Online Knowledge, by Elad Segev. Lynette Kvasny, e-mail address: lkvasny ist.
Volume 61, Issue 12, December.

Households with less access to digital technologies are at a disadvantage in their ability to earn money and accumulate skills. But as digital devices proliferate, the divide is no longer just about access.
How do people deal with information overload and the plethora of algorithmic decisions that permeate every aspect of their lives? Savvier users are navigating away from devices and becoming aware of how algorithms affect their lives. Meanwhile, consumers who have less information are relying even more on algorithms to guide their decisions.
The main reason for the new digital divide, in my opinion as someone who studies information systems, is that so few people understand how algorithms work. To most users, algorithms are a black box: AI algorithms take in data, fit them to a mathematical model and put out a prediction, ranging from what songs you might enjoy to how many years someone should spend in jail.
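The fit-then-predict loop described above can be sketched with a toy model. The data, feature, and prediction below are invented for illustration; real recommendation or sentencing models are far larger, but the shape of the process is the same: fit parameters to past data, then apply them to a new input.

```python
# Toy sketch of the fit/predict loop: an ordinary least-squares line
# fit to made-up data, then a prediction for an unseen input.
# The "hours listened" feature and "enjoyment" target are hypothetical.

def fit_line(xs, ys):
    """Fit y = a*x + b by ordinary least squares."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Invented training data: hours of listening vs. enjoyment score.
hours = [1, 2, 3, 4, 5]
enjoyment = [2, 4, 6, 8, 10]

a, b = fit_line(hours, enjoyment)
predict = lambda x: a * x + b
print(predict(6))  # the model's prediction for an unseen input -> 12.0
```

To a user who only ever sees the prediction, the fitted parameters `a` and `b`, and the data that produced them, remain invisible: that is the black box.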
These models are developed and tweaked based on past data and the success of previous models. Most people, sometimes even the algorithm designers themselves, do not really know what goes on inside the model. Researchers have long been concerned about algorithmic fairness. Other studies have shown that judicial algorithms are racially biased, recommending longer sentences for poor Black defendants than for others.
This legislation treats the process of algorithmic decision-making like a recipe book. The thinking goes that if you understand the recipe, you can understand how the algorithm affects your life. Meanwhile, some AI researchers have pushed for algorithms that are fair, accountable and transparent, as well as interpretable, meaning that they should arrive at their decisions through processes that humans can understand and trust. What effect will transparency have?
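One way to read "interpretable" is that the system can state, in human terms, the rule behind each decision rather than only a verdict. A minimal sketch, with an invented loan rule and threshold (none of this reflects any real lender's criteria):

```python
# Hypothetical example of an interpretable decision: the function
# returns both the verdict and a human-readable explanation of the
# rule that produced it. The 0.4 debt-to-income limit is invented.

def decide_loan(income, debt, threshold=0.4):
    ratio = debt / income
    approved = ratio <= threshold
    explanation = (
        f"debt-to-income ratio {ratio:.2f} is "
        f"{'within' if approved else 'above'} the {threshold:.2f} limit"
    )
    return approved, explanation

ok, why = decide_loan(income=50_000, debt=15_000)
print(ok, "->", why)
```

A black-box model would return only `ok`; the explanation string is what the fairness-accountability-transparency agenda asks systems to surface.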
- Your guardian angel and you.
- Buddha Nature.
- The Doctor and the Dinosaurs (Weird West Tales, Book 4)?
- Remaking Metropolis: Global Challenges of the Urban Landscape;
- Advanced Techniques of Population Analysis;
- American Auto Trail-Louisianas U.S. Highway 84 (American Auto Trails).
In one study, the students who received more transparent explanations actually trusted the algorithm less. This, again, suggests a digital divide: algorithmic awareness does not lead to more confidence in the system. Transparency is not a panacea.
Transparency will help only users who are sophisticated enough to grasp the intricacies of algorithms. For example, in 2014, Ben Bernanke, the former chair of the Federal Reserve, was initially denied a mortgage refinance by an automated system. Most individuals applying for such a mortgage refinance would not understand how algorithms might determine their creditworthiness.
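The Bernanke anecdote shows how a rigid automated rule can deny an applicant a human underwriter would approve. A hypothetical sketch of such a rule (the thresholds and fields are invented, not drawn from any actual underwriting system):

```python
# Hypothetical rigid underwriting rule: deny any applicant with less
# than 2 years in their current job, regardless of income. All
# thresholds and field names here are invented for illustration.

def automated_refinance(income, years_in_job,
                        min_income=60_000, min_tenure=2):
    if income < min_income:
        return "deny: income below minimum"
    if years_in_job < min_tenure:
        return "deny: employment tenure too short"
    return "approve"

# A recently re-employed, high-income applicant is still denied:
print(automated_refinance(income=250_000, years_in_job=0.5))
# -> deny: employment tenure too short
```

Without transparency, the applicant sees only the denial; without sophistication, even a disclosed rule like `min_tenure` may not tell them why it applies or how to contest it.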
There are not many statistics on how many people are algorithm-aware. Studies have found evidence of algorithmic anxiety, leading to a deep imbalance of power between platforms that deploy algorithms and the users who depend on them. A November 2018 report from the Pew Research Center found that a broad majority of the public had significant concerns about the use of algorithms for particular purposes. Only a small fraction of individuals exercise any control over how algorithms use their personal data. For example, the Hu-Manity platform gives users an option to control how much of their data is collected.