CfP: Digital Inequalities and Discrimination in the Big Data Era


*Preconference of the International Communication Association ’17*

May 25, 2017, San Diego Hilton Bayfront, San Diego, California (USA)

Co-sponsored by the Pacific ICTD Collaborative, the School of
Communications (University of Hawaii at Manoa), and the Institute for
Information Policy (Penn State University)

*Abstracts due: February 10, 2017*


A growing number of ordinary objects are being redesigned to include
digital sensors, computing power, and communication capabilities, and new
objects and processes are becoming part of the Internet. This emerging
Internet of Things (IoT) ecosystem (networks of physical objects embedded
with the ability to sense, and sometimes act upon, their environment,
together with related communication, applications, and data analysis)
enables data to be collected from billions of everyday objects. The
emerging datasphere made possible by these developments offers immense
potential to serve the public good by fostering government transparency,
energy conservation, participatory governance, and substantial advances in
medical research and care. On the other hand, a growing body of research
addresses emerging privacy and civil liberties concerns related to big
data, including unjust discrimination and unequal access to data and the
tools needed to make use of it.

For example, big data analytics may reveal patterns that were previously
undetectable. Data about seemingly trivial daily tasks are increasingly
federated and used to reveal associations or behaviors, and the analyses
and decisions based on them pose potential harms to individuals and groups.
Many transactions that once seemed innocuous can now be used to
discriminate: one's movements throughout the day, items purchased at a
store, television programs watched, "friends" added or viewed on social
networks, or the individuals one communicated with or was near at various
times can all be used to make judgments that affect an individual and his
or her life chances. With the advent of artificial intelligence and machine
learning, we are increasingly moving to a world where many decisions around
us are shaped by these calculations rather than by traditional human
judgment. For example, sensitive personal information or behaviors (e.g.,
political or health-related) may be used to discriminate when individuals
seek housing, immigration eligibility, medical care, education, bank loans
or other financial services, insurance, or employment. At the same time,
individuals, groups, or regions may be disadvantaged by a lack of access to
the data, skills, and tools needed to make use of big data in ways that
benefit their lives and communities.

This preconference session seeks to advance understanding of digital
inequalities and discrimination related to big data and big data analytics.
*Papers between 5,000-8,000 words and position papers between 1,000-2,000
words are welcomed.*


We welcome scholarly and applied research on topics including, but not
limited to, the following:

• Social, economic, and ethical implications of big data analytics in a
variety of contexts (e.g., access to housing, immigration, medical care,
education, bank loans or other financial services, insurance, or
employment).

• Perspectives on big data from scholars from emerging economies or
traditionally marginalized groups.

• Predictive analytics, algorithmic discrimination, and
artificial-intelligence-based decision making.

• Digital inequalities, such as unequal access to big data sets, skills, or
tools.
• Emerging data literacies.

• Use of big data to counter social and economic inequality (e.g.,
promoting civil rights and social justice).

• Disclosure of algorithms, algorithmic transparency, and the public good.

• Big data, security and encryption (potential for hacking, theft,
third-party abuse).

• Government and corporate surveillance.

• Big data brokers and the sale of personal data (is privacy a commodity or
a right?).
• International norms and standards for big data.

• Policy/legal analysis related to big data and the preconference theme
(e.g., standards of liability for injury and defective work products
(algorithms/burden of proof), the challenge of Notice and Consent,
liability for bad or false or slanted or insufficient data collection,
government regimes for supervision of big data policies).

• Consumer bill of rights for big data.

• Big data and anonymity, re-identification of anonymous data.

• Big data vs. privacy as an essential condition for safeguarding free
speech, intellectual property (i.e., how IP laws impact big data), or
Constitutional rights of freedom of assembly and association.

Papers may include empirical research as well as policy analyses, new
methodological approaches, or position papers addressing the preconference
theme.  Submissions by graduate students working in this area are welcomed.

*The costs of the workshop are heavily subsidized by the participating
institutions, to keep fees for participants at a nominal level.*


*Abstracts due*: February 10, 2017

*Notifications to submitters*: February 27, 2017

*Full papers due*: May 12, 2017


Abstracts of up to 500 words and a short bio of the author(s) should be
emailed to  by February 10, 2017. Please include
"Digital Inequalities ICA 2017" in the subject line.

Full papers accepted for presentation at the preconference will, with the
consent of the authors, be submitted to the Journal of Information Policy
for consideration for a Special Issue curated by guest editors from the
field. The papers will be blind peer-reviewed, to assure their academic
value to both authors (for academic credit) and readers.
