Information Ecosystems

Information, Power, and Consequences

InfoEco Blog (Page 3)

It’s not “just an algorithm”

By: Erin O'Rourke
On: January 23, 2020
In: Safiya Noble
Tagged: Algorithms, Data, Information Science, Safiya Umoja Noble

Safiya Umoja Noble, known for her best-selling book Algorithms of Oppression: How Search Engines Reinforce Racism, as well as her scholarship in Information Studies and African American Studies at UCLA, visited Pitt the week of January 24. She met with participants in the Sawyer Seminar, gave a public talk, and sat down with me for an interview for the Info Ecosystems Podcast. In Algorithms of Oppression, Dr. Noble described her experiences searching for terms related to race, women, and girls, such as “black girls,” and encountering pornographic or racist content. These initial searches led her to years of study in information science, using the first page of Google search results as data. Coming from an advertising background before earning her Ph.D. in Library and Information Sciences, Dr. Noble was uniquely situated in the early 2010s to recognize Google for the advertising company it really is, while working in a field where many scholars around her viewed it as a source with exciting potential. Noble’s book examines what is present and absent in that first page of search results, and what those results say about the underlying mechanisms of organizing information and the corporate decisions that enable those searches to occur. To open her public talk, Dr. Noble discussed several events that have occurred since her book was published in 2018, notably the exposure of Facebook’s privacy violations in 2017–2018 and the use of facial recognition technology by law enforcement and in public housing, despite research from Dr. Joy Buolamwini indicating that facial recognition and analysis algorithms are inaccurate and can be discriminatory when applied to people of color. Over… Read More

Racism and Representation in Information Retrieval

By: SE (Shack) Hackney
On: January 23, 2020
In: Safiya Noble
Tagged: Algorithms, archives, black history month, Data, digital humanities, diversity, Information Ecosystems, Libraries, racism

Happy Black History Month! (originally published February 2020) by S.E. Hackney On Thursday, January 23rd, Dr. Safiya Noble spoke to an overflowing room of students, faculty, and community members about her best-selling book Algorithms of Oppression. The thesis of the book, and of Dr. Noble’s talk, is not only that racism is actually built into the search algorithms we use to navigate the internet, but that the big players of the internet (Google specifically) profit off of that racism by tokenizing the identities of people of color. Google does this by associating identity phrases such as “black girls” or “phillipina” with the sites with the most streamlined (i.e., most profitable) SEO, which is often pornography. This is a system of classification explicitly based on the centering of the white experience and the othering of Black people and other people of color. However, as Dr. Noble spoke about in her talk, tweaking a search result or two to avoid offense doesn’t actually solve a systemic problem, one where white voices are treated as the norm and others are eventually reduced to SEO tags to be bought and sold. This idea played out recently in Barnes & Noble’s misguided Black History Month project, in which public domain books whose protagonist’s race is not specified (as determined by algorithm) were given new cover art depicting the characters as People of Color. Rod Faulkner, who first brought this issue to widespread attention, describes it as “literary blackface,” and points out, “Slapping illustrations of Black versions… Read More

Consequential Caring

By: Sarah Reiff Conell
On: January 9, 2020
In: Jo Guldi
Tagged: democratic, digital humanities, empathy, Information Ecosystems, information overload

“The world is on fire” is, by now, a familiar phrase. It is often used when we feel overwhelmed by escalations in geopolitics or in response to the catastrophic effects of climate change. Humanists are human, including the “digital humanists,” and the weight of these crises is a reminder to make our scholarly work “count.” In Jo Guldi’s recent visit to the Sawyer Seminar, this desire to do meaningful work was a consistent topic of conversation. We have touched upon questions of making archives public during other visits, such as that of Richard Marciano in the fall. During this most recent visit, however, we spent more time discussing what it means to “democratize” information: for example, how making records “public” relates to the goal of making information more “democratic.” Personal and Professional: One argument against treating publicly available records as a solution to the problem of democratizing information is that even available information is not guaranteed to reach the “public” or to be legible to most. Fortunately, contextualizing and creating a narrative from evidence dispersed across a variety of records is a skill for which humanists are well prepared. What role does activism play in the articulation of research stakes within scholarly endeavors? While scholars may also identify as activists, there is a tension between the role of an activist and that of the researcher: concerns arise about how enthusiasm might affect the quality of one’s work. Scholarly rigor and passion can seem at odds, particularly when dispassionate rationality is valued over emotionally grounded arguments. Nevertheless, extended scholarly engagement… Read More

Jo Guldi’s work studies historical infrastructure; in her digital humanities work, she builds it

By: Briana Wipf
On: January 9, 2020
In: Jo Guldi
Tagged: British Empire, digital humanities, Topic modeling

Jo Guldi’s first book, Roads to Power: Britain Invents the Infrastructure State, argues that Britain became an “infrastructure state” during the eighteenth and nineteenth centuries, a period that saw an explosion in the construction of roads, along with the accompanying surveying, management, and surveillance of that construction. Guldi’s work often deals with infrastructure, and when she turns her attention from the history of the British Empire to the digital humanities, infrastructure is at the forefront of her mind there, too. Guldi spoke at the University of Pittsburgh’s Mellon Sawyer Seminar, Information Ecosystems, on Jan. 9 and 10. She also sat down with me for an interview that is part of the podcast series, Information Ecosystems, and will be published soon. As a professor of history at Southern Methodist University, Guldi teaches history classes with glimpses of the digital humanities, and runs the Guldi Lab, where she employs distant reading techniques to better understand historical texts. While interviewing Guldi and hearing her speak, I was struck by the way the concept of infrastructure, be it the analog infrastructure of roads and canals or the digital infrastructure underlying the Internet, recurs in her work and her thinking. She told me during the podcast interview that she considers it important that her scholarship be available online in open-access form. Many of her articles are open-access, as is her second book, The History Manifesto, co-authored with David Armitage. Like the miles and miles of roads that connected Britain in the nineteenth century, the Internet has the power to connect people and… Read More

Behind the Analogies

By: Sarah Reiff Conell
On: December 6, 2019
In: Sandra González-Bailón
Tagged: Algorithms, data visualization, digital humanities, Information Ecosystems, metaphors, social science

“What’s going on behind the analogies?” – Sandra González-Bailón. Outcomes are not always intentional; we trigger both anticipated and unforeseen things with our actions. The “invisible hand” is consequential, known only through its effects. Like contagion processes, our actions are enmeshed in interrelated networks. These are some of the metaphors discussed by Sandra González-Bailón in her research on metaphorical thinking, social processes, and communication structures. She engages head-on with the challenges and affordances of digital realities, using data to learn about or “decode” aspects of social life. “Analogies help make creative connections; but they can also draw pictures of the world that are too coarse-grained for any useful purpose” (Decoding the Social World, 29). [Image: polar area diagram by Florence Nightingale, published in Notes on Matters Affecting the Health, Efficiency, and Hospital Administration of the British Army and sent to Queen Victoria in 1858.] Models and metaphors are helpful for human cognition and communication; it seems unlikely that they can (or should) be avoided. Metaphors and other modes of abstraction serve as “black boxes” that are convenient for communication: we humans think with them, but they also shape our view of reality. “The language of argument is not poetic, fanciful, or rhetorical; it is literal. We talk about arguments that way because we conceive of them that way — and we act according to the way we conceive of things” (Lakoff & Johnson, Metaphors We Live By, 5). Perhaps other metaphors might be more productive; other models may work better than their forerunners. Read More

How do we get news online? Networks and social influence may provide some answers

By: Briana Wipf
On: December 5, 2019
In: Sandra González-Bailón
Tagged: journalism, social media, social science

Sandra González-Bailón began her presentation to the faculty and student participants of the University of Pittsburgh’s Information Ecosystems Sawyer Seminar on Friday, Dec. 6, by discussing one of the first sociology classes she took as an undergraduate in the late 1990s. She recalled learning about the debate between two early sociologists, Gabriel Tarde and Émile Durkheim, who disagreed about what role individuals played in social institutions and social interactions. Tarde suspected that social changes or developments occurred when people with social influence adopted the change or development. He argued this influence might not happen in physical space, but rather occurred between individuals writing letters to one another, or speaking to each other on a new invention, the telephone. Durkheim, on the other hand, argued that societies exist as amalgamations of the people who comprise them. This society is a new entity that is greater than any one group of people or small cabal of influencers. Durkheim was less interested in individual interpersonal interactions than was Tarde. Durkheim was able to back up his claims with data, and as González-Bailón put it, “won” the debate. He is considered a father of modern sociology. But González-Bailón, who teaches at the University of Pennsylvania’s Annenberg School for Communication and the Warren Center for Network and Data Sciences and studies the way social networks form and function online, suspects that, if Tarde had lived today and had the kind of data that social media or email can yield, he would have been able to better support some of… Read More

The History of Science & Big Data’s Place in the Humanities

By: Jane Rohrer
On: November 15, 2019
In: Sabina Leonelli
Tagged: Big Data, Data, Open Data, Philosophy of Science

The Sawyer Seminar’s November 15 guest was Dr. Sabina Leonelli. Dr. Leonelli teaches Philosophy and History of Science at the University of Exeter, where she is also the co-director of the Egenis Centre for the Study of Life Sciences. Her book Data-Centric Biology: A Philosophical Study was published by the University of Chicago Press in 2016. She is now working on translating her 2018 book, Scientific Research in the Era of Big Data, from its original Italian into English. Both deal abundantly with recent shifts and innovations in how researchers process and understand scientific data. In both her public talk on Thursday, November 14, and the Sawyer Seminar lunch discussion, Dr. Leonelli walked us through the fundamentals of, and distinctions between, Big Data, Open Data, and FAIR Data (Findable, Accessible, Interoperable, Re-usable); these distinctions—and mindful discussions about them—are increasingly necessary as, to quote Leonelli in Data-Centric Biology, “the rise of data centrism has brought new salience to the epistemological challenges involved in processes of data gathering, classification, and interpretation and…the social structures in which such processes are embedded” (2). As Leonelli described it, Big Data are defined by their capacity to move, be (re)used across situations and disciplines, and be (re)aggregated into different useful and usable platforms. To elaborate: while there is “no rigorous definition of Big Data,” we use them, in general, to complete large-scale projects that may not valuably be done at a smaller scale, often to extract new insights about an entire world, community, or issue. Humanist examples of this in practice would… Read More

Open Data and data infrastructure across disciplines

By: Erin O'Rourke
On: November 14, 2019
In: Sabina Leonelli
Tagged: Data, Information, Open Data, Philosophy of Science, Sabina Leonelli

On November 15th, Dr. Sabina Leonelli spoke to the participants of the Sawyer Seminar. As a historian and philosopher of science, she is currently the Co-Director of the Exeter Centre for the Study of the Life Sciences and has recently worked on a five-year grant about data access, openness, and infrastructure entitled The Epistemology of Data-Intensive Science. In her conversations at the Friday seminar, Dr. Leonelli focused on practices surrounding data collection and reuse, aiming to move toward a future in which Open Data is the standard. One of her recent publications, an op-ed entitled “Data Shadows: Knowledge, Openness, and Absence,” spoke directly to many of the themes central to the Sawyer Seminar. She defines shadows, beyond being mere absences in data, as “the multiplicity of motives, goals, and conditions through which data may be construed as (in)significant, partial or complete, (un)intelligible, or (in)accessible.” Consequently, the degree to which these shadows exist depends on the context in which the data is considered, especially when data is being reused by parties other than the original creators. In her conversation with seminar participants, Dr. Leonelli discussed her vision for data use and distribution today, which involves most data being open-access, rather than owned by companies or individuals, as well as having the necessary metadata and methodological descriptions to make it valuable to others. This allows data to be reused, recontextualized, and further studied as more information becomes available, potentially enabling discoveries in numerous fields. Dr. Leonelli identified several challenges in creating and maintaining open data, as well as some… Read More

How Should We Handle Personal Data, Privacy, and Leisure Time in the Information Age?

By: Jane Rohrer
On: October 25, 2019
In: Mario Khreiche
Tagged: Amazon, artificial intelligence, burnout, mechanization, Uber

On October 25th, the Seminar was led through a discussion on automation, AI, and the future of work by a fellow participant: recent Virginia Tech graduate and Information Ecosystems Sawyer Seminar postdoctoral fellow Mario Khreiche. Mario discussed his recent publication in Fast Capitalism, “The Twilight of Automation,” in which he theorizes about “the scope and rate whereby human labor will be replaced by machines” (117). Throughout both this conversation on the 25th and his public talk the day before, Khreiche clarified that his approach is not a Luddite one; he was quick to point out, first, that AI and automation are far from recent concerns (historical perspectives can do much to quiet our contemporary moments of panic) and, second, that reducing AI and automation to their flaws would be, well, reductionist. Anyone who has spent time, for example, formatting citations on a laptop can imagine how much slower and more painful the whole ordeal would be on a manual typewriter. Khreiche has done an excellent job, then, of illuminating necessary critiques of automation without ignoring its multitude of perks. Khreiche spent much of his time examining the “gig economy” or “gigconomy,” in which temporary, part-time jobs are increasingly replacing the availability of lifelong careers. Khreiche specifically mentioned a part-time earner’s potential amalgamation of Uber, TaskRabbit, Amazon delivery, and Airbnb, a combination of gigs I have actually met several millennials juggling at the same time. For those of you asking what the big deal with that is: as Khreiche himself points out, “automation unfolds… Read More

Are services like Uber and Amazon’s Mechanical Turk Ethical? Sawyer Seminar turns to automation, future of work

By: Briana Wipf
On: October 24, 2019
In: Mario Khreiche
Tagged: Amazon, burnout, Mechanical Turk, mechanization, Uber

The University of Pittsburgh’s Mellon Sawyer Seminar, Information Ecosystems, turned its attention to automation and artificial intelligence on Friday, Oct. 25, when the seminar’s postdoctoral fellow Mario Khreiche presented his research on the future of work in an age of increasing automation. Khreiche can be described as neither a positivist nor a dystopian; his work lies somewhere in the middle. While he states in his 2019 paper “The Twilight of Automation,” published in Fast Capitalism this fall, that “an unchecked project of automation is both ill-conceived and ill-fated” (117), he also takes the postcapitalist intervention to task, arguing that it “suffers from a certain naïveté, in that its authors undertheorize how emerging technologies unfold as sociotechnical systems, rather than isolated machines” (121). Khreiche is interested in what he calls a “more nuanced question” – something along the lines of figuring out how a company like Uber can adapt or change its systems to make them less susceptible to technological redlining, for example. In particular, Khreiche keeps asking what is new about this technological revolution. Work has changed many times in the past: think of the Industrial Revolution of the eighteenth and nineteenth centuries or the first computer revolution of the mid-twentieth century. The Luddites of the nineteenth century smashed weaving machines – but not, as is commonly thought and as the modern usage of the term “Luddite” suggests, because they were against technology of any kind. Rather, they were concerned about mechanization being used as a way to exploit workers and produce lower-quality goods. This technological… Read More
