
AI Ethics: Global Perspectives

Feb 03, 2021 01:46 pm

The Governance Lab (The GovLab), NYU Tandon School of Engineering, Global AI Ethics Consortium (GAIEC), Center for Responsible AI @ NYU (R/AI), and Technical University of Munich (TUM) Institute for Ethics in Artificial Intelligence (IEAI) jointly launched a free, online course, AI Ethics: Global Perspectives, on February 1, 2021. Designed for a global audience, it conveys the breadth and depth of the ongoing interdisciplinary conversation on AI ethics and seeks to bring together diverse perspectives from the field of ethical AI to raise awareness and help institutions work towards more responsible use.

“The use of data and AI is steadily growing around the world – there should be simultaneous efforts to increase literacy, awareness, and education around the ethical implications of these technologies,” said Stefaan Verhulst, Co-Founder and Chief Research and Development Officer of The GovLab. “The course will allow experts to jointly develop a global understanding of AI.”

“AI is a global challenge, and so is AI ethics,” said Christoph Lütge, the director of IEAI. “The ethical challenges related to the various uses of AI require multidisciplinary and multi-stakeholder engagement, as well as collaboration across cultures, organizations, academic institutions, etc. This online course is GAIEC’s attempt to approach and apply AI ethics effectively in practice.”

The course modules comprise pre-recorded lectures on AI Applications, Data and AI, and Governance Frameworks, along with supplemental readings. New course lectures will be released in the first week of every month.

“The goal of this course is to create a nuanced understanding of the role of technology in society so that we, the people, have tools to make AI work for the benefit of society,” said Julia Stoyanovich, a Tandon Assistant Professor of Computer Science and Engineering, Director of the Center for Responsible AI at NYU Tandon, and an Assistant Professor at the NYU Center for Data Science. “It is up to us — current and future data scientists, business leaders, policy makers, and members of the public — to make AI what we want it to be.”

The collaboration will release four new modules in February. These include lectures from: 

  • Idoia Salazar, President and Co-Founder of OdiselA, who presents “Alexa vs Alice: Cultural Perspectives on the Impact of AI.” Salazar explores why the cultural, geographical, and temporal aspects of AI, and their precise identification, must be taken into account if AI systems are to be developed and implemented correctly; 
  • Jerry John Kponyo, Associate Professor of Telecommunication Engineering at KNUST, who sheds light on the fundamentals of Artificial Intelligence in Transportation System (AITS) and safety, and looks at the technologies at play in its implementation; 
  • Danya Glabau, Director of Science and Technology Studies at the NYU Tandon School of Engineering, asks and answers the question, “Who is artificial intelligence for?” and presents evidence that AI systems do not always help their intended users and constituencies; 
  • Mark Findlay, Director of the Centre for AI and Data Governance at SMU, reviews the ethical challenges — discrimination, lack of transparency, neglect of individual rights, and more — which have arisen from COVID-19 technologies and their resultant mass data accumulation.

To learn more and sign up to receive updates as new modules are added, visit the course website at aiethicscourse.org.


READ MORE

Geographic Citizen Science Design

Feb 03, 2021 10:42 am

Book edited by Artemis Skarlatidou and Muki Haklay: “Little did Isaac Newton, Charles Darwin and other ‘gentlemen scientists’ know, when they were making their scientific discoveries, that some centuries later they would inspire a new field of scientific practice and innovation, called citizen science. The current growth and availability of citizen science projects and relevant applications to support citizen involvement are massive; every citizen has an opportunity to become a scientist and contribute to a scientific discipline, without having any professional qualifications. With geographic interfaces being the common approach to support collection, analysis and dissemination of data contributed by participants, ‘geographic citizen science’ is being approached from different angles.

Geographic Citizen Science Design takes an anthropological and Human-Computer Interaction (HCI) stance to provide the theoretical and methodological foundations to support the design, development and evaluation of citizen science projects and their user-friendly applications. Through a careful selection of case studies in the urban and non-urban contexts of the Global North and South, the chapters provide insights into the design and interaction barriers, as well as on the lessons learned from the engagement of a diverse set of participants; for example, literate and non-literate people with a range of technical skills, and with different cultural backgrounds.

Looking at the field through the lenses of specific case studies, the book captures the current state of the art in research and development of geographic citizen science and provides critical insight to inform technological innovation and future research in this area….(More)”.


READ MORE

Solving Public Problems

Feb 03, 2021 07:16 am

“Today the Governance Lab (The GovLab) at the NYU Tandon School of Engineering launched a free, online course on Solving Public Problems. The 12-part program, presented by Beth Simone Noveck and more than two dozen global changemakers, trains participants in the skills needed to move from demanding change to making it.

Taking a practical approach to addressing entrenched problems, from systemic racism to climate change, the course combines the teaching of quantitative and qualitative methods with participatory and equitable techniques for tapping the collective wisdom of communities to design and deliver powerful solutions to contemporary problems. 

“We cannot expect to tackle tomorrow’s problems with yesterday’s toolkit,” said Noveck, a former advisor on open government to President Barack Obama. “In the 21st century, we must equip ourselves with the skills to solve public problems. But those skills are not innate, and this program is designed to help people learn how to implement workable solutions to our hardest but most important challenges.”  

Based on Professor Noveck’s new book, Solving Public Problems: A Practical Guide to Fix Government and Change the World (Yale University Press, 2021), this online program is intended to democratize access to public problem-solving education, providing citizens with innovative tools to tap the collective wisdom of communities to take effective, organized action for change. …(More)”.


READ MORE

Robot census: Gathering data to improve policymaking on new technologies

Feb 03, 2021 07:12 am

Essay by Robert Seamans: There is understandable excitement about the impact that new technologies like artificial intelligence (AI) and robotics will have on our economy. In our everyday lives, we already see the benefits of these technologies: when we use our smartphones to navigate from one location to another using the fastest available route or when a predictive typing algorithm helps us finish a sentence in our email. At the same time, there are concerns about possible negative effects of these new technologies on labor. The Councils of Economic Advisers of the past two Administrations have addressed these issues in the annual Economic Report of the President (ERP). For example, the 2016 ERP included a chapter on technology and innovation that linked robotics to productivity and growth, and the 2019 ERP included a chapter on artificial intelligence that discussed the uneven effects of technological change. Both these chapters used data at highly aggregated levels, in part because that is the data that is available. As I’ve noted elsewhere, AI and robots are everywhere, except, as it turns out, in the data.

To date, there have been no large-scale, systematic studies in the U.S. on how robots and AI affect productivity and labor in individual firms or establishments (a firm could own one or more establishments, which for example could be a plant in a manufacturing setting or a storefront in a retail setting). This is because the data are scarce. Academic researchers interested in the effects of AI and robotics on economic outcomes have mostly used aggregate country and industry-level data. Very recently, some have studied these issues at the firm level using data on robot imports to France, Spain, and other countries. I review a few of these academic papers in both categories below, which provide early findings on the nuanced effects these new technologies have on labor. Thanks to some excellent work being done by the U.S. Census Bureau, however, we may soon have more data to work with. This includes new questions on robot purchases in the Annual Survey of Manufacturers and Annual Capital Expenditures Survey and new questions on other technologies including cloud computing and machine learning in the Annual Business Survey….(More)”.


READ MORE

Profiling Insurrection: Characterizing Collective Action Using Mobile Device Data

Feb 03, 2021 07:08 am

Paper by David Van Dijcke and Austin L. Wright: “We develop a novel approach for estimating spatially dispersed community-level participation in mass protest. This methodology is used to investigate factors associated with participation in the ‘March to Save America’ event in Washington, D.C. on January 6, 2021. This study combines granular location data from more than 40 million mobile devices with novel measures of community-level voting patterns, the location of organized hate groups, and the entire georeferenced digital archive of the social media platform Parler. We find evidence that partisanship, socio-political isolation, proximity to chapters of the Proud Boys organization, and the local activity on Parler are robustly associated with protest participation. Our research fills a prominent gap in the study of collective action: identifying and studying communities involved in mass-scale events that escalate into violent insurrection….(More)”.
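The core measurement idea (attribute each device to a home community, then compute the share of that community's observed devices seen at the event) can be sketched in a few lines. This is a toy illustration with made-up data, not the authors' pipeline:

```python
# Hypothetical sketch of community-level participation estimation,
# in the spirit of the Van Dijcke & Wright approach. All data and
# names here are illustrative, not drawn from the paper.

def participation_rates(device_homes, devices_at_event):
    """Share of each community's observed devices that appeared at the event."""
    totals, attended = {}, {}
    for device, community in device_homes.items():
        totals[community] = totals.get(community, 0) + 1
        if device in devices_at_event:
            attended[community] = attended.get(community, 0) + 1
    return {c: attended.get(c, 0) / n for c, n in totals.items()}

homes = {"d1": "county_A", "d2": "county_A", "d3": "county_B", "d4": "county_B"}
at_event = {"d1", "d3", "d4"}
print(participation_rates(homes, at_event))  # county_A: 0.5, county_B: 1.0
```

The paper's contribution lies in doing this at scale (40 million devices) and then relating the resulting rates to community covariates such as partisanship and Parler activity.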


READ MORE

Monitoring the R-Citizen in the Time of Coronavirus

Feb 03, 2021 07:05 am

Paper by John Flood and Monique Lewis: “The COVID pandemic has overwhelmed many countries in their attempts at tracking and tracing people infected with the disease. Our paper examines how tracking and tracing is done, looking at manual and technological means. It raises issues around efficiency, privacy, and more. The paper investigates more closely the approaches taken by two countries, namely Taiwan and the UK. It shows how tracking and tracing can be handled sensitively and openly, compared with the bungled attempts of the UK that have led to the greatest number of dead in Europe. The key message is that all communications around tracking and tracing need to be open, clear, without confusion, and delivered by those closest to the communities receiving the messages. This occurred in Taiwan, but in the UK the central government chose to close out local government and other local resources. The highly centralised dirigiste approach of the government alienated much of the population, who came to distrust government. As local government was later brought into the COVID fold, the messaging improved. Taiwan always remained open in its communications, even allowing citizens to participate in improving the technology around COVID. Taiwan learnt from its earlier experiences with SARS, whereas the UK ignored its pandemic planning exercises from earlier years and even experimented with crude ideas of herd immunity by letting the disease rip through the population – an idea soon abandoned.

We also derive a new type of citizen from the pandemic, namely the R citizen. This unfortunate archetype is both a blessing and a curse. If the citizen scores over 1, the disease accelerates and the R citizen is chastised, whereas if the citizen declines to zero it disappears but receives no plaudits for its behaviour. The R citizen can neither exist nor die, rather like Schrödinger’s cat. R citizens are of course datafied individuals who are assemblages of data and are treated as distinct from humans. We argue they cannot be so distinguished without rendering them inhuman. This is as much a moral category as it is a scientific one….(More)”.
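The arithmetic behind the R citizen is simple: each generation of infections multiplies the previous one by R, so case counts grow whenever R stays above 1 and wither as it falls below 1. A minimal illustration of that dynamic (my own sketch, not from the paper):

```python
# Sketch of why R > 1 "accelerates" and R < 1 "disappears":
# expected cases per infection generation under a fixed reproduction number.
# Illustrative only; real epidemic models are far richer than this.

def generations(initial_cases, r, n):
    """Expected case counts over n infection generations with reproduction number r."""
    cases = [initial_cases]
    for _ in range(n - 1):
        cases.append(cases[-1] * r)
    return cases

print(generations(100, 1.5, 4))  # growing: [100, 150.0, 225.0, 337.5]
print(generations(100, 0.5, 4))  # shrinking: [100, 50.0, 25.0, 12.5]
```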


READ MORE

Governance of Data Sharing: a Law & Economics Proposal

Feb 03, 2021 07:00 am

Paper by Jens Prufer and Inge Graef: “To prevent market tipping, which inhibits innovation, there is an urgent need to mandate sharing of user information in data-driven markets. Existing legal mechanisms to impose data sharing under EU competition law and data portability under the GDPR are not sufficient to tackle this problem. Mandated data sharing requires the design of a governance structure that combines elements of economically efficient centralization with legally necessary decentralization. We identify three feasible options. One is to centralize investigations and enforcement in a European Data Sharing Agency (EDSA), while decision-making power lies with National Competition Authorities in a Board of Supervisors. The second option is to set up a Data Sharing Cooperation Network coordinated through a European Data Sharing Board, with the National Competition Authority best placed to run the investigation, adjudicating and enforcing the mandatory data-sharing decision across the EU. A third option is to mix both governance structures and to task national authorities to investigate and adjudicate and the EU-level EDSA with enforcement of data sharing….(More)”.


READ MORE

A Worldwide Assessment of COVID-19 Pandemic-Policy Fatigue

Feb 03, 2021 06:56 am

Paper by Anna Petherick et al: “As the COVID-19 pandemic lingers, signs of “pandemic-policy fatigue” have raised worldwide concerns. But the phenomenon itself is yet to be thoroughly defined, documented, and delved into. Based on self-reported behaviours from samples of 238,797 respondents, representative of the populations of 14 countries, as well as global mobility and policy data, we systematically examine the prevalence and shape of people’s alleged gradual reduction in adherence to governments’ protective-behaviour policies against COVID-19. Our results show that from March through December 2020, pandemic-policy fatigue was empirically meaningful and geographically widespread. It emerged for high-cost and sensitising behaviours (physical distancing) but not for low-cost and habituating ones (mask wearing), and was less intense among retired people, people with chronic diseases, and in countries with high interpersonal trust. Particularly due to fatigue reversal patterns in high- and upper-middle-income countries, we observe an arch rather than a monotonic decline in global pandemic-policy fatigue….(More)”.


READ MORE

Are New Technologies Changing the Nature of Work? The Evidence So Far

Feb 02, 2021 06:30 pm

Report by Kristyn Frank and Marc Frenette for the Institute for Research on Public Policy (Canada): “In recent years, groundbreaking advances in artificial intelligence and their implications for automation technology have fuelled speculation that the very nature of work is being altered in unprecedented ways. News headlines regularly refer to the “changing nature of work,” but what does it mean? Is there evidence that work has already been transformed by the new technologies? And if so, are these changes more dramatic than those experienced before?

In this paper, Kristyn Frank and Marc Frenette offer insights on these questions, based on the new research they conducted with their colleague Zhe Yang at Statistics Canada. Two aspects of work are under the microscope: the mix of work activities (or tasks) that constitute a job, and the mix of jobs in the economy. If new automation technologies are indeed changing the nature of work, the authors argue, then nonautomatable tasks should be increasingly important, and employment should be shifting toward occupations primarily involving such tasks.

According to the authors, nonroutine cognitive tasks (analytical or interpersonal) did become more important between 2011 and 2018. However, the changes were relatively modest, ranging from a 1.5 percent increase in the average importance of establishing and maintaining interpersonal relationships, to a 3.7 percent increase in analyzing data or information. Routine cognitive tasks — such as data entry — also gained importance, but these gains were even smaller. The picture is less clear for routine manual tasks, as the importance of tasks for which the pace is determined by the speed of equipment declined by close to 3 percent, whereas other tasks in that category became slightly more important.

Looking at longer-term shifts in overall employment, between 1987 and 2018, the authors find a gradual increase in the share of workers employed in occupations associated with nonroutine tasks, and a decline in routine-task-related occupations. The most pronounced shift in employment was away from production, craft, repair and operative occupations toward managerial, professional and technical occupations. However, they note that this shift to nonroutine occupations was not more pronounced between 2011 and 2018 than it was in the preceding decades. For instance, the share of employment in managerial, professional and technical occupations increased by 1.8 percentage points between 2011 and 2018, compared with a 6 percentage point increase between 1987 and 2010.

Most sociodemographic groups experienced the shift toward nonroutine jobs, although there were some exceptions. For instance, the employment share of workers in managerial, professional and technical occupations increased for all workers, but much more so for women than for men. Interestingly, there was a decline in the employment shares of workers in these occupations among those with a post-secondary education. The explanation for this lies in the major increase over the past three decades in the proportion of workers with post-secondary education, which led some of them to move into jobs for which they are overqualified….(More)”.


READ MORE

Democratizing data in a 5G world

Feb 02, 2021 06:16 pm

Blog by Dimitrios Dosis at Mastercard: “The next generation of mobile technology has arrived, and it’s more powerful than anything we’ve experienced before. 5G can move data faster, with little delay — in fact, with 5G, you could’ve downloaded a movie in the time you’ve read this far. 5G will also create a vast network of connected machines. The Internet of Things will finally deliver on its promise to fuse all our smart products — vehicles, appliances, personal devices — into a single streamlined ecosystem.

My smartwatch could monitor my blood pressure and schedule a doctor’s appointment, while my car could collect data on how I drive and how much gas I use while behind the wheel. In some cities, petrol trucks already act as roving gas stations, receiving pings when cars are low on gas and refueling them as needed, wherever they are.

This amounts to an incredible proliferation of data. By 2025, every connected person will conduct nearly 5,000 data interactions every day — one every 18 seconds — whether they know it or not. 

Enticing and convenient as new 5G-powered developments may be, they also raise complex questions about data. Namely, who is privy to our personal information? As your smart refrigerator records the foods you buy, will the refrigerator’s manufacturer be able to see your eating habits? Could it sell that information to a consumer food product company for market research without your knowledge? And where would the information go from there?

People are already asking critical questions about data privacy. In fact, 72% of them say they are paying attention to how companies collect and use their data, according to a global survey released last year by the Harvard Business Review Analytic Services. The survey, sponsored by Mastercard, also found that while 60% of executives believed consumers think the value they get in exchange for sharing their data is worthwhile, only 44% of consumers actually felt that way.

There are many reasons for this data disconnect, including the lack of transparency that currently exists in data sharing and the tension between an individual’s need for privacy and his or her desire for personalization.

This paradox can be solved by putting data in the hands of the people who create it — giving consumers the ability to manage, control and share their own personal information when they want to, with whom they want to, and in a way that benefits them.

That’s the basis of Mastercard’s core set of principles regarding data responsibility – and in this 5G world, it’s more important than ever. We will be able to gain from these new technologies, but this change must come with trust and user control at its core. The data ecosystem needs to evolve from schemes dominated by third parties, where some data brokers collect inferred, often unreliable and inaccurate data, then share it without the consumer’s knowledge….(More)”.


READ MORE

From Tech Critique to Ways of Living

Feb 02, 2021 04:36 pm

Alan Jacobs at The New Atlantis: “Neil Postman was right. So what? In the 1950s and 1960s, a series of thinkers, beginning with Jacques Ellul and Marshall McLuhan, began to describe the anatomy of our technological society. Then, starting in the 1970s, a generation emerged who articulated a detailed critique of that society. The critique produced by these figures I refer to in the singular because it shares core features, if not a common vocabulary. What Ivan Illich, Ursula Franklin, Albert Borgmann, and a few others have said about technology is powerful, incisive, and remarkably coherent. I am going to call the argument they share the Standard Critique of Technology, or SCT. The one problem with the SCT is that it has had no success in reversing, or even slowing, the momentum of our society’s move toward what one of their number, Neil Postman, called technopoly.

The basic argument of the SCT goes like this. We live in a technopoly, a society in which powerful technologies come to dominate the people they are supposed to serve, and reshape us in their image. These technologies, therefore, might be called prescriptive (to use Franklin’s term) or manipulatory (to use Illich’s). For example, social networks promise to forge connections — but they also encourage mob rule. Facial-recognition software helps to identify suspects — and to keep tabs on whole populations. Collectively, these technologies constitute the device paradigm (Borgmann), which in turn produces a culture of compliance (Franklin).

The proper response to this situation is not to shun technology itself, for human beings are intrinsically and necessarily users of tools. Rather, it is to find and use technologies that, instead of manipulating us, serve sound human ends and the focal practices (Borgmann) that embody those ends. A table becomes a center for family life; a musical instrument skillfully played enlivens those around it. Those healthier technologies might be referred to as holistic (Franklin) or convivial (Illich), because they fit within the human lifeworld and enhance our relations with one another. Our task, then, is to discern these tendencies or affordances of our technologies and, on both social and personal levels, choose the holistic, convivial ones.

The Standard Critique of Technology as thus described is cogent and correct. I have referred to it many times and applied it to many different situations. For instance, I have used the logic of the SCT to make a case for rejecting the “walled gardens” of the massive social media companies, and for replacing them with a cultivation of the “digital commons” of the open web.

But the number of people who are even open to following this logic is vanishingly small. For all its cogency, the SCT is utterly powerless to slow our technosocial momentum, much less to alter its direction. Since Postman and the rest made that critique, the social order has rushed ever faster toward a complete and uncritical embrace of the prescriptive, manipulatory technologies deceitfully presented to us as Liberation and Empowerment. So what next?…(More)”.


READ MORE

Give more data, awareness and control to individual citizens, and they will help COVID-19 containment

Feb 02, 2021 01:48 pm

Paper by Mirco Nanni et al: “The rapid dynamics of COVID-19 call for quick and effective tracking of virus transmission chains and early detection of outbreaks, especially in the “phase 2” of the pandemic, when lockdown and other restriction measures are progressively withdrawn, in order to avoid or minimize contagion resurgence. For this purpose, contact-tracing apps are being proposed for large-scale adoption by many countries. A centralized approach, where data sensed by the app are all sent to a nation-wide server, raises concerns about citizens’ privacy and needlessly strong digital surveillance, thus alerting us to the need to minimize personal data collection and avoid location tracking. We advocate the conceptual advantage of a decentralized approach, where both contact and location data are collected exclusively in individual citizens’ “personal data stores”, to be shared separately and selectively (e.g., with a backend system, but possibly also with other citizens), voluntarily, only when the citizen has tested positive for COVID-19, and with a privacy-preserving level of granularity. This approach better protects the personal sphere of citizens and affords multiple benefits: it allows for detailed information gathering for infected people in a privacy-preserving fashion; and, in turn, this enables both contact tracing and the early detection of outbreak hotspots on a more finely-granulated geographic scale. The decentralized approach is also scalable to large populations, in that only the data of positive patients need be handled at a central level. Our recommendation is two-fold. First, to extend existing decentralized architectures with a light touch, in order to manage the collection of location data locally on the device, and allow the user to share spatio-temporal aggregates — if and when they want and for specific aims — with health authorities, for instance. Second, we favour a longer-term pursuit of realizing a Personal Data Store vision, giving users the opportunity to contribute to collective good in the measure they want, enhancing self-awareness, and cultivating collective efforts for rebuilding society….(More)”.
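The personal-data-store idea the authors advocate can be sketched minimally: raw location pings stay on the device, and only coarsened spatio-temporal aggregates are released, and only after a positive test. Class and field names below are illustrative, not from the paper:

```python
# Illustrative sketch (not the authors' protocol): a "personal data store"
# keeps raw (lat, lon, hour) pings local and shares only coarse grid-cell
# counts, and only once the user has tested positive.

from collections import Counter

class PersonalDataStore:
    def __init__(self):
        self._pings = []          # raw pings never leave the device
        self.tested_positive = False

    def record(self, lat, lon, hour):
        self._pings.append((lat, lon, hour))

    def share_aggregates(self, cell_size=0.01):
        """Release coarse grid-cell counts for positive users only."""
        if not self.tested_positive:
            return None  # nothing is shared before a positive test
        cells = Counter(
            (round(lat / cell_size) * cell_size,
             round(lon / cell_size) * cell_size, hour)
            for lat, lon, hour in self._pings
        )
        return dict(cells)

pds = PersonalDataStore()
pds.record(40.7128, -74.0060, 9)
pds.record(40.7129, -74.0061, 9)
print(pds.share_aggregates())   # None: user has not tested positive
pds.tested_positive = True
print(pds.share_aggregates())   # both pings coarsened into a single cell count
```

The coarsening step is where the "privacy-preserving level of granularity" lives: the smaller the grid cell, the finer the hotspot detection and the weaker the privacy protection.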


READ MORE

The problem with prediction

Feb 01, 2021 06:51 pm

Article by Joseph Fridman: “…At precisely the same moment in which the idea of predictive control has risen to dominance within the corporate sphere, it’s also gained a remarkable following within cognitive science. According to an increasingly influential school of neuroscientists, who orient themselves around the idea of the ‘predictive brain’, the essential activity of our most important organ is to produce a constant stream of predictions: predictions about the noises we’ll hear, the sensations we’ll feel, the objects we’ll perceive, the actions we’ll perform and the consequences that will follow. Taken together, these expectations weave the tapestry of our reality – in other words, our guesses about what we’ll see in the world become the world we see. Almost 400 years ago, with the dictum ‘I think, therefore I am,’ René Descartes claimed that cognition was the foundation of the human condition. Today, prediction has taken its place. As the cognitive scientist Anil Seth put it: ‘I predict (myself) therefore I am.’

Somehow, the logic we find animating our bodies is the same one transforming our body politic. The prediction engine – the conceptual tool used by today’s leading brain scientists to understand the deepest essence of our humanity – is also the one wielded by today’s most powerful corporations and governments. How did this happen and what does it mean?

One explanation for this odd convergence emerges from a wider historical tendency: humans have often understood the nervous system in terms of the flourishing technologies of their era, as the scientist and historian Matthew Cobb explained in The Idea of the Brain (2020). Thomas Hobbes, in his book Leviathan (1651), likened human bodies to ‘automata’, ‘[e]ngines that move themselves by springs and wheeles as doth a watch’. What is the heart, Hobbes asked, if not ‘a Spring; and the Nerves, but so many Strings …?’ Similarly, Descartes described animal spirits moving through the nerves according to the same physical properties that animated the hydraulic machines he witnessed on display in the French royal gardens.

The rise of electronic communications systems accelerated this trend. In the middle of the 19th century, the surgeon and chemist Alfred Smee said the brain was made up of batteries and photovoltaic circuits, allowing the nervous system to conduct ‘electro-telegraphic communication’ with the body. Towards the turn of the 20th century, the neuroscientist Santiago Ramón y Cajal described the positioning of different neural structures ‘somewhat as a telegraph pole supports the conducting wire’. And, during the First World War, the British Royal Institution Christmas lectures featured the anatomist and anthropologist Arthur Keith, who compared brain cells to operators in a telephonic exchange.

The technologies that have come to dominate many of our lives today are not primarily hydraulic or photovoltaic, or even telephonic or electro-telegraphic. They’re not even computational in any simplistic sense. They are predictive, and their infrastructures construct and constrain behaviour in all spheres of life. The old layers remain – electrical wiring innervates homes and workplaces, and water flows into sinks and showers through plumbing hidden from view. But these infrastructures are now governed by predictive technologies, and they don’t just guide the delivery of materials, but of information. Predictive models construct the feeds we scroll; they autocomplete our texts and emails, prompt us to leave for work on time, and pick out the playlists we listen to on the commute that they’ve plotted out for us. Consequential decisions in law enforcement, military and financial contexts are increasingly influenced by automated assessments spat out by proprietary predictive engines….(More)”.


READ MORE

Spatial information and the legibility of urban form: Big data in urban morphology

Feb 01, 2021 01:18 pm

Paper by Geoff Boeing: “Urban planning and morphology have relied on analytical cartography and visual communication tools for centuries to illustrate spatial patterns, conceptualize proposed designs, compare alternatives, and engage the public. Classic urban form visualizations – from Giambattista Nolli’s ichnographic maps of Rome to Allan Jacobs’s figure-ground diagrams of city streets – have compressed physical urban complexity into easily comprehensible information artifacts. Today we can enhance these traditional workflows through the Smart Cities paradigm of understanding cities via user-generated content and harvested data in an information management context. New spatial technology platforms and big data offer new lenses to understand, evaluate, monitor, and manage urban form and evolution. This paper builds on the theoretical framework of visual cultures in urban planning and morphology to introduce and situate computational data science processes for exploring urban fabric patterns and spatial order. It demonstrates these workflows with OSMnx and data from OpenStreetMap, a collaborative spatial information system and mapping platform, to examine street network patterns, orientations, and configurations in different study sites around the world, considering what these reveal about the urban fabric. The age of ubiquitous urban data and computational toolkits opens up a new era of worldwide urban form analysis from integrated quantitative and qualitative perspectives….(More)”.
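One quantity such workflows compute is street-orientation entropy: how evenly a network's street bearings spread across the compass (a regular grid concentrates on two orientations, while an organic street fabric spreads out). A self-contained sketch of that calculation follows; it is a simplification for illustration, not OSMnx's implementation:

```python
# Sketch of street-orientation entropy, one of the urban-form measures
# computable from OpenStreetMap street networks. Simplified illustration;
# OSMnx's own bearing/entropy routines differ in detail.

import math
from collections import Counter

def orientation_entropy(bearings_deg, bins=36):
    """Shannon entropy of street bearings, folded onto 0-180° (streets are undirected)."""
    counts = Counter(int((b % 180) // (180 / bins)) for b in bearings_deg)
    total = sum(counts.values())
    return -sum((n / total) * math.log(n / total) for n in counts.values())

# A perfect grid has only two orientations, so entropy is low...
grid = [0, 90, 180, 270] * 25
# ...while bearings spread evenly over the compass approach the maximum, log(bins).
spread = [i * 1.8 for i in range(100)]
print(round(orientation_entropy(grid), 3))    # 0.693 (= log 2)
print(round(orientation_entropy(spread), 3))  # 3.571 (near log 36 ≈ 3.584)
```

In practice the bearings would come from a real street network (e.g. one downloaded with OSMnx from OpenStreetMap) rather than synthetic lists.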


READ MORE

Governance models for redistribution of data value

Feb 01, 2021 12:29 pm

Essay by Maria Savona: “The growth of interest in personal data has been unprecedented. Issues of privacy violation, power abuse, practices of electoral behaviour manipulation unveiled in the Cambridge Analytica scandal, and a sense of imminent impingement of our democracies are at the forefront of policy debates. Yet, these concerns seem to overlook the issue of concentration of equity value (stemming from data value, which I use interchangeably here) that underpins the current structure of big tech business models. Whilst these quasi-monopolies own the digital infrastructure, they do not own the personal data that provide the raw material for data analytics. 

The European Commission has been at the forefront of global action to promote convergence of the governance of data (privacy), including, but not limited to, the General Data Protection Regulation (GDPR) (European Commission 2016), enforced in May 2018. Attempts to enforce similar regulations are emerging around the world, including the California Consumer Privacy Act, which came into effect on 1 January 2020. Notwithstanding greater awareness among citizens around the use of their data, companies find that complying with GDPR is, at best, a useless nuisance. 

Data have been seen as ‘innovation investment’ since the beginning of the 1990s. The first edition of the Oslo Manual, the OECD’s international guidelines for collecting and using data on innovation in firms, dates back to 1992 and included the collection of databases on employee best practices as innovation investments. Data are also measured as an ‘intangible asset’ (Corrado et al. 2009 was one of the pioneering studies). What has changed over the last decade? The scale of data generation today is such that its management and control might have already gone well beyond the capacity of the very tech giants we are all feeding. Concerns around data governance and data privacy might be too little and too late. 

In this column, I argue that economists have failed twice: first, to predict the massive concentration of data value in the hands of large platforms; and second, to account for the complexity of the political economy aspects of data accumulation. Based on a pair of recent papers (Savona 2019a, 2019b), I systematise recent research and propose a novel data rights approach to redistribute data value whilst not undermining the range of ethical, legal, and governance challenges that this poses….(More)”.


READ MORE

From satisficing to artificing: The evolution of administrative decision-making in the age of the algorithm

Feb 01, 2021 07:20 am

Paper by Thea Snow at Data & Policy: “Algorithmic decision tools (ADTs) are being introduced into public sector organizations to support more accurate and consistent decision-making. Whether they succeed turns, in large part, on how administrators use these tools. This is one of the first empirical studies to explore how ADTs are being used by Street Level Bureaucrats (SLBs). The author develops an original conceptual framework and uses in-depth interviews to explore whether SLBs are ignoring ADTs (algorithm aversion); deferring to ADTs (automation bias); or using ADTs together with their own judgment (an approach the author calls “artificing”). Interviews reveal that artificing is the most common use-type, followed by aversion, while deference is rare. Five conditions appear to influence how practitioners use ADTs: (a) understanding of the tool; (b) perception of human judgment; (c) seeing value in the tool; (d) being offered opportunities to modify the tool; and (e) alignment of tool with expectations….(More)”.


READ MORE

The Coup We Are Not Talking About

Jan 31, 2021 08:07 am

Shoshana Zuboff in the New York Times: “Two decades ago, the American government left democracy’s front door open to California’s fledgling internet companies, a cozy fire lit in welcome. In the years that followed, a surveillance society flourished in those rooms, a social vision born in the distinct but reciprocal needs of public intelligence agencies and private internet companies, both spellbound by a dream of total information awareness. Twenty years later, the fire has jumped the screen, and on Jan. 6, it threatened to burn down democracy’s house.

I have spent exactly 42 years studying the rise of the digital as an economic force driving our transformation into an information civilization. Over the last two decades, I’ve observed the consequences of this surprising political-economic fraternity as those young companies morphed into surveillance empires powered by global architectures of behavioral monitoring, analysis, targeting and prediction that I have called surveillance capitalism. On the strength of their surveillance capabilities and for the sake of their surveillance profits, the new empires engineered a fundamentally anti-democratic epistemic coup marked by unprecedented concentrations of knowledge about us and the unaccountable power that accrues to such knowledge.

In an information civilization, societies are defined by questions of knowledge — how it is distributed, the authority that governs its distribution and the power that protects that authority. Who knows? Who decides who knows? Who decides who decides who knows? Surveillance capitalists now hold the answers to each question, though we never elected them to govern. This is the essence of the epistemic coup. They claim the authority to decide who knows by asserting ownership rights over our personal information and defend that authority with the power to control critical information systems and infrastructures….(More)”.


READ MORE

Digital Age Samaritans

Jan 29, 2021 04:10 pm

Paper by Zachary D. Kaufman: “Modern technology enables people to view, document, and share evidence of crimes contemporaneously or soon after commission. Electronic transmission of this material — including through social media and mobile devices — raises legal, moral, and practical questions about spectators’ responsibilities. In the digital age, will these actors be bystanders or upstanders? What role can and should the law play in shaping their behavior?

This Article argues that certain witnesses who are not physically present at the scene of a crime should be held criminally accountable for failing to report specified violent offenses. Focusing on rape, police brutality, and other misconduct, this Article demonstrates that recent technological innovations create new opportunities and challenges to pursue justice and accountability. Such culpability centers on “Bad Samaritan laws”: statutes that impose a legal duty to assist others in peril through intervening directly (also known as “the duty to rescue”) or notifying authorities (also known as “the duty to report”). However, many of these antiquated laws arguably apply only to witnesses who are physically present, which limits their potential effectiveness today.

Not all virtual witnesses should be subject to liability. To consider which categories of actors may warrant criminal punishment, this Article introduces a novel typology of bystanders and upstanders in the digital age. This typology draws on an original case study of the first known sexual crime livestreamed in the United States by a third party, which more than 700 people viewed. Harnessing insights from that case study and other episodes, the Article recommends that legislators should modernize, refine, and proliferate Bad Samaritan laws and that law enforcement should enforce these statutes or leverage them to obtain witness testimony. To that end, the Article proposes a model duty-to-report statute that includes features such as applicability to virtual presence and reasoned exemptions for noncompliance….(More)”.


READ MORE

Personal experiences bridge moral and political divides better than facts

Jan 29, 2021 03:46 pm

Paper by Emily Kubin, Curtis Puryear, Chelsea Schein, and Kurt Gray: “All Americans are affected by rising political polarization, whether because of a gridlocked Congress or antagonistic holiday dinners. People believe that facts are essential for earning the respect of political adversaries, but our research shows that this belief is wrong. We find that sharing personal experiences about a political issue—especially experiences involving harm—helps to foster respect via increased perceptions of rationality. This research provides a straightforward pathway for increasing moral understanding and decreasing political intolerance. These findings also raise questions about how science and society should understand the nature of truth in the era of “fake news.” In moral and political disagreements, everyday people treat subjective experiences as truer than objective facts….(More)”


READ MORE

Digital platforms for development: Foundations and research agenda

Jan 29, 2021 02:35 pm

Paper by Carla Bonina, Kari Koskinen, Ben Eaton, and Annabelle Gawer: “Digital platforms hold a central position in today’s world economy and are said to offer a great potential for the economies and societies in the global South. Yet, to date, the scholarly literature on digital platforms has largely concentrated on business while their developmental implications remain understudied. In part, this is because digital platforms are a challenging research object due to their lack of conceptual definition, their spread across different regions and industries, and their intertwined nature with institutions, actors and digital technologies. The purpose of this article is to contribute to the ongoing debate in information systems and ICT4D research to understand what digital platforms mean for development. To do so, we first define what digital platforms are and differentiate between transaction and innovation platforms, and explain their key characteristics in terms of purpose, research foundations, material properties and business models. We then add the socio-technical context in which digital platforms operate and their linkages to developmental outcomes. We then conduct an extensive review to explore what current areas, developmental goals, tensions and issues emerge in the literature on platforms and development and identify relevant gaps in our knowledge. We later elaborate on six research questions to advance the studies on digital platforms for development: on indigenous innovation, on digital platforms and institutions, on the exacerbation of inequalities, on alternative forms of value, on the dark side of platforms, and on the applicability of the platform typology for development….(More)”.


READ MORE
Have a new article, report, initiative or paper worth sharing with the audience of The Digest? Share it here!

Browse recent issues of The Digest at the Living Library or subscribe now to get our curation in your inbox every week.


Our mailing address is:

TheGovLab, Tandon School of Engineering, NYU
2 MetroTech Center
Floor 9, Brooklyn
New York, NY 11201

Add us to your address book


Want to change how you receive these emails?
You can update your preferences or unsubscribe from this list.