The Practice and Potential of Blockchain Technologies for Extractive Sector Governance

Sep 24, 2020 08:05 am

Press Release: “Important questions are being raised about whether blockchain technologies can contribute to solving governance challenges in the mining, oil and gas sectors. This report seeks to begin addressing such questions, with particular reference to current blockchain applications and transparency efforts in the extractive sector.

It summarizes analysis by The Governance Lab (GovLab) at the New York University Tandon School of Engineering and the Natural Resource Governance Institute (NRGI). The study focused in particular on three activity areas: licensing and contracting, corporate registers and beneficial ownership, and commodity trading and supply chains.

Key messages:

  • Blockchain technology could potentially reduce transparency challenges and information asymmetries in certain parts of the extractives value chain. However, stakeholders considering blockchain technologies need a more nuanced understanding of problem definition, value proposition and blockchain attributes to ensure that such interventions could positively impact extractive sector governance.
  • The blockchain field currently lacks design principles, governance best practices, and open data standards that could ensure that the technology helps advance transparency and good governance in the extractive sector. Our analysis offers an initial set of design principles that could act as a starting point for a more targeted approach to the use of blockchain in improving extractives governance.
  • Most blockchain projects are preliminary concepts or pilots, with little demonstration of how to effectively scale up successful experiments, especially in countries with limited resources.
  • Meaningful impact evaluations or peer-reviewed publications that assess impact, including on the implications of blockchain’s emissions footprint, are still lacking. More broadly, a shared research agenda around blockchain could help address questions that are particularly ripe for future research.
  • Transition to a blockchain-enabled system is likely to be smoother and faster in cases when digital records are already available than when a government or company attempts to move from an analog system to one leveraging blockchain.
  • Companies or governments using blockchain are more likely to implement it successfully when they have a firm grasp of the technology, its strengths, its weaknesses, and how it fits into the broader governance landscape. But these actors are often overly reliant on and empowering of blockchain technology vendors and startups, which can lead to “lock-in,” whereby the market gets stuck with an approach even though participants may be better off with an alternative.
  • The role played by intermediaries like financial institutions or registrars can determine the success or failure of blockchain applications….(More)”.


Exploring Digital Government Transformation in the EU – Understanding public sector innovation in a data-driven society

Sep 24, 2020 07:09 am

Report edited by Misuraca, G., Barcevičius, E. and Codagnone, C.: “This report presents the final results of the research “Exploring Digital Government Transformation in the EU: understanding public sector innovation in a data-driven society”, in short DigiGov. After introducing the design and methodology of the study, the report summarizes the findings of a comprehensive state-of-the-art analysis, conducted by reviewing a vast body of scientific literature, policy documents and practitioner-generated reports across a broad range of disciplines and policy domains, with a focus on the EU. The scope and key dimensions underlying the development of the DigiGov-F conceptual framework are then presented. This is a theory-informed heuristic instrument to help map the effects of Digital Government Transformation and to support the definition of change strategies within the institutional settings of public administration. Further, the report provides an overview of the findings of the empirical case studies conducted, employing experimental or quasi-experimental components, to test and refine the proposed conceptual framework while gathering evidence on the impacts of Digital Government Transformation and identifying real-life drivers and barriers in diverse Member States and policy domains. The report concludes by outlining future research and policy recommendations, as well as depicting possible scenarios for future Digital Government Transformation, developed as the result of a dedicated foresight policy lab. This was conducted as part of the expert consultation and stakeholder engagement process that accompanied all phases of the research implementation. Insights generated from the study also serve to pave the way for further empirical research and policy experimentation, and to contribute to the policy debate on how to shape Digital Europe at the 2040 horizon….(More)”.



Quantified Storytelling: A Narrative Analysis of Metrics on Social Media

Sep 23, 2020 06:28 am

Book by Alex Georgakopoulou, Stefan Iversen and Carsten Stage: “This book interrogates the role of quantification in stories on social media: how do visible numbers (e.g. of views, shares, likes) and invisible algorithmic measurements shape the stories we post and engage with? The links of quantification with stories have not been explored sufficiently in storytelling research or in social media studies, despite the fact that platforms have been integrating sophisticated metrics into developing facilities for sharing stories, with a massive appeal to ordinary users, influencers and businesses alike.

With case studies from Instagram, Reddit and Snapchat, the authors show how three types of metrics, namely content metrics, interface metrics and algorithmic metrics, affect the ways in which cancer patients share their experiences, the circulation of specific stories that mobilize counter-publics, and the design of stories as facilities on platforms. The analyses document how numbers structure elements in stories, indicate and produce engagement, and become resources for the tellers’ self-presentation….(More)”.



Improving data access democratizes and diversifies science

Sep 22, 2020 09:46 pm

Research article by Abhishek Nagaraj, Esther Shears, and Mathijs de Vaan: “Data access is critical to empirical research, but past work on open access is largely restricted to the life sciences and has not directly analyzed the impact of data access restrictions. We analyze the impact of improved data access on the quantity, quality, and diversity of scientific research. We focus on the effects of a shift in the accessibility of satellite imagery data from Landsat, a NASA program that provides valuable remote-sensing data. Our results suggest that improved access to scientific data can lead to a large increase in the quantity and quality of scientific research. Further, better data access disproportionately enables the entry of scientists with fewer resources, and it promotes diversity of scientific research….(More)”



Research 4.0: research in the age of automation

Sep 22, 2020 09:34 pm

Report by Rob Procter, Ben Glover, and Elliot Jones: “There is a growing consensus that we are at the start of a fourth industrial revolution, driven by developments in Artificial Intelligence, machine learning, robotics, the Internet of Things, 3-D printing, nanotechnology, biotechnology, 5G, new forms of energy storage and quantum computing. This report seeks to understand what impact AI is having on the UK’s research sector and what implications it has for its future, with a particular focus on academic research.

Building on our interim report, we find that AI is increasingly deployed in academic research in the UK in a broad range of disciplines. The combination of an explosion of new digital data sources with powerful new analytical tools represents a ‘double dividend’ for researchers. This is allowing researchers to investigate questions that would have been unanswerable just a decade ago. Whilst there has been considerable take-up of AI in academic research, the report highlights that steps could be taken to ensure even wider adoption of these new techniques and technologies, including wider training in the necessary skills for effective utilisation of AI, faster routes to culture change and greater multi-disciplinary collaboration.

This report recognises that the Covid-19 pandemic means universities are currently facing significant pressures, with considerable demands on their resources whilst simultaneously facing threats to income. But as we emerge from the current crisis, we urge policy makers and universities to consider the report’s recommendations and take steps to fortify the UK’s position as a place of world-leading research. Indeed, the current crisis has only reminded us of the critical importance of a highly functioning and flourishing research sector. The report recommends:

  • The current post-16 curriculum should be reviewed to ensure all pupils receive a grounding in the basic digital, quantitative and ethical skills necessary for the effective and appropriate utilisation of AI.
  • A UK-wide audit of research computing and data infrastructure provision should be conducted to consider how access might be levelled up.
  • UK Research and Innovation (UKRI) should consider incentivising institutions to utilise AI wherever it can offer benefits to the economy and society in their future spending on research and development.
  • Universities should take steps to make it easier for researchers to move between academia and industry, for example by putting less emphasis on publications and recognising other outputs and measures of achievement when hiring for academic posts….(More)”.



Reimagining Help

Sep 22, 2020 08:33 pm

Guide by Nesta: “Now more than ever, there is a need to help people live well in their homes and communities. The coronavirus pandemic has highlighted the importance of diversifying sources of help beyond the hospital, and of drawing on support from friends, neighbours, local organisations and charities to ensure people can live healthy lives. We must think more flexibly about what ‘help’ means, and how the right help can make a huge difference.

While medical care is fundamental to saving lives, people need more than a ‘fix’ to live well every day. If we are to support people to reach their goals, we must move away from ʻexpertsʼ holding the knowledge and power, and instead draw on people’s own knowledge, relationships, strengths and purpose to determine solutions that work best for them.

We believe there is an opportunity to ‘reimagine help’ by applying insights from the field of behaviour change research to a wide range of organisations and places – community facilities, local charities and businesses, employment and housing support, as well as health and care services, all of which play a role in supporting people to reach their goals in a way that feels right for them….

Nesta, Macmillan Cancer Support, the British Heart Foundation and the UCL Centre for Behaviour Change have worked together to develop a universal model of ‘Good Help’ underpinned by behavioural evidence, which can be understood and accessed by everyone. We analysed and simplified decades of behaviour change research and practice, and worked with a group of 30 practitioners and people with lived experience to iterate and cross-check the behavioural evidence against real life experiences. Dartington Service Design Lab helped to structure and format the evidence in a way that makes it easy for everyone to understand.

Collectively, we have produced a guide which outlines eight characteristics of Good Help, which aims to support practitioners, system leaders (such as service managers, charity directors or commissioners) and any person working in a direct ‘helping’ organisation to:

  • Understand the behaviour change evidence that underpins Good Help
  • Develop new ideas or adapt offers of Good Help, which can be tested out in their own organisations or local communities….(More)”.


Digital Minilateralism: How governments cooperate on digital governance

Sep 22, 2020 07:29 pm

A policy paper by Tanya Filer and Antonio Weiss: “New research from the Digital State Project argues for the critical function of small, agile, digitally enabled and focused networks of leaders to foster strong international cooperation on digital governance issues.

This type of cooperative working, described as ‘digital minilateralism’, has a role to play in shaping how individual governments learn, adopt and govern the use of new and emerging technologies, and how they create common or aligned policies. It is also important as cross-border digital infrastructure and services become increasingly common….

Key findings: 

  • Already beginning to prove effective, digital minilateralism has a role to play in shaping how individual governments learn, adopt and govern the use of new and emerging technologies, and how they create common or aligned policy.
  • National governments should recognise and reinforce the strategic value of digital minilaterals without stamping out, through over-bureaucratisation, the qualities of trust, open conversation, and ad-hocness in which their value lies.
  • As digital minilateral networks grow and mature, they will need to find mechanisms through which to retain (or adapt) their core principles while scaling across more boundaries.
  • To demonstrate their value to the global community, digital minilaterals must feed into formal multilateral conversations and arrangements. …(More)”.


Sortition, its advocates and its critics: An empirical analysis of citizens’ and MPs’ support for random selection as a democratic reform proposal

Sep 22, 2020 07:20 pm

Paper by Vincent Jacquet et al: “This article explores the prospects of an increasingly debated democratic reform: assigning political offices by lot. While this idea is advocated by political theorists and politicians in favour of participatory and deliberative democracy, the article investigates the extent to which citizens and MPs actually endorse different variants of ‘sortition’. We test for differences among respondents’ social status, disaffection with elections and political ideology. Our findings suggest that MPs are largely opposed to sortitioning political offices when their decision-making power is more than consultative, although leftist MPs tend to be in favour of mixed assemblies (involving elected and sortitioned members). Among citizens, random selection seems to appeal above all to disaffected individuals with a lower social status. The article ends with a discussion of the political prospects of sortition being introduced as a democratic reform…(More).”



The Cruel New Era of Data-Driven Deportation

Sep 22, 2020 03:50 pm

Article by Alvaro M. Bedoya: “For a long time, mass deportations were a small-data affair, driven by tips, one-off investigations, or animus-driven hunches. But beginning under George W. Bush, and expanding under Barack Obama, ICE leadership started to reap the benefits of Big Data. The centerpiece of that shift was the “Secure Communities” program, which gathered the fingerprints of arrestees at local and state jails across the nation and compared them with immigration records. That program quickly became a major driver for interior deportations. But ICE wanted more data. The agency had long tapped into driver address records through law enforcement networks. Eyeing the breadth of DMV databases, agents began to ask state officials to run face recognition searches on driver photos against the photos of undocumented people. In Utah, for example, ICE officers requested hundreds of face searches starting in late 2015. Many immigrants avoid contact with any government agency, even the DMV, but they can’t go without heat, electricity, or water; ICE aimed to find them, too. So, that same year, ICE paid for access to a private database that includes the addresses of customers from 80 national and regional electric, cable, gas, and telephone companies.

Amid this bonanza, at least, the Obama administration still acknowledged red lines. Some data were too invasive, some uses too immoral. Under Donald Trump, these limits fell away.

In 2017, breaking with prior practice, ICE started to use data from interviews with scared, detained kids and their relatives to find and arrest more than 500 sponsors who stepped forward to take in the children. At the same time, ICE announced a plan for a social media monitoring program that would use artificial intelligence to automatically flag 10,000 people per month for deportation investigations. (It was scuttled only when computer scientists helpfully indicated that the proposed system was impossible.) The next year, ICE secured access to 5 billion license plate scans from public parking lots and roadways, a hoard that tracks the drives of 60 percent of Americans—an initiative blocked by Department of Homeland Security leadership four years earlier. In August, the agency cut a deal with Clearview AI, whose technology identifies people by comparing their faces not to millions of driver photos, but to 3 billion images from social media and other sites. This is a new era of immigrant surveillance: ICE has transformed from an agency that tracks some people sometimes to an agency that can track anyone at any time….(More)”.



AI planners in Minecraft could help machines design better cities

Sep 22, 2020 01:30 pm

Article by Will Douglas Heaven: “A dozen or so steep-roofed buildings cling to the edges of an open-pit mine. High above them, on top of an enormous rock arch, sits an inaccessible house. Elsewhere, a railway on stilts circles a group of multicolored tower blocks. Ornate pagodas decorate a large paved plaza. And a lone windmill turns on an island, surrounded by square pigs. This is Minecraft city-building, AI style.

Minecraft has long been a canvas for wild invention. Fans have used the hit block-building game to create replicas of everything from downtown Chicago and King’s Landing to working CPUs. In the decade since its first release, anything that can be built has been.

Since 2018, Minecraft has also been the setting for a creative challenge that stretches the abilities of machines. The annual Generative Design in Minecraft (GDMC) competition asks participants to build an artificial intelligence that can generate realistic towns or villages in previously unseen locations. The contest is just for fun, for now, but the techniques explored by the various AI competitors are precursors of ones that real-world city planners could use….(More)”.



Smart Rural: The Open Data Gap

Sep 22, 2020 06:42 am

Paper by Johanna Walker et al: “The smart city paradigm has underpinned a great deal of the use and production of open data for the benefit of policymakers and citizens. This paper posits that this further enhances the existing urban-rural divide. It investigates the availability and use of rural open data along two parameters: pertaining to rural populations, and to key parts of the rural economy (agriculture, fisheries and forestry). It explores the relationship between key statistics of national/rural economies and rural open data, and the use and users of rural open data where it is available. It finds that although countries with more rural populations are not necessarily earlier in their Open Data Maturity journey, there is still a lack of institutionalisation of open data in rural areas; that there is an apparent gap between the importance of agriculture to a country’s GDP and the amount of agricultural data published openly; and lastly, that the smart city paradigm cannot simply be transferred to the rural setting. It suggests instead the adoption of the emerging ‘smart region’ paradigm as that most likely to support the specific data needs of rural areas….(More)”.



Why Coming Up With Effective Interventions To Address COVID-19 Is So Hard

Sep 21, 2020 05:51 pm

Article by Neil Lewis Jr.: “It has been hard to measure the effects of the novel coronavirus. Not only is COVID-19 far-reaching — it’s touched nearly every corner of the globe at this point — but its toll on society has also been devastating. It is responsible for the deaths of over 905,000 people around the world, and more than 190,000 people in the United States alone. The associated economic fallout has been crippling. In the U.S., more people lost their jobs in the first three months of the pandemic than in the first two years of the Great Recession. Yes, there are some signs the economy might be recovering, but the truth is, we’re just beginning to understand the pandemic’s full impact, and we don’t yet know what the virus has in store for us.

This is all complicated by the fact that we’re still figuring out how best to combat the pandemic. Without a vaccine readily available, it has been challenging to get people to engage in enough of the behaviors that can help slow the virus. Some policy makers have turned to social and behavioral scientists for guidance, which is encouraging because this doesn’t always happen. We’ve seen many universities ignore the warnings of behavioral scientists and reopen their campuses, only to have to quickly shut them back down.

But this has also meant that there are a lot of new studies to wade through. In the field of psychology alone, between Feb. 10 and Aug. 30, 541 papers about COVID-19 were uploaded to the field’s primary preprint server, PsyArXiv. With so much research to sift through, it’s hard to know what to trust — and I say that as someone who makes a living researching what types of interventions motivate people to change their behaviors.

As I tell my students, if you want to use behavioral science research to address real-world problems, you have to look very closely at the details. Often, a simple question like, “What research should policy makers and practitioners use to help combat the pandemic?” is surprisingly difficult to answer.

For starters, there are often key differences between the lab (or the people and situations some social scientists typically study as part of our day-to-day research) and the real world (or the people and situations policy-makers and practitioners have in mind when crafting interventions).

Take, for example, the fact that social scientists tend to study people from richer countries that are generally highly educated, industrialized, democratic and in the Western hemisphere. And some social scientific fields (e.g., psychology) focus overwhelmingly on whiter, wealthier and more highly educated groups of people within those nations.

This is a major issue in the social sciences and something that researchers have been talking about for decades. But it’s important to mention now, too, as Black and brown people have been disproportionately affected by the coronavirus — they are dying at much higher rates than white people and working more of the lower-paying “essential” jobs that expose them to greater risks. Here you can start to see very real research limitations creep in: The people whose lives have been most adversely affected by the virus have largely been excluded from the studies that are supposed to help them. When samples and the methods used are not representative of the real world, it becomes very difficult to reach accurate and actionable conclusions….(More)”.



Emerging models of data governance in the age of datafication

Sep 21, 2020 03:58 pm

Paper by Marina Micheli et al: “The article examines four models of data governance emerging in the current platform society. While major attention is currently given to the dominant model of corporate platforms collecting and economically exploiting massive amounts of personal data, other actors, such as small businesses, public bodies and civic society, also take part in data governance. The article sheds light on four models emerging from the practices of these actors: data sharing pools, data cooperatives, public data trusts and personal data sovereignty. We propose a social science-informed conceptualisation of data governance. Drawing from the notion of data infrastructure, we identify the models as a function of the stakeholders’ roles, their interrelationships, articulations of value, and governance principles. Addressing the politics of data, we consider the actors’ competitive struggles for governing data. This conceptualisation brings to the forefront the power relations and multifaceted economic and social interactions within data governance models emerging in an environment mainly dominated by corporate actors. These models highlight that civic society and public bodies are key actors for democratising data governance and redistributing value produced through data. Through the discussion of the models, their underpinning principles and limitations, the article aims to inform future investigations of socio-technical imaginaries for the governance of data, particularly now that the policy debate around data governance is very active in Europe….(More)”.



Enhancing Digital Equity

Sep 21, 2020 06:19 am

Book by Massimo Ragnedda on “Connecting the Digital Underclass…This book highlights how, in principle, digital technologies present an opportunity to reduce social disparities, tackle social exclusion, enhance social and civil rights, and promote equity. However, to achieve these goals, it is necessary to promote digital equity and connect the digital underclass.

The book focuses on how the advent of technologies may become a barrier to social mobility and how, by concentrating resources and wealth in few hands, the digital revolution is giving rise to the digital oligarchy, further penalizing the digital underclass. Socially disadvantaged people, living at the margins of digital society, are penalized both in terms of accessing-using-benefits (three levels of digital divide) and in understanding-programming-treatment of new digital technologies (three levels of algorithms divide). The advent and implementation of tools that rely on algorithms to make decisions has further penalized specific social categories by normalizing inequalities in the name of efficiency and rationalization….(More)”.



Coding Democracy

Sep 20, 2020 06:30 am

Book by Maureen Webb: “Hackers have a bad reputation, as shady deployers of bots and destroyers of infrastructure. In Coding Democracy, Maureen Webb offers another view. Hackers, she argues, can be vital disruptors. Hacking is becoming a practice, an ethos, and a metaphor for a new wave of activism in which ordinary citizens are inventing new forms of distributed, decentralized democracy for a digital era. Confronted with concentrations of power, mass surveillance, and authoritarianism enabled by new technology, the hacking movement is trying to “build out” democracy into cyberspace.

Webb travels to Berlin, where she visits the Chaos Communication Camp, a flagship event in the hacker world; to Silicon Valley, where she reports on the Apple-FBI case, the significance of Russian troll farms, and the hacking of tractor software by desperate farmers; to Barcelona, to meet the hacker group XNet, which has helped bring nearly 100 prominent Spanish bankers and politicians to justice for their role in the 2008 financial crisis; and to Harvard and MIT, to investigate the institutionalization of hacking. Webb describes an amazing array of hacker experiments that could dramatically change the current political economy. These ambitious hacks aim to displace such tech monoliths as Facebook and Amazon; enable worker cooperatives to kill platforms like Uber; give people control over their data; automate trust; and provide citizens a real say in governance, along with the capacity to reach consensus. Coding Democracy is not just another optimistic declaration of technological utopianism; instead, it provides the tools for an urgently needed upgrade of democracy in the digital era….(More)”.



Models and Modeling in the Sciences: A Philosophical Introduction

Sep 19, 2020 08:33 pm

Book by Stephen M. Downes: “Biologists, climate scientists, and economists all rely on models to move their work forward. In this book, Stephen M. Downes explores the use of models in these and other fields to introduce readers to the various philosophical issues that arise in scientific modeling. Readers learn that paying attention to models plays a crucial role in appraising scientific work. 

This book first presents a wide range of models from a number of different scientific disciplines. After assembling some illustrative examples, Downes demonstrates how models shed light on many perennial issues in philosophy of science and in philosophy in general. Reviewing the range of views on how models represent their targets introduces readers to the key issues in debates on representation, not only in science but in the arts as well. Also, standard epistemological questions are cast in new and interesting ways when readers confront the question, “What makes for a good (or bad) model?”…(More)”.



How Algorithms Can Fight Bias Instead of Entrench It

Sep 19, 2020 07:51 pm

Essay by Tobias Baer: “…How can we build algorithms that correct for biased data and that live up to the promise of equitable decision-making?

When we consider changing an algorithm to eliminate bias, it is helpful to distinguish what we can change at three different levels (from least to most technical): the decision algorithm, formula inputs, and the formula itself.

In discussing the levels, I will use a fictional example, involving Martians and Zeta Reticulans. I do this because picking a real-life example would, in fact, be stereotyping—I would perpetuate the very biases I try to fight by reiterating a simplified version of the world, and every time I state that a particular group of people is disadvantaged, I also can negatively affect the self-perception of people who consider themselves members of these groups. I do apologize if I unintentionally insult any Martians reading this article!

On the simplest and least technical level, we would adjust only the overall decision algorithm that takes one or more statistical formulas (typically to predict unknown outcomes such as academic success, recidivism, or marital bliss) as an input and applies rules to translate the predictions of these formulas into decisions (e.g., by comparing predictions with externally chosen cutoff values or contextually picking one prediction over another). Such rules can be adjusted without touching the statistical formulas themselves.

An example of such an intervention is called boxing. Imagine you have a score of astrological ability. The astrological ability score is a key criterion for shortlisting candidates for the Interplanetary Economic Forecasting Institute. You would have no objective reason to believe that Martians are any less apt at prognosticating white noise than Zeta Reticulans; however, due to racial prejudice in our galaxy, Martian children tend to get asked a lot less for their opinion and therefore have a lot less practice in gabbing than Zeta Reticulans, and as a result only one percent of Martian applicants achieve the minimum score required to be hired for the Interplanetary Economic Forecasting Institute as compared to three percent of Zeta Reticulans.

Boxing would posit that for hiring decisions to be neutral of race, for each race two percent of applicants should be eligible, and boxing would achieve it by calibrating different cut-off scores (i.e., different implied probabilities of astrological success) for Martians and Zeta Reticulans.
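The boxing adjustment described above can be sketched in a few lines of code: given each group's score distribution, calibrate a separate cutoff so that every group admits the same share of applicants. The distributions, group names, and the 2 percent target below are all invented for illustration, following the essay's fictional example; this is a sketch of the idea, not anyone's production hiring system.

```python
import random

random.seed(0)

# Hypothetical astrological-ability scores for two applicant pools.
# The Martian distribution is shifted lower, mirroring the practice
# gap described in the text -- not any difference in real aptitude.
martians = [random.gauss(50, 10) for _ in range(10_000)]
zeta_reticulans = [random.gauss(55, 10) for _ in range(10_000)]

def boxed_cutoff(scores, admit_rate):
    """Return the score cutoff that admits the top `admit_rate` share."""
    ranked = sorted(scores, reverse=True)
    k = int(len(ranked) * admit_rate)  # number of applicants to admit
    return ranked[k - 1]

# A single global cutoff admits far fewer than 2% of Martians,
# because their scores skew lower...
global_cut = boxed_cutoff(martians + zeta_reticulans, 0.02)
martian_rate = sum(s >= global_cut for s in martians) / len(martians)

# ...while "boxing" calibrates one cutoff per group, so that each
# group admits exactly its top 2%.
cut_m = boxed_cutoff(martians, 0.02)
cut_z = boxed_cutoff(zeta_reticulans, 0.02)
```

Note that only the decision rule changes: the underlying score (the "statistical formula") is untouched, which is what makes this a level-one intervention.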

Another example of a level-one adjustment would be to use multiple rank-ordering scores and to admit everyone who achieves a high score on any one of them. This approach is particularly well suited if you have different methods of assessment at your disposal, but each method implies a particular bias against one or more subsegments. An example of a crude version of this approach is admissions to medical school in Germany, where routes include college grades, a qualitative assessment through an interview, and a waitlist….(More)”.
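The multiple-routes adjustment can be sketched the same way. The applicants, routes, and cutoff values below are all hypothetical (they are not the actual German admission rules); the point is only the decision rule, in which clearing the bar on any single route is sufficient.

```python
# Hypothetical applicants assessed via three independent routes.
applicants = [
    {"name": "A", "grades": 1.2, "interview": 6.0, "wait_semesters": 2},
    {"name": "B", "grades": 2.8, "interview": 9.1, "wait_semesters": 1},
    {"name": "C", "grades": 3.0, "interview": 5.5, "wait_semesters": 15},
    {"name": "D", "grades": 2.5, "interview": 7.0, "wait_semesters": 4},
]

# Illustrative route-specific cutoffs. Each route ranks applicants on a
# different scale (German grades: lower is better).
routes = {
    "grades":         lambda a: a["grades"] <= 1.5,
    "interview":      lambda a: a["interview"] >= 8.0,
    "wait_semesters": lambda a: a["wait_semesters"] >= 12,
}

def admitted(applicant):
    """Admit if the applicant clears the bar on ANY single route."""
    return any(passes(applicant) for passes in routes.values())

admitted_names = [a["name"] for a in applicants if admitted(a)]
print(admitted_names)  # A via grades, B via interview, C via the waitlist
```

Because each route carries a different bias, taking the union of eligible applicants gives candidates disadvantaged by one assessment method a chance to qualify through another — again without modifying any of the underlying scoring formulas.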


READ MORE

Global citizen deliberation on genome editing

Sep 19, 2020 07:20 pm

Essay by John S. Dryzek et al at Science: “Genome editing technologies provide vast possibilities for societal benefit, but also substantial risks and ethical challenges. Governance and regulation of such technologies have not kept pace in a systematic or internationally consistent manner, leaving a complex, uneven, and incomplete web of national and international regulation (1). How countries choose to regulate these emergent technologies matters not just locally, but globally, because the implications of technological developments do not stop at national boundaries. Practices deemed unacceptable in one country may find a more permissive home in another: not necessarily through national policy choice, but owing to a persistent national legal and regulatory void that enables “ethics dumping” (2)—for example, if those wanting to edit genes to “perfect” humans seek countries with little governance capacity. Just as human rights are generally recognized as a matter of global concern, so too should technologies that may impinge on the question of what it means to be human. Here we show how, as the global governance vacuum is filled, deliberation by a global citizens’ assembly should play a role, for legitimate and effective governance….(More)”.


READ MORE

Politicians should take citizens’ assemblies seriously

Sep 19, 2020 04:44 pm

The Economist: “In 403 BC Athens decided to overhaul its institutions. A disastrous war with Sparta had shown that direct democracy, whereby adult male citizens voted on laws, was not enough to stop eloquent demagogues from getting what they wanted, and indeed from subverting democracy altogether. So a new body, chosen by lot, was set up to scrutinise the decisions of voters. It was called the nomothetai or “layers down of law” and it would be given the time to ponder difficult decisions, unmolested by silver-tongued orators and the schemes of ambitious politicians.

This ancient idea is back in vogue, and not before time. Around the world “citizens’ assemblies” and other deliberative groups are being created to consider questions that politicians have struggled to answer (see article). Over weeks or months, 100 or so citizens—picked at random, but with a view to creating a body reflective of the population as a whole in terms of gender, age, income and education—meet to discuss a divisive topic in a considered, careful way. Often they are paid for their time, to ensure that it is not just political wonks who sign up. At the end they present their recommendations to politicians. Before covid-19 these citizens met in conference centres in large cities where, by mingling over lunch-breaks, they discovered that the monsters who disagree with them turned out to be human after all. Now, as a result of the pandemic, they mostly gather on Zoom.

Citizens’ assemblies are often promoted as a way to reverse the decline in trust in democracy, which has been precipitous in most of the developed world over the past decade or so. Last year the majority of people polled in America, Britain, France and Australia—along with many other rich countries—felt that, regardless of which party wins an election, nothing really changes. Politicians, a common complaint runs, have no understanding of, or interest in, the lives and concerns of ordinary people.

Citizens’ assemblies can help remedy that. They are not a substitute for the everyday business of legislating, but a way to break the deadlock when politicians have tried to deal with important issues and failed. Ordinary people, it turns out, are quite reasonable. A large four-day deliberative experiment in America softened Republicans’ views on immigration; Democrats became less eager to raise the minimum wage. Even more strikingly, two 18-month-long citizens’ assemblies in Ireland showed that the country, despite its deep Catholic roots, was far more socially liberal than politicians had realised. Assemblies overwhelmingly recommended the legalisation of both same-sex marriage and abortion….(More)”.


READ MORE

The forecasting fallacy

Sep 19, 2020 04:42 pm

Essay by Alex Murrell: “Marketers are prone to a prediction.

You’ll find them in the annual tirade of trend decks. In the PowerPoint projections of self-proclaimed prophets. In the feeds of forecasters and futurists. They crop up on every conference stage. They make their mark on every marketing magazine. And they work their way into every white paper.

To understand the extent of our forecasting fascination, I analysed the websites of three management consultancies looking for predictions with time frames ranging from 2025 to 2050. Whilst one prediction may be published multiple times, the size of the numbers still shocked me. Deloitte’s site makes 6904 predictions. McKinsey & Company make 4296. And Boston Consulting Group, 3679.

In total, these three companies’ websites include just shy of 15,000 predictions stretching out over the next 30 years.

But it doesn’t stop there.

My analysis finished in the year 2050 not because the predictions came to an end but because my enthusiasm did.

Search the sites and you’ll find forecasts stretching all the way to the year 2100. We’re still finding our feet in this century but some, it seems, already understand the next.

I believe the vast majority of these to be not forecasts but fantasies. Snake oil dressed up as science. Fiction masquerading as fact.

This article assesses how predictions have performed in five fields. It argues that poor projections have propagated throughout our society and proliferated throughout our industry. It argues that our fixation with forecasts is fundamentally flawed.

So instead of focussing on the future, let’s take a moment to look at the predictions of the past. Let’s see how our projections panned out….

Viewed through the lens of Tetlock, it becomes clear that the 15,000 predictions with which I began this article are not forecasts but fantasies.

The projections look precise. They sound scientific. But these forecasts are nothing more than delusions with decimal places. Snake oil dressed up as statistics. Fiction masquerading as fact. They provide a feeling of certainty but they deliver anything but.

In his 1998 book The Fortune Sellers, the business writer William A. Sherden quantified our consensual hallucination: 

“Each year the prediction industry showers us with $200 billion in (mostly erroneous) information. The forecasting track records for all types of experts are universally poor, whether we consider scientifically oriented professionals, such as economists, demographers, meteorologists, and seismologists, or psychic and astrological forecasters whose names are household words.” 

The comparison between professional predictors and fortune tellers is apt.

From tarot cards to tea leaves, palmistry to pyromancy, clear visions of cloudy futures have always been sold to susceptible audiences. 

Today, marketers are one such audience.

It’s time we opened our eyes….(More)”.


READ MORE
Have a new article, report, initiative or paper worth sharing with the audience of The Digest? Share it here!

Browse recent issues of The Digest at the Living Library or subscribe now to get our curation in your inbox every week.


Our mailing address is:

TheGovLab, Tandon School of Engineering, NYU
2 MetroTech Center
Floor 9, Brooklyn
New York, NY 11201

Add us to your address book


Want to change how you receive these emails?
You can update your preferences or unsubscribe from this list.