title: The Surround Sound of Technology as an Accelerator of Social Good
authors: Peterson, Gayle; Yawson, Robert; J. K., Ellen; Nicholls, Jeremy
date: 2020-10-28
journal: Navigating Big Finance and Big Technology for Global Change
DOI: 10.1007/978-3-030-40712-4_5

Social media, artificial intelligence (AI), blockchain technology, mobile technology, and fintech are just some of the technologies that have revolutionized the way that our society and institutions transact, interact, store data, and develop. In this chapter we discuss the promise and perils of technology as it relates to social impact and investing, including the myths and realities of sociotechnical implementation and the balance of rights.

Privacy, management of personal information, and access are all crucial sociotechnical issues. Amid an ever-changing world, where there is still a great divide in who has access to technology and finance and who controls their implementation, it is vital that Deliberate Leaders carefully think through the sociotechnical implications of technology. It is also important to consider the longer-term implications of technological adoption. For instance, technology brings promise and peril when it comes to helping address global poverty and systemic change.

In her 2019 article Banking on the Future of Women, Sarah Hendriks, Director of the Gender Equality program at the Bill & Melinda Gates Foundation, addressed the importance of digital financial services as a key part of sociotechnical solutions, in particular for the 40% of women in the developing world who are among the world's poor. She notes, "Poverty is not a single fact or condition, but rather a collection of them: a lack of financial assets, a lack of access to property, and a lack of voice in one's community" (Hendriks 2019). She cites mobile money, debit and credit cards, and e-commerce platforms as game changing, particularly for women in the developing world. She describes, for instance, the situation of unbanked female garment workers in Bangladesh. In the past, they had to surrender their cash earnings to a spouse or family member and had little say over how the money was spent (Hendriks 2019). After wages became electronic, 69% of women reported more control over their savings. Similarly, Liberian schoolteachers often made a 10-hour journey to collect their paychecks. After wages and salaries were digitized, travel time was reduced by 90% (Hendriks 2019).

As these examples illustrate, technology as a tool holds the promise of helping people address Wicked Problems. Yet, as technical solutions grow in complexity and speed, concerns regarding the protection of individual rights and security, issues of self-regulation, and questions of accountability for harm grow exponentially. In this chapter, we explore the promise and perils of specific technologies as they address Wicked Problems and the UN Sustainable Development Goals (SDGs). We discuss the difficult choices leaders face in balancing competing "needs": the transactional needs of individuals; the business need to collect personal information; and the need to ensure the sovereignty of individuals' rights to privacy and control of their data. Throughout the chapter, the principles of Deliberate Leadership offer guideposts to help leaders manage these competing interests while helping to address complex, systemic problems.
As a tool, technology can be a powerful enabler for Big Finance to do more to achieve social good. Consider three positive, poverty-alleviating technological solutions. First, expanding financial inclusion to the poor. As of 2018, an estimated 1.7 billion people comprised the world's "unbanked" (World Bank 2018). It is estimated that providing financial services to the world's poor could add more than $250 billion to global GDP. The unbanked lack the financial infrastructure and access that would allow them to use, transfer, secure, and grow their money. Technology is seen as a way of opening up banking and financial transfer systems to the poor.

Second, technology is seen as a tool to accelerate and secure the transfer of money and aid. Today, transfers of money and aid are often slow and insecure. Many aid organizations and technology companies are piloting ways to address this problem through private blockchains and common ledger technology combined with other technologies such as AI and biometrics, as well as the use of cryptocurrencies.

Third, technology is being piloted to solve problems in verifying, documenting, and facilitating individual rights and executing contracts, so that people have control over their property and money and corruption is minimized.

While the tools of technology can play an important role in addressing these and other social problems, technology alone is not applicable to all sustainable development challenges. It can also bring unforeseen problems that cause harm. This is particularly true when commercial technology solutions are adapted to complex social issues. History shows several common technology myths, including:

• The myth of common vision: technology will be used in the social sphere in the same way that developers and organizations envision.
• The myth of a common frame: those implementing sociotechnical solutions share a common frame of reference when it comes to ideologies, laws, and policies, including common norms and ideals of freedom, individual rights, and democracy.
• The myth of effective self-regulation: in the absence of laws and regulations (particularly in states with weak rule of law and among vulnerable communities), sociotechnical solutions implemented through self-regulation will be secured and monitored, and individual rights protected, such that the public is not susceptible to harm.

When these myths are treated as truth, big problems can arise. These include the unintended use of technology, violations of privacy rights, conflicts over property and ownership rights, and issues involving control over technology. Other problems include the lack of technological security, the lack of liability for unintended harm, and problems addressing service levels and performance. Compounding these problems is the fact that technology is inconsistently regulated under national and international law. As technology spans international borders and enters nation states with either weak rule of law or, on the flip side, overbearing restrictions, society must rely on private organizations, corporations, and consortia to self-regulate when it comes to sociotechnical decision-making. Thus, vulnerable individuals (particularly the poor and marginalized) whose individual rights are not adequately protected by local, national, and international laws and standards, and who are not developers of the technology itself, often have very little or no recourse for harm done to them as a result of sociotechnical implementation.
The result, intended or not, is that the world's poor risk being exposed to a range of technologies that are not designed with their personal well-being in mind. Such technologies are designed with commercial purposes in mind, and users may be unaware of the harm they are exposed to if their personal information is collected, stored on the systems of private companies or consortia, coded, and used in ways that users never intended.

Once the three technology myths are exposed, the need for conscious, purposeful human leadership becomes apparent. Technology cannot replace Deliberate Leadership, which requires leaders to think, feel, imagine, and exercise moral judgment in finding solutions to Wicked Problems, in ways only humans can. Adoption of the Deliberate Leadership principles set out in Chapter 2 can help manage the use of technology in order to increase its benefits and reduce its harm. Highlighted throughout the chapter are the Deliberate Leadership principles necessary to harness technology for social good, including compassion for others; seeking out community voices and placing them at the center of the table (i.e., the technology is in service to them, not vice versa); collaboration with others working to address the same or related Wicked Problems; candor to speak the truth about what is and isn't working with a technology, especially in terms of serving social good and doing no harm; and technology's support for all forms of capital (social, environmental, and financial) to address Wicked Problems.

Before leaders undertake any technology solution, they first should answer two important questions:

• How do we ensure that revolutionary innovations and good intentions in using technology translate to meaningful progress rather than harm?
• Do we have the foresight to avoid harm to society and the sad refrain, "If only we'd known"?

Intentional and authentic application of Deliberate Leadership principles provides a superstructure to help answer those questions. In our research and interviews we find two things. First, in the rush to apply technology to solve either Wicked Problems or transactional problems, leaders do not adequately address these questions up front. Second, liability for harm to individual rights arising from technical solutions applied to the social sphere is not adequately addressed. Answering these questions requires a new understanding of technology and the challenges of sociotechnical implementation, which is often uncomfortable to discuss and too often not honestly addressed by technology leaders.

We look at these challenges through three critical structural issues with Big Tech and Big Finance. Each raises the issue of who controls access to the technologies, and the assets and voices of those using the technologies. Essentially, each challenge is about democratic control and engagement, both of which are foundational aspects of Deliberate Leadership. The promise and perils of Big Tech are nowhere more apparent than in the issue of control. Technology's predominance in the social sphere (the proliferation of free accounts on social media, the internet, and communications and mobile technology) promises access, assets, and voice to society. But at the same time, there are perils when technologies that are widely used and adopted by a diverse and global body of people are controlled by a handful of tech investors and executives who exercise corporate control over the implementation of technology.
This tiny group has strategic and voting control over many of the leading technology corporations, and the sociotechnical direction they take is an important issue that today's leaders from both the private and public sectors must be prepared to address. For example, tech entrepreneur founders such as Mark Zuckerberg of Facebook (2.38 billion monthly users) and Sergey Brin and Larry Page of Alphabet/Google (which had seven products with 1 billion users as of 2016 (Harding 2016)), along with other tech entrepreneurs worldwide, have more powerful voting rights than other stockholders, giving them voting control over their companies and their corporate and sociotechnical decisions even though the firms' economic interests are more widely held across a diversity of shareholders (Govindarajan et al. 2018). For instance, Facebook's CEO Mark Zuckerberg controls over 50% of the voting power of Facebook. In 2016, Facebook "issued a new class of stock that [allowed Zuckerberg] … to maintain control over the company even if he sells or gives away most of his shares" (Ingram 2016). Because of his voting control and final say over the company's decisions, Zuckerberg is ultimately responsible for them. In testimony before the United States Senate on April 10, 2018 over the privacy breach of the accounts of millions of Facebook users, Zuckerberg apologized and acknowledged his responsibility for the sociotechnical harm that arose in the privacy breach. He stated: "We didn't take a broad enough view of our responsibility, and that was a big mistake. It was my mistake, and I'm sorry. I started Facebook, I run it, and I'm responsible for what happens here" (Zuckerberg 2018). Some tech companies, such as Twitter (330 million monthly active users as of the first quarter of 2019 (Statista 2020c)), do not have this share structure, but even they retain the right to issue preferred stock with special voting rights (Kafka 2013). The practice of concentrating control over a company's decision-making by placing majority voting power in the hands of a concentrated few blends "public shares with the private-equity model- [and] many investors bristle at its undemocratic nature" (Tan and Robertson 2017).

A second major issue facing not only Big Tech but also Big Finance is the role and governance of digital assets such as cryptocurrency. Explored here are the challenges, promises, and pitfalls of the blockchain and distributed ledger technology that underlie digital assets. Today there are two competing overarching visions of blockchain and distributed ledger technology that are driving its development and implementation. The "public" vision is one based on a 100% independent, distributed, decentralized, autonomous blockchain, such as the public blockchain that underlies Bitcoin. Computer scientist Roshaan Khan describes this vision as one of a supplemental world economy where "code is the law and you don't have to rely on the nature of humans" (Khan 2019). For example, Bitcoin's public blockchain is distributed, decentralized, and autonomous, with its code verified by the consensus of the public at large. No single individual, group, or company can exert control over the blockchain or the verification of its code. For instance, Bitcoin and the public blockchain vision support the notion that even if you come from a country that has hyperinflation or conflict, you could transact in bitcoin independent of the government's fiscal policy and tap into a supplemental world economy that is not reliant on national law or policy.
The second, "private" vision is just that-an organization (such as a financial institution, corporation, non-governmental organization [NGO], etc.) can have some control over who can join the network and code verification and so they use private blockchain technology or a distributed ledger technology. The vision is that private organizations can exercise some control over code verification while enjoying the benefits of a distributed ledger. Many NGOs and companies are piloting private blockchains in the social space. Consortium blockchains which combine characteristics of public and private blockchains and involve groups of organizations as well as common ledger technology such as Ripple also have greater control. Blockchain technology and distributed ledger technology has seen a meteoritic rise in both investment and interest for its application in the social sphere. Some advocates hail it as the holy grail-a game changer for solving Wicked Problems. Yet, it comes with a set of implementation challenges as well, ones that if met with a Deliberate Leadership approach could lead to less social harm being done, and social good being optimized. In social finance, the notion of blockchain as a force for good is largely premised on the promise that: (1) technology will be used and applied in the way that developers and investors originally intended it to be used; (2) partners, users, governments, and social impact investors share the same institutional, moral, and ethical view of how technology will and should be used, developed, and applied in the social sphere; and (3) technology is immutable and free from vulnerability. These promises form many of the core preconceptions of sociotechnical implementation. The truth is that if these promises are breached, it can bring harm to the lives and welfare of people. Furthermore, there can be a general lack of remedy for breaches that cause harm particularly in vulnerable and poor communities where not only can the rule of law be weak, but the distance (physically, technologically, culturally, ideologically, etc.) between technology developers and communities can be very great. In Chapter 4, we mentioned that a weakness of impact accounting at present is that the intended beneficiaries of impact investing (e.g., the poor and marginalized) were excluded from the accounting process. This is in stark contrast to financial accounting, which is targeted at a particular group of stakeholders, i.e., investors. A similar weakness to do with control and accountability is apparent in cryptocurrency. When a sociotechnical initiative is launched, there is generally great enthusiasm and high hopes for success. Over time though, implementation challenges, costs, and liability for the sociotechnical program can cause service levels and performance to wane. Service levels and performance are of the major legal issues regarding the implementation of blockchain technology projects in the commercial world (McKinlay 2018). When technologies are deployed in the social space there can be even fewer financial incentives for vendors to commit to performance assurances. This is particularly true when an investor or private third party financing the project runs out of funding or loses interest in the social impact project. We find that there is very little if any insurance when social impact projects are undertaken that service levels and performance will continue in the event that a social impact investor runs out of money, resources, or loses interest in the initiative. 
This can leave people who choose to rely on the technology and data with solutions that are no longer serviced or supported. This loss of access to technology and data resources can harm communities and individuals' lives, health, benefits, and welfare; it is a failure of leadership.

In this section we take a closer look at some of the ways that blockchain and common ledger technology are being envisioned and piloted in the social sphere. Public and private blockchains and common ledgers are being piloted and applied for financial inclusion in many sectors and in many ways. Let's look at the vision proponents have of the role of cryptocurrencies and public blockchains in achieving an alternative or supplemental economy. Of particular interest is their potential relationship to poor and vulnerable people who have escaped conflict or have been subject to the ravages of hyperinflation, lack of access to financial intermediaries, censorship, corruption, modern-day enslavement, and failed economic policies which have subjected them to famine, extreme poverty, loss of identity, and despair.

In a December 2018 Time Ideas article, Alex Gladstein, Chief Strategy Officer at the Human Rights Foundation (an NGO which itself has been an early adopter of accepting bitcoin donations), observed that facing "hyperinflation and strict financial controls, Venezuelans are adopting and experimenting with Bitcoin as a censorship-resistant medium of exchange" (Gladstein 2018). He observed that facing long food lines and rationing, no savings, censorship, and foreign wire transfer fees as high as 56%, people have few other options. Gladstein and others see the public blockchains which underlie cryptocurrencies such as Bitcoin enabling a supplemental economy that can offer individuals an alternative to state currencies that are subject, for instance, to hyperinflation and ravaged domestic economies, and a means through which they can use peer-to-peer technology to receive bitcoin from relatives outside the country on their mobile phones. He observes that "In a refugee camp, you might not be able to access a bank, but as long as you can find an Internet connection, you can receive bitcoin, without asking permission and without having to prove your identity" (Gladstein 2018). Gladstein acknowledges the problems of using Bitcoin and public blockchains as a new technology, as it "doesn't offer cutting-edge usability, speed, or privacy." However, looking into the future he sees decentralized technologies, including public blockchains, as a means to provide individuals with freedom and control over their assets. They can be a "countering force" to the prospect of authoritarian regimes using "peer-to-peer digital money to create state-controlled cryptocurrencies like the Petro, which could allow them to more effectively censor transactions, surveil user accounts, and evade sanctions" (Gladstein 2018).

But on the flip side, legal advocates warn that because cryptocurrencies and public blockchains operate outside the sphere of government regulation, they can be used by corrupt organizations and individuals to advance social harm. In addition, they can be subject to volatility and are not immune from breaches of security. Some tech leaders are advocating other alternatives to public blockchains or state-controlled cryptocurrencies to reach the world's unbanked.
For example, in June 2019 the Libra Association (a Swiss-based independent membership organization whose members include multinational companies, venture capitalists, and nonprofits, established to manage the Libra project, currency, and transactions) published a white paper in which it outlined its plan to create "a simple global currency and financial infrastructure that empowers billions of people" (Libra 2019a). The proposal is that the Libra project, its currency, and its transactions will be managed and cryptographically entrusted to the Libra Association. Initially described as "a decentralized, programmable database designed to support a low-volatility cryptocurrency", Libra was originally envisioned as a type of global currency able to rival the major national currencies of the world (Libra 2019b). It was considered especially important for the millions of unbanked people in countries such as India, countries which also have high numbers of users of Facebook and related products (Fig. 5.1). However, Libra has faced opposition. In 2019, Bruno Le Maire, France's finance minister, held that plans for Libra's development in Europe "could not move ahead until concerns over consumer risk and governments' monetary sovereignty were addressed" (Partington 2019).

Blockchain is envisioned for other uses in the social sphere. One is in the implementation of rights (e.g., land title and property rights). Other applications include helping small businesses prove creditworthiness, and more. For example, in 2017, CoinDesk reported that Arjuna Costa, a partner at Omidyar Network, observed that land registries could be a use case by which people may be able to establish property rights and title that they could use as collateral for loans (Hochstein 2017). Another use case Costa envisions is "using blockchains to analyze payment flows (including receipts and invoices) for small businesses, which would then help financial institutions to assess their creditworthiness and therefore lend to them" (Hochstein 2017).

In some cases, it is claimed that the impact of a technology is additive or transformative in and of itself: having access to it levels the playing field of power. An example of this additive theory of technology is the claim that the Internet, per se, democratizes access to information, or equally that technologies that help administer and speed up development aid are a de facto good. However, the situation is more complicated than it might seem. The United Nations World Food Program's (WFP) 2017 trial using blockchain to distribute funds to refugees, combined with eye-scanning hardware made by London-based IrisGuard, is one example of the concerns technologists and human rights advocates have voiced about the way that blockchain technologies are being combined with other forms of data, such as biometric (iris scanning) data. On May 1, 2017, the WFP launched its Building Blocks trial program in Jordan, which used a private, or permissioned, version of the Ethereum blockchain to provide aid in the form of "cryptographically unique coupons", equivalent to a certain amount of local money and redeemable at stores in five refugee camps (Juskalian 2018). To redeem the coupons, people would have their eyes scanned, and their personal biometric data would allow them to access the funds through the coupons (del Castillo 2017).
IrisGuard's iris-scanning technology was already being used for identity verification of more than 500,000 people receiving traditional aid (del Castillo 2017). Using the same biometric data system in this program was therefore seen as simply a new application of an existing one (del Castillo 2017). Yet social change advocates voiced their concerns, particularly with respect to who controls personal data on a private blockchain and the hazards associated with the bulk collection of identifying information. Zara Rahman, a data and technology researcher at the nonprofit, Berlin-based Engine Room, observed, "information and biometrics has historically been a disaster for people on the run. Think of the Holocaust, or the more recent ethnic cleansing of Rohingya in Myanmar" (Juskalian 2018). From a sociotechnical implementation perspective, refugees as a group must submit to their biometric identifying information being captured by the private blockchain system, a system which uses their data in ways they currently have no control or say over.

Russ Juskalian, in the April 2018 MIT Technology Review article "Inside the Jordan refugee camp that runs on blockchain", observed that unlike a public blockchain, where "anyone can join the network and validate transactions…On a permissioned blockchain, a central authority decides who can participate" (Juskalian 2018). He notes that the public blockchain consensus system "makes it difficult for any one person or agency to tamper with or forge transactions, but transaction fees tend to add up" (Juskalian 2018). The benefit of the private blockchain is that the WFP "can process transactions faster and more cheaply. The downside is that since the WFP has control over who joins its network, it also has the power to rewrite transaction histories. Instead of cutting the banks out of the equation, it has essentially become one" (Juskalian 2018). Juskalian also raises the question of whether, under such a system of biometric identification, refugees will gain ownership of their own "digital identification" or whether such systems "simply become an easier way for corporations and states to control people's digital existence" (Juskalian 2018). Further, others question whether forms of identification other than biometric data, such as passcodes or numbers, could accomplish the identification task while protecting individuals' personal information.

Technology that decreases ownership over one's digital identity exemplifies, for some, the tilting of democratic control and access to information even further away from communities, and it further serves to amplify the status quo. In this example, the decision to use biometric data was motivated by the success of the existing use of the technology for the disbursement of traditional aid. But since individuals currently do not have complete control over their personal data and digital identification through the WFP system, does the technology risk amplifying existing inequalities between users' individual control over their own biometric data and institutional collection, control, and private consensus? These are important questions for Deliberate Leaders to consider, particularly in light of the complexity of sociotechnical implementation. Examples that highlight problematic aspects of blockchain technology when it comes to social impact are a reason to apply Deliberate Leadership, not to dismiss the technologies entirely.
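The distinction Juskalian draws between public and permissioned chains can be made concrete with a minimal sketch. The Python below is illustrative only: it is not the WFP's Building Blocks system, Ethereum, or any real library, and all class, validator, and coupon names are hypothetical. It shows the single design choice that matters here, namely who is allowed to take part in validating a new record.

```python
# Illustrative sketch only; hypothetical classes, not the WFP's or Ethereum's actual code.
import hashlib
import json


def block_hash(payload: dict, prev_hash: str) -> str:
    """Chain each record to the previous one by hashing payload + previous hash."""
    data = prev_hash + json.dumps(payload, sort_keys=True)
    return hashlib.sha256(data.encode()).hexdigest()


class PublicLedger:
    """Permissionless: any validator who shows up may vote; no central gatekeeper."""

    def __init__(self):
        self.chain = []  # list of (payload, hash) pairs

    def eligible(self, validator_id: str) -> bool:
        return True  # anyone can join the network and help validate transactions

    def append(self, payload: dict, votes: dict) -> bool:
        counted = [vote for vid, vote in votes.items() if self.eligible(vid)]
        if counted and sum(counted) > len(counted) / 2:  # simple majority as a stand-in for consensus
            prev = self.chain[-1][1] if self.chain else "genesis"
            self.chain.append((payload, block_hash(payload, prev)))
            return True
        return False


class PermissionedLedger(PublicLedger):
    """Permissioned: a central authority decides who can participate in validation."""

    def __init__(self, admitted: set):
        super().__init__()
        self.admitted = admitted

    def eligible(self, validator_id: str) -> bool:
        return validator_id in self.admitted  # gatekeeping concentrates control in one party


# Example: the same (hypothetical) coupon transaction under the two governance models.
tx = {"coupon_id": "c-001", "amount_local": 25, "store": "camp-store-3"}

public = PublicLedger()
public.append(tx, {"anyone-1": True, "anyone-2": True, "anyone-3": False})

permissioned = PermissionedLedger(admitted={"aid-agency-node"})
permissioned.append(tx, {"aid-agency-node": True, "outside-node": True})  # outside vote is ignored
```

The sketch also makes the governance trade-off visible: in the permissioned case a single admitted party decides what gets recorded, which is exactly why Juskalian notes that the controlling organization "has the power to rewrite transaction histories."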
An example where blockchain technology is increasing people's power is the administration of individual property rights. For instance, DeSoto, Inc., an independent third-party venture with offices in the United States and Peru, led by economist Hernando de Soto and Overstock.com CEO and technology entrepreneur Patrick Byrne, is using blockchain technology "to create the world's first 'Global Property Rights Book' providing businesses, governments, and individuals better information about who has enforceable property rights over assets worldwide" (DeSoto 2019). DeSoto, Inc. stresses that its approach differs from that of other blockchain land rights registries and companies such as Factom because it has gone to local communities to collect and input existing land rights that are not formally recorded in a national registry but rather are held by local individuals under a series of local formats (Allison 2018).

At the 2018 Skoll World Forum at the University of Oxford, DeSoto, Inc. COO Julie Smith highlighted the sociotechnical problems that DeSoto, Inc. is trying to address. First, turning property rights into formally recorded titles can be extremely costly in many parts of the world. She observed, "in Tanzania obtaining title in land takes 19 steps, 380 days, $1,443 in costs, which is a small fortune in a place where per capita GDP is $877 so who'd go through this process?" (Skoll World Forum 2018). Second, Smith noted that when people do not have their ownership rights recorded in national registries, it can place limitations on their lives. Without formal ownership, individuals may be unable to register a business, access the legal system, or engage in other basic human development functions (Skoll World Forum 2018). Smith also pointed out that when property rights are not documented, they can be subject to seizure by armed terrorist groups who take over a region's property (Skoll World Forum 2018). Additionally, environmental damage can occur with a lack of accountability when property is not properly recorded (Skoll World Forum 2018). Finally, particularly in post-conflict regions and/or nations which have experienced forced migration, with refugees entering neighboring regions and setting up refugee camps, informal rights to land and property can be challenged. Smith points to the benefits DeSoto sees in surfacing informal ledgers: a decrease in terrorists' power over informal property rights, capital unleashed in a widely distributed manner, and entrepreneurs with formally recorded property rights able to go to work lifting the population out of poverty (Skoll World Forum 2018).

Technology can scale to reach millions of people at one time. Humanity United (HU) is using blockchain and other technologies in a different way. A key component of the Omidyar Group, Humanity United is a private philanthropic foundation founded by Pam Omidyar (pfc social impact advisors October 2018, p. 9). Pam and Pierre Omidyar (the founder of eBay) furnished the motivating ideas behind Humanity United: human dignity, the unity of humankind, and the need for collaborative, cooperative action in the face of Wicked Problems. Applying a mix of philanthropic and investment capital, Humanity United has gone all-in to make progress against forced labor. Its story is one of evolution, moving from grants focused on education, field building, and advocacy to an approach more focused on using and applying technology in supply chains.
Modern slavery pervades almost all countries. In 2018, the Global Slavery Index found instances of human bondage in nearly all 167 nations it studied, with North Korea, Eritrea, Burundi, the Central African Republic, and Afghanistan identified as the worst offenders (as cited in pfc social impact advisors October 2018, p. 5). Girls and women, many of whom are caught up in human trafficking, make up 71% of victims (pfc social impact advisors October 2018, p. 5). Supply chains, which involve a significant proportion of the estimated 16 million forced laborers (pfc social impact advisors October 2018, p. 7), have become a key focus of impact investors, including Humanity United's efforts. As supply chains for the global economy expand, with factories and other facilities located many time zones away from home offices, it is difficult to know the conditions under which workers are creating a firm's products. In some cases, management and shareholders are genuinely uninformed about the human cost involved; other times they simply choose not to find out. In either case, it turns out to be a major problem for firms when the world finally discovers what is happening through a disaster or an investigative exposé.

Humanity United set up the Working Capital Investment Fund (Working Capital) to address the lack of technical tools that give corporations visibility into, and accountability for, their own supply chains. It invests in the development of third-party solutions: technological tools to help firms monitor and improve their own performance in combating forced labor. Publicly launched in January 2018, as of June 2020 Working Capital has eight partners: the original six (The Walmart Foundation, C&A Foundation, Stardust Equity, Open Society Foundations (Soros Economic Development Fund), The Ray and Dagmar Dolby Family Fund, and The Walt Disney Corporation) plus two more limited partners, the Children's Investment Fund Foundation (CIFF) and Zalando (pfc social impact advisors October 2018, p. 18). With the UK Department for International Development contributing 2.4 million British pounds (about US$3.5 million), Working Capital is fully capitalized at US$23 million. As of June 2020, Working Capital has invested in nine technology companies (Provenance 2020).

One of its earliest investments, Provenance, uses blockchain technology that enables "brands, suppliers, and stakeholders to trace products along their journey from producer to consumer," accompanied by verified data on labor conditions that is attached to the blockchain ledger. "The tool also provides workers with a secure and confidential platform for reporting on working conditions" (pfc social impact advisors October 2018, p. 22). Prior to the investment, Provenance received an $85,000 grant from Humanity United that enabled it to pilot the tech platform, focusing on Indonesian fisheries. Since then, Provenance's blockchain technology has been used by over 200 companies (Provenance 2020). A simplified sketch of how such traceability records can be structured appears below.

In terms of public blockchains, cryptocurrencies such as Bitcoin are increasingly being donated by a new group of donors who have made recent wealth in cryptocurrencies and see that, in some jurisdictions, they can donate this property to nonprofit organizations such as the Human Rights Foundation without incurring capital gains tax and other tax consequences. This has opened a new door for NGOs and charities receiving money to fund their social missions. Yet tracking the source of the wealth is not always possible.
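To make the traceability approach described above more concrete, the following minimal sketch shows how each step of a product's journey, together with a third-party labor-conditions attestation, can be chained into a tamper-evident record. It is a hypothetical illustration under our own assumptions, not Provenance's actual data model or API; every field, actor, and place name is invented.

```python
# Illustrative only: hypothetical data model, not Provenance's actual schema or API.
import hashlib
import json
from dataclasses import dataclass, field


@dataclass
class JourneyStep:
    product_id: str
    actor: str               # e.g., fishing vessel, processor, exporter, retailer
    location: str
    labor_attestation: dict  # e.g., {"auditor": "third-party-x", "forced_labor_found": False}
    prev_hash: str = "genesis"
    record_hash: str = field(init=False)

    def __post_init__(self):
        body = json.dumps(
            {"product": self.product_id, "actor": self.actor, "location": self.location,
             "attestation": self.labor_attestation, "prev": self.prev_hash},
            sort_keys=True)
        self.record_hash = hashlib.sha256(body.encode()).hexdigest()


def trace(product_id: str, steps: list) -> list:
    """Chain steps so that altering any earlier record changes every later hash."""
    chained, prev = [], "genesis"
    for actor, location, attestation in steps:
        step = JourneyStep(product_id, actor, location, attestation, prev_hash=prev)
        chained.append(step)
        prev = step.record_hash
    return chained


# A hypothetical fishery-to-retail journey, each step carrying an auditor's attestation.
journey = trace("tuna-lot-42", [
    ("vessel-7", "Banda Sea", {"auditor": "ngo-a", "forced_labor_found": False}),
    ("processor-2", "Bitung", {"auditor": "ngo-a", "forced_labor_found": False}),
    ("retailer-1", "London", {"auditor": "ngo-b", "forced_labor_found": False}),
])
```

The point of hash-chaining the steps is that a brand, auditor, or consumer can detect after the fact whether any earlier record was altered; it does not, by itself, guarantee that the attestations were truthful when they were written, which is why the surrounding governance questions in this chapter still matter.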
Examples such as the above are encouraging for social investors interested in the role of Big Technology. However, we need to be cautious about making generalizations based on limited evidence. Perhaps one of the greatest myths of sociotechnical implementation is what is known as the fallacy of composition, "The error of assuming that what is true of a member of a group is true for the group as a whole" (Oxford University Press 2019). Applied to the sociotechnical space, for example, people may assume that if a part of the technology is secure, then the whole technology system will be secure; or that if each member of the technological implementation team is moral, then the group as a whole will produce moral outputs. As demonstrated here, this fallacy is particularly salient in the sociotechnical implementation of blockchain technology.

A third challenge, which has already raised its head in the previous two, is the privacy of individual information. The right to privacy is referenced in the legal traditions of approximately 150 jurisdictions (constituteproject.org 2017). It is a right that our research found is often challenged in the implementation of commercial and sociotechnical solutions. Why is this? Privacy rights are designed to limit individuals, organizations, and governments from encroaching on individuals' private enjoyment of rights. However, the rise of technologies such as AI, the Internet, blockchain, and social media has been so rapid in part because of the amount of data that people are increasingly making public. This has increased the ability of individuals, governments, and companies to collect and harvest data on individuals worldwide. As more data becomes available, integrated, and accessible, the threat of compromising privacy rights grows (often in ways that system developers had not intended or anticipated). Privacy rights are often most challenged with new technologies (or, in the case of social impact investing, new applications of commercial technology in the social sphere) because the focus of technology leaders and sociotechnical implementers is generally not on privacy but on using technology to address other problems in the transactional environment involving scale, data capture, speed, integration, and transactions.

Historically, "privacy first" has not been the siren song of most technology implementers. Instead the focus has been on growth and the push to make more things public, including connections between friends (for example, Facebook), businesses, individuals, and partners (for example, LinkedIn), and a rush to publish information online that can now be used and mined by companies, individuals, and nation states. For instance, Facebook only recently underwent a "Pivot to Privacy" after large breaches of privacy came to light (Lapowsky and Thompson 2019). Beyond free services that people can choose to use or not use, other issues of privacy related to sociotechnical implementation concern the question: can people truly opt out of technologies which they never opted into but which intrude on their individual privacy? Technologies can be used for tracking and surveillance by governments, individuals, and organizations in ways that violate individual privacy rights.
The European Union has legislation that strictly regulates what companies can do without the consumer's explicit consent (the requirement is that users must "opt in" rather than "opt out"), but in much of the world, in countries rich and poor, this is not the case. Many consumers who use search engines, e-commerce, social media, email, and other technologies do not understand the ways that companies and governments are using their data, their voice, until after a data privacy breach is revealed. Although legal disclaimers and opt-out provisions may be in place, many users are driven by the need to connect to friends, colleagues, or businesses by using technology. They find it difficult, if not impossible, to maintain individual privacy rights without feeling they are being cut adrift from technologies that seem essential to life in the modern world. The challenge of protecting individual privacy rights is even greater in nation states with weak rule of law, where privacy rights are not well protected, and in developing nations, where people may have very limited resources, education, and ability to opt out of technologies that are introduced.

Leaders in the social impact space must be particularly mindful of privacy rights when they apply sociotechnical solutions to address Wicked Problems. Yet we often find that leaders are not mindful of individual privacy rights in their implementation of sociotechnical solutions. The right to privacy is too often considered by leaders to be either a right that stands in the way of growth or progress, or an individual right that can be violated in the interests of national growth or, in the context of social impact investment, a larger social objective of bringing about social change. Many projects place progress over privacy. Many individuals whose privacy rights are violated either are not aware of the violation until after damage has occurred, or do not have the knowledge or resources to effectively seek recourse or remedy. As technologies are combined to hasten the sharing of data between and among organizations and jurisdictions, the problem of privacy protection becomes even more glaring, particularly as technology and data span jurisdictions that do and do not recognize the individual right to privacy.

Systems that combine or utilize technologies such as blockchain, social media, fintech and accounting systems, Internet/e-commerce technologies, and AI on the one hand offer the promise of doing good for communities and solving Wicked Problems; on the other hand, they can also be used for surveillance, as instruments of oppression, violence, and abuse, and to influence the outcomes of democratic elections and voting. They can even be used to curtail human rights if controlled by authorities in states with weak human rights protections or by cyber adversaries. Often, technical solutions framed in the social context and designed to solve a particular problem in the transactional environment can be used in ways other than investors and leaders intended, particularly when it comes to the right of privacy. The chorus of "we didn't know what we didn't know" seems to be an all too common theme among the leaders we interviewed. This increases the risk that people can do harm to others with little recourse for the person harmed. This is not an outcome that benefits society.
In the wake of events such as the 2018 revelation of the Facebook-Cambridge Analytica data breach, in which the personal data of millions of Facebook users was harvested without their consent, there has been talk of a "growing backlash among the public and government officials against big technology companies over privacy and transparency concerns, dubbed the tech-lash" (Murphy 2019). Yet, while concerns over privacy rights are both observable and widespread, when leaders of sociotechnical projects were asked about privacy rights, their thinking seemed to be that the goodness of their social mission outweighed the potential harm of violations of privacy. For example, we asked technology entrepreneurs to ask themselves the question: "it's five years or ten years down the road [from now] … and you're testifying before Congress [about a breach of privacy rights] and you were too idealistic [about the technological solution that you put in place], what do you learn from this emerging field? What are you experiencing now that you are afraid of? … What's getting in your way?" We found that when entrepreneurs paused to answer this question, they identified either aspects of their own technology or, more commonly, another person's technology that could foreseeably cause harm to individual privacy rights. Yet their responses overwhelmingly seemed to suggest that the foreseeable harm to privacy was outweighed by the need to act to fill a void in the available data or to "bring change."

Such thinking seems to be common. The push to collect more data and to make it public, or to structure, index, integrate, or connect it, is too often a far bigger economic, individual, and societal driver than privacy rights. And even though leaders of sociotechnical projects we interviewed touted the societal benefits of their solutions, few addressed the issue of privacy rights. So, what do we learn from this? There seems to be a general trend toward justifying the foreseeable violation of privacy rights with a broader organizational or individual mission. This is of concern to advocates of privacy rights because once privacy rights are violated, there is currently very little that can be done by those harmed in terms of restitution or remedy. Although, as noted, the European Union has introduced legislation to protect individual privacy, many other countries are not regulated in this way. Social impact leaders work at a different scale, but they nonetheless envision large-scale projects that cut across jurisdictions, and the question of how privacy rights will be protected under these projects remains open.

Another privacy issue that challenges the implementation of commercial solutions and their sociotechnical applications involves the sharing of data between and among partners and third-party developers. This problem, which occurs in the commercial sector, may have similar effects on privacy rights in the social sector, insofar as third parties may use data captured through sociotechnical initiatives in unintended ways. How we safeguard the right of privacy and personal data is crucial. Individuals who favor open access argue that restricting public access in the name of individual privacy can also invite corruption: it can enable those who wish to keep illegal transactions private, and it can curtail public review of the data.
Clearly, privacy remains a crucial issue with many far-reaching implications that leaders must consider at the outset of their implementation of technology.

Sociotechnical solutions confront us with new opportunities and risks. We have seen in this chapter that there are several features of technology (both positive and negative) that are replicated when a commercial technology is repositioned in a social setting. These include the ability to speed up transactions, the ability to store and combine data in ways that can provide users with new insights, and new ways to communicate. Yet we have also seen how vital it is that those implementing the technology take a broader view of it, so that they protect the privacy, security, property, and human rights of users. They need to ensure that vulnerable communities, who may not have a role in the development of the technology, are given a voice that is heard at the top. Technology leaders must also take responsibility for their platforms and solutions so that they are used for good and not for ill. Social change and technology experts need to know more about each other's fields. In particular, they need to know that different forms of capital and different forms of shareholder control of technology companies can accelerate certain approaches to sociotechnical solutions across these fields, approaches that may bring about diverse sets of outcomes, both positive and negative, when the technology is implemented and grows to include a large global user base.

We have examined how crucial the leadership practice of taking a broader view of technology is in influencing social change. And we have discussed the myths and realities of sociotechnical implementation. Technology's implementation in the social sphere requires leaders who seek out many voices before they act, who engage the communities to be affected by their technologies, and who are candid about what is and is not working with their technology. This applies equally to situations where technology is used for a specific social purpose and to ones where the technology is held to be the improvement and its unintended or unanticipated side effects are ignored (adaptive technology). Leaders involved in technical implementation must have the foresight to understand the role of their technology in the broader social sphere and the good and the bad that can result from the sociotechnical strategies and choices that they make.

References

Allison (2018). Hernando de Soto and Patrick Byrne's Mission to Put the Developing World's Property Rights on a Blockchain. International Business Times UK.
del Castillo (2017). A Branch of the UN Just Launched Its First Large-Scale Ethereum Test. CoinDesk.
Gladstein (2018). Why Bitcoin Matters for Freedom. Time.
Govindarajan et al. (2018). Should Dual-Class Shares Be Banned?
Harding (2016). Google Has 7 Products With 1 Billion Users. Popular Science.
Hendriks (2019). How Digital Financial Services Are Empowering Women. IMF Finance & Development Magazine.
Hochstein (2017). Blockchain for Inclusion? Gates Foundation Strikes Tepid Tone at Money2020. CoinDesk.
Ingram (2016). Mark Zuckerberg's Absolute Control Over Facebook Is Not New. Fortune.
Juskalian (2018). Inside the Jordan Refugee Camp That Runs on Blockchain. MIT Technology Review.
Kafka (2013). One Thing Twitter Won't Have When It Goes Public: Two Classes of Shares.
Khan (2019). Author Interview with Roshaan Khan.
Lapowsky and Thompson (2019). Facebook's Pivot to Privacy Is Missing Something Crucial. Wired.
Libra (2019a). Libra White Paper. Libra Association.
Libra (2019b). The Libra Blockchain.
McKinlay (2018). Blockchain: Background, Challenges, and Legal Issues. DLA Piper.
Murphy (2019). Mark Zuckerberg Calls for More Regulation of Big Tech.
Oxford University Press (2019). Fallacy of Composition | Definition of Fallacy of Composition in English by Oxford Dictionaries.
Partington (2019). France to Block Facebook's Libra Cryptocurrency in Europe. The Guardian.
pfc social impact advisors (October 2018). Launching the Working Capital Fund: A Case Study of Humanity United.
The Blockchain, Artificial Intelligence and the Future of Impact Finance.
Statista (2020c). Number of Monthly Active Twitter Users Worldwide from 1st Quarter 2010 to 1st Quarter 2019.
Tan and Robertson (2017). Why Investors Are Fretting Over Dual-Class Shares.
World Bank (2018). Global Findex Database.
Zuckerberg (2018). Hearing Before the United States Senate Committee on the Judiciary and the United States Senate Committee on Commerce, Science, and Transportation.