A number of respondents to this canvassing about the likely future of social and civic innovation shared concerns. Some said that technology causes more problems than it solves. Some said it is likely that emerging worries over the impact of digital life will be at least somewhat mitigated as humans adapt. Some said it is possible that any remedies may create a new set of challenges. Others said humans’ uses and abuses of digital technologies are causing societal harms that are not likely to be overcome.
The following comments were selected from among all responses, regardless of an expert’s answer to this canvassing’s main question about the impact of people’s uses of technology. Some of these remarks of concern happen to also include comments about innovations that may emerge. Concerns are organized under four subthemes: Something is rotten in the state of technology; technology use often disconnects or hollows out a community; society needs to catch up and better address the threats and opportunities of tech; and despite current trends, there is reason to hope for better days.
The chapter begins with some overview insights:
Larry Masinter, internet pioneer, formerly with Adobe, AT&T Labs and Xerox PARC, who helped create internet and web standards with IETF and W3C, said, “Technology and social innovation intended to overcome the negatives of the digital age will likely cause additional negative consequences. Examples include: the decentralized web, end-to-end encryption, AI and machine learning, social media.”
James Mickens, associate professor of computer science at Harvard University, formerly with Microsoft, commented, “Technology will obviously result in ‘civic innovation.’ The real question is whether the ‘innovation’ will result in better societal outcomes. For example, the gig economy is enabled by technology; technology finds buyers for workers and their services. However, given the choice between an economy with many gig workers and an economy with an equivalent number of traditional middle-class jobs, I think that most people would prefer the latter.”
Michael Aisenberg, chair, ABA Information Security Committee, wrote, “Misappreciation of limits and genesis of, e.g., AI/machine learning will produce widely disparate results in deployment of tech innovations. Some will be dramatically beneficial; some may enable abuse of law enforcement, economic systems and other fundamental civic institutions and lead to exacerbation of gaps between tech controllers/users and underserved/under- or mis-skilled populations (‘digital divide’) in what may be a significant (embed limitations on career/economic advancement) or even life-threatening (de facto health care or health procedure rationing) manner.”
Peter Lunenfeld, a professor of design, media arts and digital humanities at the University of California, Los Angeles, and author of “Tales of the Computer as Culture Machine,” predicted, “We will use technology to solve the problems the use of technology creates, but the new fixes will bring new issues. Every design solution creates a new design problem, and so it is with the ways we have built our global networks. Highly technological societies have to be iterative if they hope to compete, and I think that societies that have experienced democracy will move to curb the slide to authoritarianism that social media has accelerated. Those curbs will bring about their own unintended consequences, however, which will start the cycle anew.”
Yaakov J. Stein, chief technology officer of RAD Data Communications, based in Israel, responded, “The problem with AI and machine learning is not the sci-fi scenario of AI taking over the world and not needing inferior humans. The problem is that we are becoming more and more dependent on machines and hence more susceptible to bugs and system failures. This is hardly a new phenomenon – once a major part of schooling was devoted to, e.g., penmanship and mental arithmetic, which have been superseded by technical means. But with the tremendous growth in the amount of information, education is more focused on how to retrieve required information rather than remembering things, resulting not only in less actual storage but less depth of knowledge and the lack of ability to make connections between disparate bits of information, which is the basis of creativity. However, in the past humankind has always developed a more-advanced technology to overcome limitations of whatever technology was current, and there is no reason to believe that it will be different this time.”
A vice president for research and economic development wrote, “The problems we see now are caused by technology, and any new technological fixes we create will inevitably cause NEW social and political problems. Attempts to police the web will cause freedom of speech conflicts, for example.”
Something is rotten in the state of technology
A large share of these experts say the leading concerns about today’s technology platforms are the ways bad actors exploit them to spread misinformation and the privacy issues arising out of the platforms’ underlying business model.
Misinformation – pervasive, potent, problematic
Numerous experts described misinformation and fake news as a serious issue in digital spaces. They expressed concern over how users will sort through fact and fiction in the coming decade.
Stephanie Fierman, partner, Futureproof Strategies, said, “I believe technology will meaningfully accelerate social and civic innovation. It’s cheap, fast and able to reach huge audiences. But as long as false information is enabled by very large websites, such social and civic innovators will be shadow boxing with people, governments, organizations purposely countering truthful content with lies.”
Sam Lehman-Wilzig, a professor of communications at Bar-Ilan University specializing in Israeli politics and the impact of technological evolution, wrote, “The biggest advance will be the use of artificial intelligence to fight disinformation, deepfakes and the like. There will be an AI ‘arms race’ between those spreading disinformation and those fighting/preventing it. Overall, I see the latter gaining the upper hand.”
Greg Shatan, a lawyer with Moses & Singer LLP and self-described “internet governance wonk,” predicted, “I see success, enabled by technology, as likely. I think it will take technology to make technology more useful and more meaningful. Many of us pride ourselves on having a ‘BS-meter,’ where we believe we can tell honestly delivered information from fake news and disinformation. The instinctual BS-meter is not enough. The next version of the ‘BS-meter’ will need to be technologically based. The tricks of misinformation have far outstripped the ability of people to reliably tell whether they are receiving BS or not – not to mention that it requires a constant state of vigilance that’s exhausting to maintain. I think that the ability and usefulness of the web to enable positive grassroots civic communication will be harnessed, moving beyond mailing lists and fairly static one-way websites. Could there be ‘Slack for Community Self-Governance?’ If not that platform, perhaps something new and aimed specifically at these tasks and needs.”
Oscar Gandy, a professor emeritus of communication at the University of Pennsylvania, said, “Corporate actors will make use of technology to weaken the possibility for improvements in social and civic relationships. I am particularly concerned about the use of technology in the communications realm in order to increase the power of strategic or manipulative communications to shape the engagement of members of the public with key actors within a variety of governance relationships.”
An expert in the ethics of autonomous systems based in Europe responded, “Fake news is more and more used to manipulate a person’s opinion. This war of information is becoming so important that it can influence democracy and the opinion of people before the vote in an election for instance. Some AI tools can be developed to automatically recognize fake news, but such tools can be used in turn in the same manner to enhance the belief in some false information.”
A research leader for a U.S. federal agency wrote, “At this point in time, I don’t know how we will reduce the spread of misinformation (unknowing/individual-level) and disinformation (nefarious/group-level), but I hope that we can.”
A retired information science professional commented, “Dream on, if you think that you can equate positive change with everybody yelling and those with the most clout (i.e., power and money) using their power to see their agendas succeed. Minority views will always be that, a minority. At present and in the near future the elites manipulate and control.”
A research scientist for a major technology company whose expertise is technology design said, “We have already begun to see increased protections around personal privacy. At present, it is less clear how we might avoid the deliberate misuse of news or news-like content to manipulate political opinions or outcomes, but this does not seem impossible. The trick will be avoiding government censorship and maintaining a rich, vigorous exchange of opinions.”
Privacy issues will continue to be a hot button topic
Multiple experts see a growing need for privacy to be addressed in online spaces.
Ayden Férdeline, technology policy fellow at the Mozilla Foundation, responded, “Imagine if everyone on our planet was naked, without any clear options for obtaining privacy technology (clothing). It would not make sense to ask people what they’d pay or trade to get this technology. This is a ‘build it and they will come’ kind of scenario. We’re now on the verge, as a society, of appropriately recognizing the need to respect privacy in our Web 2.0 world, and we are designing tools and rules accordingly. Back in 1992, had you asked people if they’d want a free and open internet, or a graphical browser with a walled garden of content, most would have said they prefer AOL. What society needed was not AOL but something different. We are in a similar situation now with privacy; we’re finally starting to grasp its necessity and importance.”
Graham Norris, a business psychologist with expertise in the future of work, said, “Privacy no longer exists, and yet the concept of privacy still dominates social-policy debates. The real issue is autonomy of the individual. I should own my digital identity, the online expression of myself, not the corporations and governments that collect my interactions in order to channel my behaviour. Approaches to questions of ownership of digital identity cannot shift until the realization occurs that autonomy is the central question, not privacy. Nothing currently visible suggests that shift will take place.”
Eduardo Villanueva-Mansilla, an associate professor of communications at Pontificia Universidad Catolica, Peru, and editor of the Journal of Community Informatics, wrote, “I’m trying to be optimistic, by leaving some room to innovative initiatives from civic society actors. However, I don’t see this as necessarily happening; the pressure from global firms will probably be too much to deal with.”
An international policy adviser on the internet and development based in Africa commented, “Technology is creating and will continue to evolve and increase the impact of social and civic innovation. With technology we will see new accountability tools and platforms to raise voices to counter societal ills, be it in leadership, business and other faculties. We must however be careful so that these innovations themselves are not used to negatively impact end users, such issues like privacy and use of data must be taken on in a way that users are protected and not exposed to cybercrime and data breaches that so often occur now.”
Jamie Grady, a business leader, wrote, “As technology companies become more scrutinized by the media and government, changes – particularly in privacy rights – will change. People will learn of these changes through social media as they do now.”
Technology use often disconnects or hollows out community
Some respondents commented on the rising loss of community, the impact of digital distancing and the need for more organic, in-person, human-to-human connection.
Jonathan Grudin, principal researcher at Microsoft, commented, “Social and civic activity will continue to change in response to technology use, but will it change its trajectory? Realignments following the Industrial Revolution resulted from the formation of new face-to-face communities, including union chapters, community service groups such as Rotary Club and League of Women Voters, church groups, bridge clubs, bowling leagues and so on. Our species is designed to thrive in modest-sized collocated communities, where everyone plays a valued part. Most primates become vulnerable and anxious when not surrounded by their band or troop. Digital media are eroding a sense of community everywhere we look. Can our fundamental human need for close community be restored or will we become more isolated, anxious and susceptible to manipulation?”
Rebecca Theobald, an assistant research professor at the University of Colorado, Colorado Springs, said, “Technology seems to be driving people apart, which would lead to fewer connections in society.”
The program director of a university-based informatics institute said, “There is still a widening gap between rural and urban as well as digital ‘haves’ and ‘have nots.’ As well, the ability to interact in a forum in which all members of society have a voice is diminishing as those with technology move faster in the digital forums than the non-tech segment of the population that use non-digital discourse (interpersonal). The idea of social fabric in a neighborhood and neighborly interactions is diminishing. Most people want innovation – it is the speed of change that creates divisions.”
An infrastructure architect and internet pioneer wrote, “The kind of social innovation required to resolve the problems caused by our current technologies relies on a movement back toward individual responsibility and a specific willingness to engage in community. As both of these work against the aims of the corporate and political elite as they exist today, there is little likelihood these kinds of social innovations are going to take place. The family and church, for instance, which must be the core institutions in any rebuilding of a culture that can teach the kind of personal responsibility required, were both hollowed out in the last few decades. The remaining outward structures are being destroyed. There is little hope either families or churches will recover without a major societal event of some sort, and it will likely take at least one generation for them to rebuild. The church could take on the task of helping rebuild families, but it is too captured in attempts to grow ever larger, and consume or ape our strongly individualistic culture, rather than standing against it.”
Angela Campbell, a professor of law and co-director of the Institute for Public Representation at Georgetown University, responded, “I think there will be efforts to address the social and civic impacts of technology but they may not be sufficient. In particular, I am concerned about the impact of overuse or over-reliance on technology with respect to children and teens. I am concerned about the safety of children online, not just from predators but from peers (bullying). Overuse may also contribute to physical maladies such as obesity, bad posture, eye problems, ADHD, insufficient sleep and even addiction. While technology can help to educate older children (not preschoolers who need to interact with humans and objects), it needs to be selected [and] used carefully and should not subject children to commercialism or invade their privacy. My other major concerns are job loss and discrimination. It seems inevitable that many jobs will be eliminated by technology, and while technologies may generate new jobs, I suspect there will be fewer jobs, and those that remain will require certain skills. It will be important, and difficult, to ensure that everyone is able to have employment and to make enough to live at a reasonable level. As competition for jobs increases, I am also worried about how big data allows hidden discrimination in education, health and employment.”
A researcher based in North America predicted a reining in of the digital in favor of the personal: “Between email and phones, I think we’re close to peak screen time, a waste of time, and it’s ruining our eyes. Just as we have forsaken our landlines, stopped writing letters, don’t answer our cellphones, a concept of an average daily digital budget will develop, just as we have a concept of average daily caloric intake. We’ll have warning labels that rate content against recommended daily allowances of different types of content that have been tested to be good for our mental health and socialization, moderately good, bad, and awful – the bacon of digital media. And people who engage too much will be in rehab, denied child custody and unemployable. Communities, residences and vacation areas will promote digital-free, mindfulness zones – just as they have quiet cars on the train.”
Society needs to catch up and better address the threats and opportunities of tech
Some of these experts said that the accelerating technological change of the digital age is making it difficult for humans to keep up and respond to emerging challenges.
A chair of political science based in the American South commented, “Technology always creates two new problems for every one it solves. At some point, humans’ cognitive and cooperative capacities – largely hard-wired into their brains by millennia of evolution – can’t keep up. Human technology probably overran human coping mechanisms sometime in the later 19th century. The rest is history.”
Larry Rosen, a professor emeritus of psychology at California State University, Dominguez Hills, known as an international expert on the psychology of technology, wrote, “I would like to believe that we, as citizens, will aid in innovation. Smart people are already working on many social issues, but the problem is that while society is slow to move, tech moves at lightning speed. I worry that solutions will come after the tech has either been integrated or rejected.”
Louisa Heinrich, a futurist and consultant expert in data and the Internet of Things, said, “There is a gap between the rate at which technology develops and the rate at which society develops. We need to take care not to fall into that gap. I hope we will see a shift in governance toward framework-based regulation, which will help mitigate the gap between the pace of change in technology and that in government. At the very least, we need to understand the ways in which technology can extend or undermine the rules and guidelines we set for our businesses, workplaces, public spaces and interactions. To name just one common example, recruitment professionals routinely turn to Facebook as a source of information on prospective employees. This arguably violates a number of regulations designed to protect people from being denied work based on personal details not relevant to that work. How do we unravel this conundrum, bearing in mind that there will always be another social network, another digital source to mine for information about people? Taken from another angle, there is a significant gap between what users understand about certain bits of technology and the risks they take using them. How can we educate people about these risks in a way that encourages participation and co-creation, rather than passivity? As the so-called Gen Z comes of age, we will see a whole generation of young adults who are politically engaged at a level not seen in several generations, who are also native users of technology tools. This could bring about a positive revolution in the way technology is used to facilitate civic engagement and mutually empower and assist citizens and government. 
“Technology provides us with powerful tools that can help us advance socially and civically, but these tools need to be thoughtfully and carefully put to use – when we encode barriers and biases into the applications that people need to use in daily life, whether intentionally or no, we may exclude whole segments of society from experiencing positive outcomes. We are living through a time of rapid and radical change – as always, the early stages feel uncomfortable and chaotic. But we can already see the same tools that have been used to mislead citizens being used to educate, organise, motivate and empower them. What’s needed is a collective desire to prioritise and incentivise this. New Zealand is leading the way with the world’s first ‘well-being’ budget.”
Bulbul Gupta, founding adviser at Socos Labs, a think tank designing artificial intelligence to maximize human potential, responded, “Until government policies, regulators, can keep up with the speed of technology and AI, there is an inherent imbalance of power between technology’s potential to contribute to social and civic innovation and its execution in being used this way. If technology and AI can make decisions about people in milliseconds that can prevent their full social or civic engagement, the incentive structures to be used toward mitigating the problems of the digital age cannot then be solved by technology.”
Gene Policinski, a journalist and First Amendment law expert at the Freedom Forum Institute, observed, “We forget how new the ‘tech revolution’ really is. As we move forward in the next decade, the public’s awareness of the possibilities inherent in social and civic innovation, the creativity of the tech world working with the public sector and public acceptance of new methods of participation in democratic processes will begin to drown out and eventually will surpass the initial problems and missteps.”
Gabriel Kahn, former bureau chief for The Wall Street Journal, now a professor of journalism researching innovation economics in emerging media at the University of Southern California, wrote, “We are not facing a ‘Terminator’-like scenario. Nor are we facing a tech-driven social utopia. Humans are catching up and understanding the pernicious impact of technology and how to mitigate it.”
Kathee Brewer, director of content at CANN Media Group, predicted, “Much like society developed solutions to the challenges brought about by the Industrial Revolution, society will find solutions to the challenges of the Digital Revolution. Whether that will happen by 2030 is up for debate. Change occurs much more rapidly in the digital age than it did at the turn of the 20th century, and for society to solve its problems it must catch up to them first. AND people, including self-interested politicians, must be willing to change. Groups like the Mozilla Foundation already are working on solutions to invasions of privacy. That work will continue. The U.S. government probably won’t make any major changes to the digital elections framework until after the 2020 election, but changes will be made. Sadly, those changes probably will result from some nastiness that develops due to voters of all persuasions being unwilling to accept electoral results, whatever the results may be.”
Valerie Bock of VCB Consulting, former Technical Services Lead at Q2 Learning, responded, “I think our cultures are in the process of adapting to the power our technologies wield, and that we will have developed some communal wisdom around how to evaluate new ones. There are some challenges, but because ordinary citizens have become aware that images can be ‘photoshopped’ the awareness that video can be ‘deepfaked’ is more quickly spreading. Cultural norms as well as technologies will continue to evolve to help people to apply more informed critiques to the messages they are given.”
Bach Avezdjanov, a program officer with Columbia University’s Global Freedom of Expression project, said, “Technological development – being driven by the Silicon Valley theory of uncontrolled growth – will continue to outpace civic and social innovation. The latter needs to happen in tandem with technological innovation, but instead plays catch-up. This will not change in the future, unless political will to heavily regulate digital tools is introduced – an unlikely occurrence.”
A computing science professor emeritus from a top U.S. technological university commented, “Social/civic innovation will occur but most likely lag well behind technological innovation. For example, face-recognition technology will spread and be used by businesses at a faster pace than social and legal norms can develop to protect citizens from any negative effects of that technology. This technology will spread quickly, due to its various positives (increased efficiencies, conveniences and generation of profits in the marketplace) while its negatives will most likely not be countered effectively through thoughtful legislation. Past Supreme Court decisions (such as treating corporations as persons, WRT unlimited funding of political candidates, along with excessive privacy of PACs) have already undermined U.S. democracy. Current populist backlashes, against the corruption of the Trump government, may also undermine democracy, such as the proposed Elizabeth Warren tax, being not on profits, but upon passive wealth itself – a tax on non-revenue-producing illiquid assets (whose valuation is highly subjective), as in her statement to ‘tax the jewelry of the rich’ at 2% annually. Illiquid assets include great private libraries, great private collections of art, antiques, coins, etc. – constituting an assault on the private sector, that if successful, will weaken democracy by strengthening the confiscatory power of government. We could swing from current excesses of the right to future excesses of the left.”
Despite current trends, there is reason to hope for better days
Many of the experts in this canvassing see a complicated and difficult road ahead, but express hope for the future.
Cheryl B. Preston, an expert in internet law and professor at Brigham Young University Law School, said, “Innovation will bring risk. Change will bring pain. Learning will bring challenges. Potential profits will bring abuse. But, as was the decision of Eve in the Garden of Eden, we need to leave the comfortable to learn and improve. If we can, by more informed voting, reduce the corruption in governmental entities and control corporate abuse, we can overcome difficulties and advance as a society. These advances will ultimately bring improvement to individuals and families.”
John Carr, a leading global expert on young people’s use of digital technologies, a former vice president of MySpace, commented, “I know of no proof for the notion that more people simply knowing more stuff, even stuff that is certifiably factually accurate, will necessarily lead to better outcomes for societies. But I do harbour a hope that if, over time, we can establish the idea that there are places on the internet that are reliable sources of information, it will in the medium to longer term help enough people in enough countries to challenge local demagogues and liars, making it harder for the demagogues and liars to succeed, particularly in times of national crisis or in times when war might be on the visible horizon. I used to think that if the internet had been around another Hitler would be impossible. Recently I have had a wobble on that but my optimism ‘trumps’ that gloomy view.”
Mike Douglass, an independent developer, wrote, “There is a significant realization that a stampede to create connections between anonymous people and devices was a bad idea. It’s up to the technologists and – more importantly – those who want to make money out of technology – to come up with a more measured approach. There’s a reason why gentlemen obtained letters of introduction to other gentlemen – one shouldn’t trust some random individual turning up on your doorstep. We need the equivalent approach. I’ve no idea what new innovations might turn up. But if we don’t get the trust/privacy/security model right we’ll end up with more social media disasters.”
Hume Winzar, an associate professor and director of the business analytics undergraduate program at Macquarie University, Sydney, Australia, predicted, “With more hope than evidence, I’d like to think that reason will eventually overcome the extraordinary propaganda machines that are being built. When the educated upper-middle classes realise that the ‘system’ is no longer serving them, then legal and institutional changes will be necessary. That is, only when the managers who are driving the propaganda machine(s) start to feel that they, personally, are losing privacy, autonomy, money and their children’s future, then they will need to undermine the efforts of corporate owners and government bureaucrats and officials.”
Carolyn Heinrich, a professor of education and public policy at Vanderbilt University, said, “My hope (not belief) is that the ‘techlash’ will help to spur social and civic innovations that can combat the negative effects of our digitization of society. Oftentimes, I think the technology developers create their products with one ideal in mind of how they will be used, overlooking that technology can be adapted and used in unintended and harmful ways. We have found this in our study of educational technology in schools. The developers of digital tools envision them as being used in classrooms in ‘blended’ ways with live instructors who work with the students to help customize instruction to their needs. Unfortunately, more often than not, we have seen the digital tools used as substitutes for higher-quality, live instruction and have observed how that contributes to student disengagement from learning. We have also found some of the content lacking in cultural relevance and responsiveness. If left unchecked, this could be harmful for far larger numbers of students exposed to these digital instructional programs in all 50 states. But if we can spur vendors to improve the content, those improvements can also extend to large numbers of students. We have our work cut out for us!”
Heywood Sloane, entrepreneur and banking and securities consultant, wrote, “I’m hopeful that it will be a positive contributor. It has the ability to alter the way we relate to our environment in ways that shrink the distances between people and help us exercise control over our personal and social spaces. We are making substantial progress, and 5G technology will accelerate that. On the flip side, we need to find mechanisms and processes to protect our data and ourselves. They need to be strong, economic and simple to deploy and use. That is going to be a challenge.”
Pamela McCorduck, writer, consultant and author of several books, including “Machines Who Think,” commented, “I am heartened by the number of organizations that have formed to enhance social and civic organization through technology. In the field I follow, artificial intelligence, the numbers of professionals who take seriously the problems that arise as a consequence of this technology are reassuring. Will they all succeed? Of course not. We will not get it right the first time. But eventually, I hope.”
Yoshihiko Nakamura, a professor of mechano-informatics at the University of Tokyo, observed, “The current information and communication technology loses diversity because it is still insufficient to enhance the affectivity or emotion side of societies. In this sense I can see the negative side of current technology to human society. However, I have a hope that we can invent uses of technology to enhance the weaker side and develop tomorrow’s technology. The focus should be on the education of society in the liberal arts.”
Ryan Sweeney, director of analytics at Ignite Social Media, commented, “In order to survive as a functioning society, we need social and civic innovation to match our use of technology. Jobs and job requirements are changing as a result of technology. Automation is increasing across a multitude of industries. Identifying how we protect citizens from these changes and help them adapt will be instrumental in building happiness and well-being.”
Miles Fidelman, founder, Center for Civic Networking and principal Protocol Technologies Group, responded, “We can see clear evidence that the internet is enabling new connections, across traditional boundaries – for the flow of information, culture and commerce. It is strengthening some traditional institutions (e.g., ties between geographically distributed family members) and weakening others (e.g., the press). Perhaps the most notable innovation is that of ad hoc, network-centric organizations – be they global project teams, or crisis response efforts. How much of this innovation will make things better, how much it will hurt us, remains an open question.”
A technology developer active in IETF said, “I hope mechanisms will evolve to exploit the advantages of new tech and mitigate the problems. I want to be optimistic, but I am far from confident.”
A renowned professor of sociology known for her research into online communications and digital literacies observed, “New groups expose the error of false equivalence and continue to challenge humans to evolve into our pre-frontal cortex. I guess I am optimistic because the downside is pretty terrible to imagine. It’s like E.O. Wilson said: ‘The real problem of humanity is the following: We have paleolithic emotions; medieval institutions; and god-like technology. And it is terrifically dangerous, and it is now approaching a point of crisis overall.’”