Technology Ethics (eBook)
146 pages
John Wiley & Sons (publisher)
978-1-5095-6406-4 (ISBN)
Technologies cannot simply be understood as neutral tools or instruments; they embody the values of their creators and may unconsciously reinforce existing inequalities and biases. Technology Ethics shows how responsible innovation can be achieved. Demonstrating how design and philosophy converge, the book delves into the intricate narratives that shape our understanding of technology - from instrumentalist views to social constructivism. Yet, at its core, it champions interactionalism as the most promising and responsible narrative. Through compelling examples and actionable tools, this book unravels the nuances of these philosophical positions and is tailored to foster responsible innovation and thoughtful design. As our everyday lives further intertwine with technology, understanding and implementing these design principles becomes not just beneficial, but essential.
This concise and accessible introduction is essential reading for students and scholars of philosophy of technology, engineering ethics, science and technology studies, human-machine communication, as well as policymakers.
Steven Umbrello is a postdoctoral research fellow at the University of Turin.
1 Technology and Society
In 1980 Langdon Winner published what would become a foundational work in the burgeoning field of philosophy of technology. In his paper, “Do Artifacts Have Politics?”, Winner described how the overpasses spanning the parkways of Long Island, New York, were built intentionally low (Winner 1980). The reason for this was that Robert Moses, the American urban planner responsible for planning much of New York’s metropolitan area throughout the early and mid-twentieth century, purposefully designed them low to ensure that poor and lower-middle-class families (mostly African Americans and other minority groups) could not access Jones Beach, one of his prized strands. Moses knew that these groups had limited access to cars and relied on public transit, and the low-hanging overpasses could not accommodate tall city buses. The parkways thus created an infrastructural barrier limiting access to Long Island’s beaches to only those who could afford cars (Caro 1975). Moses’ racist values were thereby embodied in the technology, low-tech as it may be, of the parkways, and this is exactly what Winner showed: technologies are not merely tools; they embody values.
Since Winner’s work, philosophy of technology has come a long way, and it is now standard to view technologies not as isolated artifacts, but as infrastructures, systems, or, more specifically, as sociotechnical systems. But what exactly does that mean? What does it mean to understand technology as somehow being “sociotechnical”? In both academic and everyday circles, people generally talk about technology in (at least) one of three ways. The first is to conceive of technology purely as a tool or instrument. Usually referred to as instrumentalism, such views are often pushed by those who wish to tout the benefits of a given technology while downplaying possible negatives. A notable exemplar is the oft-quoted motto of American gun rights activists: “guns don’t kill people; people kill people.” The second way to construe technology is as being purely deterministic. This position, known as technological determinism, holds that both human action and our social world are determined by technology, a view nicely illustrated in the popular cyberpunk video game Deus Ex: Mankind Divided, where the hashtag #CantKillProgress is repeatedly used to show there is no way to stop the inevitable march of technology and its societal consequences (Deus Ex 2011). The third way of looking at technology is to understand it as socially constructed. This position, known as social constructivism, sees technology as nothing other than the product of human actions; humans, therefore, are completely responsible for what technologies are and what they do. Each of these narratives sees continual propagation in both popular culture and academia, but do they accurately capture what technologies really are?
Robert Moses’ bridges show that technologies can both instantiate values and be shaped by them. Moreover, technological limitations can impact how values are embodied in technologies and may alter the very values themselves; interaction effects may stack, interfere with one another, or shift the course of design. All in all, it seems plain that technology is not as simple as any of the single conceptions above would have us believe. Rather, sociotechnicity is a rich yet complex topic in constant development, referring to the dynamic interaction between technologies and people, which together form a complex infrastructure (Ruth and Goessling-Reisemann 2019). This means that technologies are not isolated objects. Instead, they are connected systems, part of a larger network of other technologies and people. This sociotechnical understanding of technology highlights a combination of instrumentalism and social constructivism, and represents what some scholars call interactionalism. Fundamental to interactionalism is the understanding that technologies are in constant and dynamic interaction with other technologies and people.
It may go without saying, but it is also worth making clear, that technologies provide us with a host of benefits, and we should not automatically assume that all technologies embody disvalues like Moses’ racism in his bridges. That example is used to demonstrate that technologies are characterized by the values that they embody and that those values have material impacts on the world and our future alongside them. However, as the world changes, those impacts may change as well; as cars became more affordable, the groups Moses hoped to keep out became more and more able to pass under his parkways and access Long Island’s beaches. How a technology embodies a value, therefore, changes over time. This further illustrates how technologies are interactional, part of a larger environment of relationships with people and other technologies. Each technology is sure to be designed for an explicit purpose, but it will also interact with other technologies, forming a network of shifting relationships that is important to understand fully if we are to ensure that we design our technologies for good.
Focusing on the values behind development can also be crucial for identifying when a design is failing to fully live up to those values. As an example, artificial intelligence (AI) technologies can illustrate with distressing clarity what can happen when core human values are not clearly and explicitly designed for (Coeckelbergh 2020). IBM, for instance, spent $62 million to develop its famed Watson AI system to help provide medical doctors with cancer treatment advice (Ross and Swetlitz 2018). However, when tested in real-world settings, the system often recommended “unsafe and incorrect” cancer treatments, such as recommending medications that would aggravate, rather than help, patients with serious bleeding. Because the data used to train the system was mostly hypothetical rather than real, the system made poor recommendations. Documents revealed that the decision to use hypothetical clinical scenarios rather than the statistical data of real patients and cases was a consequence of training the system according to the preferences of doctors rather than the big data available in healthcare, presumably so that the designers could implement the system quickly. Accuracy and safety were obviously not the values explicitly designed for in this system, leading to potentially lethal consequences. There are, moreover, numerous examples where systems have, as a function of design, not only made errors but reinforced existing problems. This is what happens when technologies are not approached from an applied ethics perspective, when we do not look at them as interactional, paying heed to how their various facets impact on one another. Good intentions are not enough; good design is better.
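To make the training-data point concrete, the following is a minimal, purely hypothetical sketch in Python (it is not IBM Watson's actual pipeline; the tumor labels, drug names, and contraindication table are invented for illustration). It shows how a recommender fitted only to synthetic "textbook" cases can confidently suggest a treatment that is unsafe for a real patient whose condition the training data never covered.

```python
# Hypothetical illustration: a treatment recommender trained on synthetic cases
# that never include patients with severe bleeding. All names are invented.
from collections import Counter

# Synthetic "textbook" training cases, written to showcase a drug's benefit.
synthetic_cases = [
    {"tumor": "lung", "severe_bleeding": False, "best_treatment": "drug_A"},
    {"tumor": "lung", "severe_bleeding": False, "best_treatment": "drug_A"},
    {"tumor": "lung", "severe_bleeding": False, "best_treatment": "drug_B"},
]

def train_majority_recommender(cases):
    """Pick the most frequently 'best' treatment per tumor type - a crude
    stand-in for a statistical model, but enough to show the coverage problem."""
    by_tumor = {}
    for case in cases:
        by_tumor.setdefault(case["tumor"], Counter())[case["best_treatment"]] += 1
    return {tumor: counts.most_common(1)[0][0] for tumor, counts in by_tumor.items()}

recommender = train_majority_recommender(synthetic_cases)

# A real-world patient with a condition the synthetic data never represented.
real_patient = {"tumor": "lung", "severe_bleeding": True}
recommendation = recommender[real_patient["tumor"]]

# Assume (hypothetically) that drug_A is contraindicated for severe bleeding.
contraindications = {"drug_A": "severe_bleeding"}
risk_factor = contraindications.get(recommendation, "")
unsafe = bool(real_patient.get(risk_factor, False))

print(f"Recommended: {recommendation}; unsafe for this patient: {unsafe}")
# Output: Recommended: drug_A; unsafe for this patient: True
```

Because safety-relevant cases were never represented in the training data, nothing in this toy pipeline can flag the contraindicated recommendation; on a vastly smaller scale, this is the kind of gap the reporting on Watson describes.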
This Book
Technologies, arguably, are an inextricable part of what characterizes human beings, and they are certainly here to stay. Likewise, we are currently experiencing an almost dizzying boom in information and communication technologies (ICT) and artificial intelligence systems that are increasingly difficult to understand (Ihde and Malafouris 2018). If technologies embody the values of their creators, whether their creators intend them to or not, that means that we exert a degree of control over how those technologies impact on our world and the future. This is a hopeful prospect.
This book explores the nuances of how our different sociotechnical systems, systems we often overlook and take for granted, influence and are influenced by our actions. It aims to give the reader a clear overview of how technological design has been traditionally handled, how and why philosophy has become so important in design, as well as the various approaches for actually doing the dirty work now so that we don’t suffer the consequences later. More broadly, this book will introduce philosophical concepts and positions as they relate to how we understand technologies and our relationship with them, while also showing how important it is for engineering ethics that we have an accurate and holistic understanding of technology.
Towards this end, this book will explore some of the main historical and current views of technology, as well as connect philosophical concepts to practical applications. This will help guide readers in understanding the importance of engineering ethics, that is, understanding and promoting the ethical practice of engineers (Harris et al. 2013). Given the ubiquity of technologies in our hyperconnected world, and given the role that engineers play in the creation of those technologies, understanding and promoting engineering ethics is an important goal. Doing so requires people from various disciplines and fields, like philosophy, public policy, and, of course, engineering, to come together. Huge investments at regional levels, such as those of the European Union, moreover demonstrate the overall interest in promoting this practice.
Focusing on what technology is and what engineers can do to ensure that technologies are designed and developed ethically means that we can focus more on pressing real-world issues that come as part and parcel of technologies, and less on the techno-utopian or techno-dystopian narratives that have been dominant in both public and academic spaces. Many scholars who have directed their energy toward engineering ethics have found that those hyperbolic debates often come at the opportunity cost of more proximal issues that contemporary technologies present and that need immediate attention, like...
| Publication date (per publisher) | 10.6.2024 |
|---|---|
| Language | English |
| Subject area | Humanities ► Philosophy ► General / Reference works |
| | Technology ► Construction |
| Keywords | Automation • Big Tech • Communication & Media Studies • Communication & Media Studies Special Topics • design ethics • Engineering Ethics • ethics • Human-Computer Interaction • Human-Machine Communication • Philosophy • Philosophy of Technology • responsible technological innovation • Science and Technology Studies • Steven Umbrello • technological bias • Technological Development • Technology Ethics |
| ISBN-10 | 1-5095-6406-3 / 1509564063 |
| ISBN-13 | 978-1-5095-6406-4 / 9781509564064 |
Copy protection: Adobe DRM
File format: EPUB (Electronic Publication)