17 Comments
This is done for scientific
Submitted by jbreen@mit.edu on
This is done for scientific integrity and academic integrity, so why not include educational tools to address the personal ethics and legal obligations of technology empowerment?
This could be done very badly
Submitted by davidad@mit.edu on
This could be done very badly. The announcement of a mandatory course on technology ethics as part of the response to the Swartz Report would be interpreted by many as evidence of an institutional crackdown. The message must be "We won't always be able to protect you, so here's what you need to know," rather than "Don't embarrass the Institute and put us in awkward legal positions! Follow these rules of acceptable conduct, or else we will exhibit undefined behavior."
I think this would be best done under the aegis of the MIT Law Center as proposed in Question 4 (and, as I suggested in response to that question, headed by Joi Ito). In my view, no mandatory class should be created, but the proposed Law Center should produce a one-page handout to be included in the welcome packets of all incoming students, and an elective course for those interested would likely be worthwhile.
I think that for some majors
Submitted by descioli@mit.edu on
I think that for some majors/courses there should be a mandatory ethics and law component. Technology is only improving more rapidly, and I believe that we, as the people who make the discoveries and truly understand the science behind them, are the most capable of making the policy decisions that govern them.
There has been a realization in recent years that communication is an essential component of an engineering education, and there has been a concerted effort to teach the communication skills necessary to effectively present and discuss your ideas. For example, 6.UAT, UPOP, and the GEL program are all MIT programs that focus on communication and leadership. I believe it is just as important to include a legal component in an engineering education. It is clear that the law is not keeping up with the rate of technological development. The only way to fix that is to study where the law has failed and is failing, and to understand the shortcomings from a formal, legal standpoint. A groundwork understanding of the law will enable MIT graduates to have an effective impact on existing and future policy.
Why not develop a series of
Submitted by cajones@mit.edu on
Why not develop a series of offerings -- as intimate seminar discussions -- staged from different perspectives, and encourage (rather than require) students to take them in freshman year? I believe hacker culture needs a better understanding of the history of technological change and principled responses to it. Civil disobedience is a right and in some cases a profound duty, yet few understand its history (Thoreau went to jail, as did King and Gandhi). Contributing disciplines could include Sloan, STS, History, Philosophy, the Media Lab, and others. Visiting faculty from area law schools should be actively solicited for participation in these subjects.
It will, whether it should or
Submitted by boneye on
It will, whether it should or not. By tacitly condoning the stop-and-frisk of all network users without a subpoena, MIT has already addressed the personal ethics and legal obligations of technology empowerment: you have allowed the Institute to be empowered and all its members to be weakened. No thanks for that weak, lily-livered response; I expect better and less naive thinking in the future.
One of the most surprisingly
Submitted by sternlight on
One of the most surprisingly useful courses I took as a Freshman was "Athens in the Fifth Century B.C." It opened my eyes to life-long issues of ethics, governance, and culture. MIT has always had an important role in the shaping of the ethical consciousness of its students. This role needs to be rethought and updated to bring core issues into the ongoing context. I recall an EE pedagogical technique called the Varistor--a generalized component that taught one to handle future, unknown challenges. What is needed is the ethical courseware equivalent of the Varistor--a set of general ethical analysis principles that will serve the student throughout his professional life.
Gyristor
Submitted by sternlight on
Gyristor
This could be done very well
Submitted by jimad on
This could be done very well, or it could be done very badly. Doing it well would be to show the conflicts inherent in these laws and technical tradeoffs, à la Michael Sandel. Doing it poorly would be MIT corporate big brother telling people what they can or cannot do and how they must act. Doing it well would be encouraging people to think deeply and to dig deeply to live morally meaningful lives -- whether or not those lives match up one-to-one with current US legal doctrine.

Was Aaron Swartz engaging in a small act of civil disobedience or a large-scale act of property theft? Were his actions justified? Were MIT's actions justified? Frequently we all take actions in support of personal profit, or personal intellectual or political goals, that ought to make one morally queasy.

But parts of the question -- "personal ethics," "legal obligations," and "technology empowerment" -- are already reaching for a conclusion. "Personal ethics" are not distinguishable from "institutional ethics" or "societal ethics"; teaching to think small ["personal"] is teaching to fail. "Legal obligations" include the requirement to oppose hurtful laws. And "technology empowerment" needs to consider the opposite: when technology acts improperly to disempower and enslave.
Yes. The ethical uses of the
Submitted by buratti on
Yes. The ethical uses of the fruits of technology and science didn't start with the internet. These questions have been with us since the first caveman or cavewoman built an arrowhead, and they should be part of the education of any MIT student.
The communities at MIT that
Submitted by duffield@mit.edu on
The communities at MIT that regularly encounter ethical issues already make and teach ethical solutions.
MIT should not attempt to teach legal obligations to those who don't feel quite so obligated.
Yes, but I doubt many
Submitted by jjb9_85 on
Yes, but I doubt many students would be interested. Young minds do not want to be encumbered by case studies and rules of conduct. Engineers at large are short on the liberal arts education that might give them a sense of ethics and humanity. A general sense of right and wrong should be sufficient for individuals to make the right choices.
Special courses could be of
Submitted by hiflomen on
Special courses could be of interest academically, but the real way to inculcate consciousness of the importance of ethics is to include it in every single course. Whenever a new technology with an ethical implication is being presented, that implication should be mentioned and discussed. If time cannot be taken from the teaching of the technology, the issue should be recommended for student discussion outside of class.
Surely the ethic around
Submitted by pwnel@mit.edu on
Surely the ethics around physical hacks are already well established amongst the MIT community (http://hacks.mit.edu/misc/ethics.html). Should we as the MIT community not develop our own informal ethic that goes beyond just pranks and covers broader aspects of rule-breaking for the sake of inquiry and learning?
Every student should be
Submitted by budtripp on
Every student should be obliged to think about ethics, as they will be faced with ethical decisions over and over again in their personal and professional lives. They should not be taught any particular ethical standard, but they should understand the principles. Students will also have to navigate the legal minefield posed by too many, often ill-considered, laws pertaining to the Internet and intellectual property (the landscape of patents is a mess). Helping them understand the laws and the patent system would help them in their personal and professional lives. Finally, giving them ideas as to how to influence the creation and modification of laws and standards might be helpful. I envision a course that covers ethics, law (particularly as applied to the Internet and intellectual property), and the way laws and standards are created, as a way to produce alumni who are more empowered to deal with the world they will live and work in.
I'm a PhD student at the
Submitted by jnmatias@mit.edu on
I'm a PhD student at the Media Lab. Over the last few weeks, I have been listening to grad students, faculty, staff, and alums about the report. (I'm trying to link up people interested in responding to the report; contact me if you're interested.)
Frankly, this particular question has puzzled many people because it puts the emphasis on student learning, when the report seems to be more about the personal ethics and legal obligations of MIT and its administration.
That said, I think there are several meaningful responses within the area of student learning:
1. MIT doesn't have anything like Harvard's Cyberlaw Clinic, where anyone can get pro bono legal help on technology questions. Lots of students and research projects end up using their services. We should consider publicizing the Cyberlaw Clinic's work more broadly within MIT and look into establishing something similar.
2. Creating and promoting classes and resources currently available within MIT, including IAP offerings in this space. A session on being disruptive without getting in trouble, or on technology innovation and the law, could attract a lot of would-be entrepreneurs and politically interested students.
3. Several people have talked with me about what it means to learn to care about the ethics, politics, and law of technology. Initiatives should be considered that raise the profile of these questions across all of MIT -- and not in a manner that ends with intellectual stimulation, but one that encourages people to roll up their sleeves.
Yes.
Submitted by petey@mit.edu on
Yes.
Else we are arming people with guns and not giving so much as a safety course.
One of MIT's great legacies of the 1960s was the Union of Concerned Scientists (http://en.wikipedia.org/wiki/Union_of_Concerned_Scientists). It was understood, at the time, that MIT was playing a non-neutral role in weapons proliferation, and that building inertial guidance systems and bombs were not "neutral", but instead highly political, ideological professional choices.
In an era of big data, of algorithms, of building robots of code that do work for us (and in the process categorize people, trade stocks, suggest financial penalties, guide police patrols, etc.), it is absolutely important that the ethics of technology be foregrounded in all parts of our study. I don't mean "ethics" with a particular norm or politics in mind. Instead, I mean training students to be thoughtful about the tools they make and use, as opposed to treating the tools as somehow neutral or harmless abstractions.
I am adding this comment on
Submitted by jnmatias@mit.edu on
I am adding this comment on behalf of an anonymous commenter within the Media Lab who did not want to be identified:
"yes, but in a way which shows the law as malleable and constructed, just as tech is. Something to be challenged. And the ethics of being a continuing contributor as a part of a group."