Is the creator of the written word responsible for all the terrible things ever written? Is the caveman who discovered fire responsible for all the arson of the past few thousand years? And what about the inventor of the gun: is that person responsible for the tragedies their invention has caused?
Most people would say no: of course the person or civilization that created language or writing isn't responsible for the unintended ways people have used it, and one could argue convincingly that the good done by language far outweighs the bad. But what about when the creation was intended to be used destructively? The creator of a lethal weapon might say it is acceptable to use their technology on its intended targets, but unless they restrict who has access to it, they have no real control over who uses it, or for what purpose.
The difference between the creation of language and the creation of weaponry is that the potential ramifications of the gun are clearly known to its inventor. The individual who creates a weapon to be used against enemy soldiers knows it is a destructive tool. If that tool is then used against people who were never meant to be harmed, does some of the responsibility not fall on the creator?
Tools in fields like security and Open Source Intelligence (which, by definition, relies on openly available information) aim to make it easier to find relevant information, often about a specific individual. Many of these tools focus heavily on social media, which is deeply tied to people's personal lives. Because of this, there is a dilemma when deciding whether to publicly release such scripts and tools.
- A: Release the tool and hope that by doing so you raise awareness of what kind of information is publicly available, that people use it for good purposes (as with TraceLabs), and that no one uses what you've provided maliciously.
- B: Do not release the tool, retaining control over what your work is used for, while also greatly limiting its potential for positive impact.
I recently wrote a tool that some people might have put to positive use, but I realized that if I open-sourced it or released it in any manner, I would have no control over whom it was used against, something I was not comfortable with in this circumstance. Because of this, I opted for option B. This is, however, very much a case-by-case decision.
I have also been writing a tool, Traverse, to automate the discovery of related sites that share the same AdSense or Analytics IDs, JavaScript, HTML, and so on, with the goal of quickly spotting relationships between news sites or blogs that may be spreading false information across multiple hosts. In this instance, I believe the positive impact outweighs any negative impact it may have, and the tool is entirely open source.
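The core idea behind that kind of discovery can be sketched in a few lines: scrape each site's HTML, extract any tracking or ad identifiers, and then invert the mapping so that identifiers shared by multiple sites surface the relationships. The sketch below is not Traverse itself; the site names are hypothetical, the regexes cover only a few common ID formats, and the pages are passed in as strings rather than fetched over the network.

```python
import re
from collections import defaultdict

# Simplified patterns for common tracking identifiers embedded in page HTML.
# (Illustrative only; real pages bury these inside JS snippets and tags.)
ID_PATTERNS = [
    re.compile(r"UA-\d{4,10}-\d{1,4}"),   # classic Google Analytics property ID
    re.compile(r"G-[A-Z0-9]{6,12}"),      # GA4 measurement ID
    re.compile(r"ca-pub-\d{10,16}"),      # AdSense publisher ID
]


def extract_ids(html: str) -> set[str]:
    """Return every tracking/ad identifier found in the raw HTML."""
    found: set[str] = set()
    for pattern in ID_PATTERNS:
        found.update(pattern.findall(html))
    return found


def group_by_shared_ids(pages: dict[str, str]) -> dict[str, set[str]]:
    """Map each identifier to the set of sites whose HTML contains it,
    keeping only identifiers that link two or more sites."""
    index: dict[str, set[str]] = defaultdict(set)
    for site, html in pages.items():
        for ident in extract_ids(html):
            index[ident].add(site)
    return {ident: sites for ident, sites in index.items() if len(sites) > 1}


# Hypothetical example: two "news" sites sharing one Analytics property.
pages = {
    "news-a.example": "<script>ga('create','UA-12345678-1');</script>",
    "news-b.example": "<script>ga('create','UA-12345678-1');</script>",
    "blog-c.example": '<ins data-ad-client="ca-pub-1234567890123456"></ins>',
}
print(group_by_shared_ids(pages))
```

Here the shared `UA-12345678-1` property ties `news-a.example` to `news-b.example`, while the lone AdSense ID on `blog-c.example` is filtered out because it links only one site.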
Whether or not you agree with me, I hope everyone who reads this can accept the sentiment that before something is released, its potential impact, both positive and negative, should always be considered.