Hi Lee, how's it going?
It seems you might have missed the fact that my last posting was a reply to
a message from Terry Donaghe-- I'll fill in the particulars:
> > This argument is getting boring... "You can't own it, you can't
> > track it, you can't control it, so why even bother?"
> > Which is a defeatist attitude. There are two distinct issues here:
> > a) is it -desirable- to own it? and b) is it -possible- to own it?
> > Arguing against 'b' is not a good way to argue against 'a.' If it
> > *is* desirable then we should get more imaginative about making it
> > possible. If it's *not* desirable, then you should argue that position.
> You are apparently reading different arguments than I am.
Here's the part of Terry's message to which I was responding:
"...recording industry (and the motion picture industry and the book publishers,
etc. etc.) can whine all they want, but their claim is akin to me claiming
that I own all the air in a 20 foot radius about my body - I can't keep
anyone from breathing it, I can't track everyone who is breathing it or has
previously breathed it, and I don't even know where all of it is at any..."
Basically, it's an argument from impossibility, to which I was objecting.
> Clearly, I don't think that, and I don't use
> inevitability as a reason for supporting my ethical point of view.
No, -you- didn't. That's good, because, as you've stated, you (and I) are
already in a war against some pretty big so-called 'inevitabilities.'
> > If we decide we want to be able to protect something that comes out of
> > our own head, then we look for ways to do so.
> I've never argued against that either, so long as the methods are
> technological rather than government force.
Okay, so it looks like we agree to some extent as to the ends, but not the
means. I don't really have a problem with operating under a social contract
that forces me to relinquish *some* of my freedom if it creates an operating
environment that is superior. In fact, I take it as a given that I cannot
expect absolute freedom if I wish to continue to interact with the human
race. I'm actually in favor of using both technology and social engineering
to create those environments. My suspicion is that neither alone will be
sufficient for long.
> > You claim that encryption will be only a minor annoyance
> > in the future: if that's true, then we've got some big issues.
> Who has claimed this? Certainly not I, or anyone I remember in
> this discussion.
Again, I quote from Terry's message:
"As bandwidth increases along with similar increases in storage space,
processing speed and number of connected users, all digital information
located anywhere on the Internet will eventually be available to anyone at
anyplace - at no or negligible cost. Encryption can just slow the process
down a bit."
I.e.: insecurity is inevitable, and encryption is not going to be a
significant obstacle.
> > They should be able to make the determination of what stays
> > private and what becomes public.
> Absolutely. But once they make it public, /they shouldn't be able
> to change their decision/ and enforce that change of mind with guns.
Hehehe... guns, huh?
I don't see things quite as black and white. Why must something be either
absolutely controlled or totally free? Why can't we operate under social
conditions that allow me to maintain ownership over stuff that comes out of
my brain, while still sharing it with lots of people? Or does this sound
like I want to have my cake and eat it too?
> > And in the absence of good technological security, do I think that
> > we should collectively agree to protect that information through legal
> > means? Absolutely.
> I don't like arguing counterfactuals since there /will/ be technological
> means, but I'm still not clear what you mean by "protect" here.
Okay, I -hope- there will be technological means also, but the point is that
currently, there aren't. This is not a counterfactual.
> mean some way of creating the legal fiction that a certain piece of
> information has never been "released" (an odd concept, but one I might
> be able to support) or the state of current IP law, which tries to have
> it both ways, both publicly released and "protected" from competition?
Uh, yeah, the second one.
Am I the only one who is not disturbed by this apparent contradiction?
> > People always get up in arms about this: "I don't want government
> > bureaucracy interfering with my freedom!" But humans have always
> > understood the importance of social contracts, ever since the first time
> > we figured out that the right of your fist ends where the right of my
> > nose begins.
> How is a social contract necessary to redress torts like this? Doesn't
> civil law (which can be privately produced) address this well?
Maybe it can. You'll have to give me an example of how this might work in
practice.
> > I anticipate us requiring more intricate rule sets to enable us to
> > successfully interact with each other in the future and this is simply a
> > testament to the increasing complexity and depth of those interactions.
> Complexity does not /necessarily/ indicate progress. Sometimes simpler
> is better. Sometimes it's not. But that statement, whether made by you
> or me, has no content until you talk about /specifically/ which
> complications you favor and /why/.
It's no mystery-- the 'complexity' we are discussing is the body of law
surrounding intellectual property.
I currently favor them to the extent that they grant the creator ownership
over his/her work, which to me represents the empowerment of the creator.
Additional complexity might result from how that creator chooses to
relinquish or impart that ownership, partially or completely, etc. and the
various benefits that will accrue from such relinquishment. Again, I say,
if you don't want to pay me for my work, what makes you think you have a
right to it? Shouldn't the creator have the freedom to determine how those
rights will be allotted? I don't care if the arrangement is intricate, I
just think the default state should favor the person who came up with the
work in the first place.
This archive was generated by hypermail 2b29 : Mon Oct 02 2000 - 17:35:16 MDT