Re: LAW: Legality of deep linking to Web sites

From: KPJ (kpj@sics.se)
Date: Mon Jan 22 2001 - 06:25:00 MST


It appears as if Chris Russo wrote:

|A better solution would probably be to have some dynamic renaming
|system that frequently re-maps the files to new URLs. Any specific
|URL should only last a few hours or a day (depending on what you're
|trying to accomplish). That way, no one could really expect to
|maintain deep links to your pages. Everyone would have to start at
|your commonly-accessed gateway to get the current links.

You seem to think in terms of mapping file systems to published URIs.
Rather old-fashioned and obsolescent, in my not-so-humble opinion.

Compare this to creating a virtual URI space where the URIs do not map
directly to physical files and directories, but are instead translated into
access requests filtered through some form of dynamic generator program.

Example: <URL: https://www.sics.se/~kpj/kyber.shtml>
         <URL: http://www.sics.se/~kpj/kyber.shtml>

         This implements a file repository, simulating various operating
         systems, while adding an unseen layer of security for some areas.
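A minimal sketch of the idea as a CGI-style script in Python; the paths,
content, and layout below are invented for illustration, not what actually
sits behind kyber.shtml:

    #!/usr/bin/env python
    # Sketch of a "virtual URI space": the path the reader asked for is
    # never opened as a file, it is looked up in a table of generator
    # functions instead.  Paths and content here are made up.
    import os
    import sys

    def front_page():
        return "<html><body>Gateway page with the current links.</body></html>"

    def kyber_listing():
        # Build the simulated file-repository listing on the fly.
        return "<html><body>Simulated OS file listing.</body></html>"

    # Virtual paths -> generator functions; nothing maps to the file system.
    DISPATCH = {
        "/": front_page,
        "/kyber": kyber_listing,
    }

    def main():
        path = os.environ.get("PATH_INFO", "/")
        handler = DISPATCH.get(path)
        if handler is None:
            sys.stdout.write("Status: 404 Not Found\r\n")
            sys.stdout.write("Content-Type: text/plain\r\n\r\n")
            sys.stdout.write("no such virtual URI\n")
            return
        sys.stdout.write("Content-Type: text/html\r\n\r\n")
        sys.stdout.write(handler() + "\n")

    if __name__ == "__main__":
        main()

The point is simply that the published URI names a request to the generator,
not a file, so the server is free to change what (if anything) a given URI
means at any time.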

Notes:

a. Currently, the example cruft above uses semi-permanent URIs, but it is
   quite trivial to make URIs change by the hour if one really wants to
   (a sketch of one way to do that follows these notes).

b. And if one really wants to make certain the reader gets to see all the
   important adverts, one could simply add them to the content generated
   for each URI.
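
For note (a), here is a rough sketch in Python of one way to make URIs roll
over every hour, by embedding a keyed hash of the path and the current hour
in each published link. The secret, lifetime, and URI layout are all
invented for illustration:

    import hmac, hashlib, time

    SECRET = b"server-side secret"    # hypothetical; never published
    LIFETIME = 3600                   # seconds a generated URI stays valid

    def token(path, when=None):
        # One token per path per hour; it changes when the hour rolls over.
        bucket = int((when if when is not None else time.time()) // LIFETIME)
        msg = ("%s:%d" % (path, bucket)).encode("utf-8")
        return hmac.new(SECRET, msg, hashlib.sha1).hexdigest()[:12]

    def current_uri(path):
        # What the gateway page would emit as the "deep" link right now.
        return "http://www.sics.se/cgi-bin/get/%s/%s" % (token(path), path)

    def is_valid(tok, path):
        # Accept the current hour and the previous one, so links do not
        # break at the exact moment of rollover.
        now = time.time()
        return tok in (token(path, now), token(path, now - LIFETIME))

    if __name__ == "__main__":
        path = "~kpj/kyber"
        print(current_uri(path))            # what the gateway would publish now
        print(is_valid(token(path), path))  # True for this hour and the next

Only the gateway page, which recomputes current_uri() on every visit, ever
holds working links; a deep link copied somewhere else stops validating
after an hour or two.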


