Emlyn <emlyn@one.net.au> writes:
> I've read Berners-Lee saying that he never intended hand-coding of HTML, and
> that the first web-related application was a primitive WYSIWYG HTML editor
> that he built himself.
Yes, I think both of these are right. His original idea was that
people would not write HTML manually, but rather that what we know as
web browsers would have built-in authoring/editing facilities. He wrote
a system like this in Objective-C (an early object-oriented extension of
C) on the NeXT workstation. Later a company called Navisoft here in
Santa Barbara produced a similar product for PCs, but it was acquired by
AOL, which eventually killed the product. (Several of my friends managed
to retire at 30 with fortunes in AOL stock options, though.)
However, XML is intended to maintain the simplicity, readability and,
indeed, writability of HTML. And I note that one of the requirements
for XLinks is that they are human readable, which probably implies
human writable. So I think he has come around on this issue.
> The goals of machine readability, etc, etc, of XML are laudable. How long do
> you reckon they'll last in the face of big commercial interests & a zillion
> (well, many million) users of various flavours, with their own ideas about
> how they want to use this thing called the web?
Fragmentation is always a danger, but weighed against this is the
advantage of standardization. And of course XML is "eXtensible", so the
hope is that people can customize it to the degree they desire while
staying within the overall framework of compatibility.
The idea of the Semantic Web is that people, and especially machines,
are seeking information. Data, in this model, should be expressed in
"declarative" form. That is, the web pages should say what the data is,
what it means, what its relationships are, and not try to express what
you should do with it.
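As a toy illustration (the element names and values here are invented,
not taken from any real schema), a declarative fragment describes the
data itself:

    <book isbn="0-12-345678-9">
      <title>Weaving the Web</title>
      <author>Tim Berners-Lee</author>
      <price currency="USD">25.00</price>
    </book>

Nothing in it says how the book should be displayed; compare that with
HTML markup like <b>Weaving the Web</b>, $25.00, which only says how
the text should look.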
http://www.xml.com/pub/a/2000/12/xml2000/timbl.html describes a talk
by Berners-Lee in which he lays out his ideas a little more clearly.
Above the basic declarative data layer he projects higher layers
which support logical connectives, so that you could combine data from
different sources and draw logical inferences in a quantitative way.
These higher layers are pretty speculative, almost approaching
AI-on-the-web concepts like www.webmind.com.
But the basic techniques of using XML for the raw data and style sheets to
"prettify" it for human consumption are already implemented by the newest
browsers. This should work well for B2B collaborations or any situation
where you want to promote automated reading of your published data.
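For instance, hooking a style sheet to an XML document takes nothing
more than a processing instruction at the top of the file (the file
name here is just a placeholder):

    <?xml version="1.0"?>
    <?xml-stylesheet type="text/xsl" href="prettify.xsl"?>
    <catalog>
      ...
    </catalog>

A browser that understands the xml-stylesheet instruction renders the
document through the style sheet, while a program that wants the raw
data can simply ignore it and read the markup directly.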
I think it's likely to succeed.
Hal