
Why did Alan Kay say, "The internet was so well made, but the web was made by amateurs"?


OK, so I paraphrased. The full quote:

The Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free? The Web, in comparison, is a joke. The Web was done by amateurs. - Alan Kay

I am trying to understand the history of the Internet and the Web, and I find that statement hard to understand. I've read elsewhere that the Internet is now used for very different things than it was designed for, and that maybe this plays a role.

What makes the Internet so good, and what makes the web so amateurish?

(Of course, Alan Kay is fallible, and no one here is Alan Kay, so we can't exactly know why he said that, but what are possible explanations?)

*See also the original interview.*






Answers:


This is exactly the topic he goes into on the second page of the interview. It's not the technical flaws of the protocols he is complaining about, but the vision of the web browser's designers. As he put it:

You want it to be a mini-OS, and the people who did the browser mistook it as an application.

He gives some specific examples, such as the Wikipedia page on a programming language being unable to run sample programs in that language, and the lack of WYSIWYG editing, even though it was available in desktop applications long before the web existed. 23 years later, we are only just beginning to work around the limitations imposed by the original web browser design decisions.







In some ways he was right. The original (pre-specification) versions of HTML, HTTP, and URLs were designed by amateurs (not standards people). And there are aspects of the respective designs ... and of the subsequent (original) specifications ... that are (to put it politely) not as good as they could have been. For example:

  • HTML didn't separate structure/content from presentation, and it took a number of revisions ... and additional specifications (CSS) ... to fix this.

  • HTTP 1.0 was very inefficient, requiring a new TCP connection for each "document" retrieved (see the sketch after this list).

  • The URL specification was actually an attempt to retrospectively specify something that was essentially ad hoc and inconsistent. There are still gaps in the area of defining schemes, and the syntax rules for URLs (e.g. what has to be escaped where) are baroque.
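To make the HTTP 1.0 point concrete, here is a minimal sketch (Python's standard http.client module; the host and paths are placeholders) contrasting one-connection-per-document fetching with the persistent connections that HTTP 1.1 later introduced:

```python
import http.client

# HTTP/1.0-style fetching: every document pays for a fresh TCP handshake,
# because the connection is closed after each response.
for path in ("/index.html", "/style.css", "/logo.png"):
    conn = http.client.HTTPConnection("example.com", 80)  # new TCP connection each time
    conn.request("GET", path, headers={"Connection": "close"})
    conn.getresponse().read()
    conn.close()

# HTTP/1.1 keep-alive: one TCP connection is reused for all three requests.
conn = http.client.HTTPConnection("example.com", 80)
for path in ("/index.html", "/style.css", "/logo.png"):
    conn.request("GET", path)
    conn.getresponse().read()  # each response must be read fully before the next request
conn.close()
```

For a page with dozens of embedded resources, the extra handshakes add up quickly, which is exactly the inefficiency noted above.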

And if there had been more "professional" standards efforts back then, many of these "missteps" might not have been made. (Of course, we will never know.)

However, the web has succeeded splendidly in spite of these things, and all credit should go to the people who made it possible. Whether or not they were "amateurs" at the time, they are certainly not amateurs now.







It seems to be due to a fundamental disagreement between Alan Kay and the people (especially Tim Berners-Lee) who designed the Web about how such a system should work.

According to Kay, the ideal browser should really be a mini operating system with just one task: to safely execute code downloaded from the Internet. In Kay's design, the web is not made up of pages but of black-box "objects" that can contain any kind of code (provided it is safe). For this reason, the browser needs no built-in features of its own: it would not need an HTML parser or a rendering engine, since all of that would be implemented by the objects themselves. This is also why he doesn't seem to like standards: if the content is rendered not by the browser but by the object itself, no standard is required.

Obviously, this would be immensely more powerful than today's web, where pages are constrained by the errors and limitations of current browsers and web standards.

The philosophy of Tim Berners-Lee, the inventor of the web, is almost exactly the opposite. His document The Principle of Least Power outlines the design principles underlying HTTP, HTML, URLs, and so on, and points out the advantages of constraints. For example, a well-specified declarative language like HTML is easy to parse, and that is what makes search engines like Google possible. Indexing black-box objects is not really feasible on Kay's web, so the lack of constraints on the objects actually makes them less useful: how valuable are powerful objects if you can't find them? And without a standard notion of links and URLs, Google's PageRank algorithm could not work, nor, for that matter, could bookmarks.
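As a rough illustration of that point, here is a small sketch (using Python's built-in html.parser; the snippet being parsed is made up) of how easily a crawler can lift the link graph out of declarative HTML without executing any page code:

```python
from html.parser import HTMLParser

# Because HTML is a well-specified declarative format, a crawler can extract
# the link structure of any page without running the page's code -- the
# property that makes indexing (and PageRank-style ranking) feasible.
class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href" and value)

extractor = LinkExtractor()
extractor.feed('<p>See <a href="https://example.com/">an example</a>.</p>')
print(extractor.links)  # ['https://example.com/']
```

Nothing comparable is possible for an opaque object whose only interface is "run me".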

Another issue is content creation. Nowadays we have various tools, but from the beginning any amateur could learn to create an HTML page in a text editor. This is what kickstarted the web and made it spread like wildfire. Imagine if the only way to build a web page had been to start programming your own rendering engine; the barrier to entry would have been immense.

Java applets and Silverlight are somewhat similar to Kay's vision. Both systems are much more flexible and powerful than the web (you could implement a browser in them), but they suffer from the problems described above. And both technologies are basically dead in the water.

Tim Berners-Lee was a computer scientist who had experience with networks and information systems before he invented the web. It seems that Kay doesn't understand the ideas behind the web, which is why he believes the designers are amateurs with no knowledge of computer history. But Tim Berners-Lee was definitely not an amateur.







I read this as Kay being unfamiliar enough with the lower-level protocols to assume they are much cleaner than the higher-level web. The "designed by professionals" era he speaks of still had major problems with security (spoofing is still too easy), reliability, and performance, so work is still being done to tune everything for high-speed or high-packet-loss connections. Go back a little further and hostnames were resolved by searching through a text file that people had to distribute!

Both are complex heterogeneous systems with significant backward-compatibility issues whenever someone tries to fix a wart. Problems are easy to spot and hard to fix, and, as the number of failed competitors shows, it is surprisingly hard to design something equivalent without going through the same learning curve.

As a biologist might tell an advocate of intelligent design: if you look at either of these and see ingenious design, you are not looking closely enough.




Ah yes, I have asked Alan about this a few times, for example when he was in Potsdam, and on the fonc mailing list. Here is a more recent quote from that list which sums it up well:

After literally decades of trying to add more and more features and still not matching up to the software that ran on the machines the original browser was made on, they are slowly coming around to the idea that they should be safely executing programs written by others. Only in the last few years - with Native Client in Chrome - can really fast programs be downloaded safely as executables, without needing a SysAdmin's permission.

I understand his various answers to mean that web browsers should not display (possibly enriched) HTML documents, but simply run programs. Personally, I think he's wrong, even though I can see where he's coming from. We've had this before with ActiveX, Java applets, Flash, and now "rich" JavaScript apps, and the experience has generally not been good. My personal opinion is that most JavaScript-heavy websites are already a step back from well-done HTML websites, not a step forward.

In theory, of course, it all makes sense: trying to incrementally add interactivity to what is basically a document-description language is backwards, and is equivalent to adding more and more epicycles to the Ptolemaic system, when the "correct" answer is to recognize that (rich) text is a special case of a program, and that therefore we should just ship programs.

However, given the practical success of the WWW, I think it advisable to modify our theories instead of criticizing the WWW for not conforming to our theories.







It can't really be said that the Internet or the web was invented by amateurs or by professionals, because these fields were completely new: everybody was an amateur at Internet protocols before they were invented. By that standard, the Internet's inventors were amateurs too.

If we really wanted to pass judgment, the Internet wasn't that great either: IPv6 is needed. And it's not just about the address space; IPv6 has a new header with fewer, and different, fields.

Another big difference between the Internet and the web is how a programmer perceives them. A programmer rarely interacts with the Internet directly: from their point of view, you have IP addresses plus ports, and TCP, and you can rely on the packets being delivered. That's about it ... whereas the programmer interacts far more intensely with the web: HTTP methods, headers, HTML, URLs, and so on. It's natural to notice the limits of something that gives you many options more than the limits of something that gives you almost none. That said, I don't mean to suggest the Internet is simple.
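To illustrate how little of it a programmer actually touches, here is a minimal sketch (plain Python sockets; example.com and the request bytes are just placeholders) of essentially everything an application programmer sees of the Internet:

```python
import socket

# The Internet, as seen by an application programmer: an (address, port)
# pair and a reliable byte stream provided by TCP. Everything "web-like"
# (methods, headers, HTML, URLs) lives in the bytes we choose to send.
with socket.create_connection(("example.com", 80)) as sock:
    sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    response = b""
    while chunk := sock.recv(4096):
        response += chunk

print(response.split(b"\r\n", 1)[0])  # e.g. b'HTTP/1.1 200 OK'
```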

As for the greatness of these two technologies: the Internet is so highly regarded because it is a very scalable technology, and the idea of layering worked very well. At the lower levels you can use basically whatever technology you want (Wi-Fi, Ethernet, Token Ring, etc.), with IP as the standard intermediate protocol on which TCP and UDP sit, and on top of that you add whatever application protocol you like.

The greatness of the web is closely tied to the greatness of the Internet, since the web depends heavily on the Internet, with the TCP/IP stack underneath it. But I would say the Internet also depends on the web: the Internet existed for 20 years before the web and was fairly obscure, yet 20 years after the web it is ubiquitous, and that is largely thanks to the web.







I think he was pointing at something a little less harsh: TBL knew nothing about the hypertext work that had been done in the 1960s, so that work did not influence the design of the web. He often speaks of computing as a pop culture in which practitioners are ignorant of their history and "keep reinventing the flat tire".


The Internet has worked remarkably well as a prototype of the packet-switching concept pioneered by Baran, Pouzin, and contemporaries. Contrary to popular opinion, this does not mean that IPv4 as handed down is the perfect protocol architecture, or that IPv6 is the way to go. John Day, who was deeply involved in the development of the ARPANET and IP, explains this in his 2008 book Patterns in Network Architecture.

As for the web: in the words of Richard Gabriel, "worse is better". Tim Berners-Lee's own account, Weaving the Web, is a decent read. How the Web Was Born, by Gillies and Cailliau, is denser and less readable, but it has lots of detail and some fascinating connections to other developments in personal computing. I don't think Kay gives them enough credit.


I don't know; parts of the non-web Internet have some terrible warts too. Email predates the web and is part of the Internet, and while the standard is very open, it has required many bolted-on hacks to address (but not solve) the spam problem.




"Amateur" does not refer to a lack of programming skills, but rather to a lack of imagination.

The fundamental problem with Tim Berners-Lee's web is that it was never made for developers. (This is in stark contrast to Alan Kay's web.)

Tim's web was built for non-programmers, who would publish on the web directly by fiddling with files containing their journals/articles interspersed with the HT markup language. It's like 1980s WordPerfect and MS Word, except that you would type "<b></b>" instead of clicking an icon, and save it in the open ".htm" format instead of the proprietary ".doc" format. The invention here is the "<a>" tag, which lets these static journals/articles be linked globally.

And that's it; that's Tim's entire vision for his web: a mere global highway of interconnected static objects. If you had the money, maybe you could buy an editor like Dreamweaver, Nexus, Publisher, CityDesk (?), etc., which would type all those "<b></b>" tags for you when you clicked an icon.

...And we can see how his vision didn't work out as intended. In fact, there were glaring red flags from the very start that the world wanted far more than what Tim's vision offered:

  • Red Flag 1: The rapid rise of "Smart CGI" (PHP).

  • Red Flag 2: The rapid rise of "Smart HTML" (JavaScript).

These days there are even more red flags, such as the rise of Chrome OS, where the browser is the OS (exactly what Alan Kay intended the browser to be), and WASM / browser extensions.


Unlike Tim's web, Alan Kay's web is a dynamic web designed for programmers: a global highway of interconnected dynamic programs. Non-programmers who need a "page" would simply publish one by using a program on the web. (And that program, of course, would have been written by programmers, not by HTML tinkerers.)

This is exactly the status quo of Tim's web in the 2000s, but if we had had Alan's web, that would already have been the case in the 1990s, right when the web started.

Similarly, we wouldn't be getting programs like Steam, Visual Studio, Warcraft, and VMware on the web only in the 2040s; we would have them now, in the 2010s. (The delay of several decades comes from these programs having already been built for the OS-is-not-the-browser, which reduces the economic incentive to recreate them for the browser-is-the-OS.)

So this is what people mean when they say Tim Berners-Lee killed the real dynamic web by pushing his "shabby static web" onto the world. Ever heard the terms "Web 2.0" and "Web 3.0"? They would just have been called "the Web" if we had Alan's web instead of Tim's web. But Tim's web constantly needs to be revised, precisely because it is so static.

Obviously, not all hope is lost, since the web can be remade however browser vendors define it. But the whole point is that all of these things they keep "inventing" on the web were invented long ago. We could have had all of it today, not tomorrow.
