Comment Re:Reference implementation ? Remember Amaya (W3C) (Score 1) 150

Amaya: The last of the web authoring browsers. I remember it fondly.
It isn't fast, but it can do everything, as long as "everything" doesn't include XHTML2 and HTML5.

In practice, Chrome is the reference implementation. Whenever Chrome breaks compatibility, it is every other browser that is broken.
The W3C was created to prevent exactly that from happening.

Which is why the WHATWG is now the officially unofficial web standards body. It documents Chrome in a way that makes it look as if it weren't all about Chrome. They call it a "living document", which means there will never be a finished standard for HTML5. Shame, because there are some useful things in there, like definition lists.

Comment Re:Inertia and Too Big to Change (Score 4, Informative) 150

we're stuck with a 26 year old web html / javascript / css / connectionless protocol technology stack

HTTP is not connectionless; it is based on TCP, which emulates telephone connections.
HTTP/1.1 has support for re-using an open TCP connection.
(HTTP/3 uses QUIC instead of TCP, which multiplexes streams over a single connection.)
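For illustration, a minimal Python sketch of that connection re-use (example.org is just a stand-in for any server that honours persistent connections): two requests travel over the same TCP socket, so only the first one pays for the handshake.

    import http.client

    # Minimal sketch: two HTTP/1.1 requests over one TCP connection.
    conn = http.client.HTTPSConnection("example.org")

    conn.request("GET", "/")
    first = conn.getresponse()
    first.read()                        # drain the body so the socket can be reused

    conn.request("GET", "/robots.txt")  # same TCP connection, no new handshake
    second = conn.getresponse()
    print(second.status, second.getheader("Content-Type"))

    conn.close()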

HTTP is stateless, or at least it's supposed to be. It's a request-response protocol for documents. For that purpose, it has largely replaced Gopher and FTP.
It has the advantage of reporting the MIME type of the document.

HTTP and HTML date from 1989, which makes them a little more than 26 years old. Closer to 36, actually. (I guess the noughties never happened?)
HTTPS dates from 1994, JavaScript dates from 1995, and CSS dates from 1996.
HTTP/1.1 dates from 1997.

none of the largest companies and stakeholders with a voice are willing to propose a significant replacement

There is the Gemini protocol, which has all the advantages of HTTP(S) but is much cleaner, simpler, and faster.

Google does not seem to like it.

Anyway, neither HTTP nor Gopher nor Gemini is very good at being a hypertext protocol. They don't do versioning, they don't do back-references, and what support there is for distributed caches in HTTP ("proxies") doesn't work very well - it is designed for reducing traffic, not for providing redundancy. (BitTorrent does that better.)
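As a rough Python sketch of that traffic-reduction focus (the URL and the ETag value are made up for illustration): a cache revalidates with a validator header and may get a cheap 304 back, but the origin server still has to be reachable, so none of this adds redundancy.

    import urllib.error
    import urllib.request

    # Sketch of HTTP cache revalidation with a (made-up) ETag validator.
    req = urllib.request.Request(
        "https://example.org/",
        headers={"If-None-Match": '"some-previously-cached-etag"'},
    )
    try:
        with urllib.request.urlopen(req) as resp:
            print(resp.status, "- fresh copy,", len(resp.read()), "bytes transferred")
    except urllib.error.HTTPError as err:
        if err.code == 304:
            print("304 Not Modified - keep the local copy, no body transferred")
        else:
            raise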

The optimization of compressing web content and combining HTTP requests is a band-aid

Support for gzip in connections is very useful. It means that compressed documents can keep their Multipurpose Internet Mail Extensions (MIME) type, rather than the server reporting application/gzip with no clue what to do with it. And text compresses very well.
And re-using existing connections reduces latency, because you don't need a three-way handshake for every document.
Those are solutions, not band-aids, but you are correct in that they don't solve problems other than the ones that they solve.
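A small Python sketch of that negotiation (again with example.org standing in for any server that supports it): the client advertises gzip, the server labels the compression in Content-Encoding, and the Content-Type still says what the document actually is.

    import gzip
    import urllib.request

    # Sketch of transparent compression: the MIME type stays e.g. text/html;
    # the compression is reported separately in Content-Encoding.
    req = urllib.request.Request(
        "https://example.org/",
        headers={"Accept-Encoding": "gzip"},
    )
    with urllib.request.urlopen(req) as resp:
        body = resp.read()
        if resp.headers.get("Content-Encoding") == "gzip":
            body = gzip.decompress(body)
        print(resp.headers.get("Content-Type"), len(body), "bytes after decompression")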

Connection multiplexing, which is what HTTP/2 and HTTP/3 do, only increases overhead. It belongs to the problem space, not to the solution space.

still are underlaid with the legacy JavaScript technology

The script tag in HTML allows specifying which scripting language to use, but in practice, only JavaScript is relevant, and even that is not guaranteed to work. The venerable Lynx browser simply ignores it, as explicitly allowed by the W3C specification. In theory, you could implement a VM for any scripting language in your browser, like Dart, CoffeeScript, or Lua, but nobody bothers.
Probably for good reason. Injecting random executable code (like JavaScript) into documents (like HTML) is generally considered a severe security risk. Remember the grief that Microsoft got for executable macros in Word and WMF documents. Zero-click remote code execution.

legacy layout

HTML has a few severe shortcomings when compared to more mature mark-up languages like roff or LaTeX.
For one, it does not distinguish between spaces between words and spaces between sentences, which makes text harder to read than necessary. How can you tell whether a period at the end of a word marks the end of a sentence or an abbreviation?

Then there is the ambiguity of paragraphs. XHTML allows paragraphs to be nested, HTML4 and HTML5 do not. Now, you might think that nesting paragraphs doesn't make any sense, but HTML documents are full of div tags for structure. For a long time there was no standard way of grouping a picture with a caption, for example (HTML5's figure element finally added one). Sections are separated by headers, but HTML layout engines generally don't treat the text between sections as one unit. But why should you need a tag specifically for separating paragraphs anyway, except for nesting?

HTML does have the distinction of being media agnostic. Where DVI and PDF have strict page layouts, HTML doesn't care whether it is rendered on an old black-and-white television screen, an RGB laser projector, a sheet of A4 paper, a sheet of US Letter paper, a text-to-speech system, a Braille display, or anything else. How an HTML document is rendered is inevitably and by necessity up to the user agent. And so it does not provide the typical 12-column grid layout from print media.

It is not particularly fit for the purpose of author-defined layouts. That's what PDF is for, which can also contain forms and random executable code.
And of course HTML is used for exactly that.

BBC.co.uk should not require 99+ http requests top get 7 MB of data for example.

Does it? The BBC is one of those web sites that still work in Lynx, NetSurf, and Dillo. That is not a good example of what is going wrong on the web.

The main problem is the web frameworks that are desperately trying to re-invent X11 with extra steps and more overhead.

Comment Re:What state of undress is acceptable? (Score 3, Informative) 70

It detects undressing, so I assume no amount of undressing is safe. You must leave your hat on.

On the other hand, if you start out utterly in the nude, you should be safe. To prevent your call from freezing, you should always answer in your best birthday suit.

Comment Re:KIller Whales eat people, so.... (Score 1) 73

They don't attack people. A few of them have been known to attack boats, but that seems to have been out of curiosity rather than malice. Their playing with the boats as if they were balls has damaged some severely, and sunk a few of them.

From what I gather, they seem to have stopped doing that.

Comment Re:Vibe coding was already popular 30-40 years ago (Score 1) 179

Remember when "hand wrote html" users made fun of Frontpage and Dreamweaver users?

No, I remember being confused why users would make things more difficult for themselves. HTML is just text, plain text with some mark-up here and there. Then it mutated into a collection of formal and semi-formal document formats. And now it appears to be simply the placeholder, generated by some framework or other, for a manifest for an Angular compiler.

And Visual Basic users?

I remember what Dijkstra said about Basic.
And learning to keep logic and presentation separate.

Game jams are not about clever code, but about originality in game design. Game maker tool kits are limited to established and formulaic genres, which is a typical hacker solution: Solve the problem in general so the user can focus on the particulars, like aesthetics and level design. (They have largely been replaced with game engines now.) Most hackathons focus on the code instead. Completely different problem set. Not really comparable at all.

I get your point that AI may become another layer in the dev stack, above the linker, compiler, and assembler, replacing IDEs. (Some IDEs already offer AI integration, in the vein of the ever elusive "programmer assistant".) That may be true, but this is not the first time that the idea of programming in plain natural language has been proposed, and every time it has proven impractical for the same reasons: Parsing natural language is extremely complex; and natural language is not a natural fit for formal algorithms anyway, which is why mathematicians and natural scientists have their own formal symbols. It has been said that if it became possible to program in plain English (which it arguably is now), it would turn out that programmers can't speak English. (Natural language tends to be ambiguous, even when ignoring Chomsky's division into formal and informal language modes.)

So yes, it probably will make computers more accessible to mundanes, but it won't make the job of the programmer any easier. It will almost certainly make it more difficult, with even more critical infrastructure running on bloated and obfuscated glorified shopping lists. It's not the productivity boost that incremental compilers and scripting languages were.

Comment Re:Indication of bad hackathon questions (Score 1) 179

You put that as if LLMs could write better code faster than programmers, outside of contests.

Sure, you don't bring a Lamborghini to run a marathon, because everyone already knows that cars run faster than athletes; that's why commuters are stuck in traffic for hours every day.

Similarly, computing machinery has replaced human computers, because machines are faster and more precise. And autocode/compilers have replaced machine code programmers, for the same reason.

So if you think we need a rule that prevents computers from participating in coding contests, that implies that AI can replace code monkeys now.

That would be like Formula 1 racing needing a rule excluding RoboRace pilots.

Honestly, if generative AI could produce that kind of quality, programmers would simply write up the requirements and test specifications.
No code maintenance would be needed: Just have the machine spit out the optimal code for the requirements du jour.
I don't think we are at that point yet, if ever.

Until then, if cheating gets you better results, you'd be cheating yourself by not cheating.

Comment Re:Matter of definition (Score 1) 261

That's really taken out of context.

That's fair. But they do define intelligence as a thing that makes them money.

highly autonomous systems that outperform humans at most economically valuable work

A train fits that definition. No intelligence required.

A specialised AI will always outperform an AGI. That's the whole point of machinery: to be good at one thing.

I'm not sure which will happen first, an AGI beats Stockfish or

No need to guess: AlphaZero beat Stockfish in 2017.

Comment Matter of definition (Score 1) 261

There are two conflicting definitions for AGI.

The older one is any AI that is not designed for one specific task, but can learn new tasks and transfer skills from old tasks to new tasks. We have had that for years, but that kind of AI does not use human language, and is routinely ignored in discussions about AGI. Arguably it is not AGI because it cannot do everything a human can do, but then again, neither can humans.

The Sam Altman definition is "an AI that makes me a billion dollars". The only type of AI that can possibly match that definition is Charles Stross's "slow AI", the profit-oriented corporation. And we have had that for centuries. Any other type of AI is not going to match that definition because, as Peter Thiel put it, "AI has no moat": As soon as OpenAI can do it, anyone can do it.

Submission + - USENIX sunsets Annual Technical Conference after 30 years (usenix.org)

Synonymous Homonym writes: This year's USENIX ATC will be the last, but other USENIX conferences will keep happening.

Since USENIX's inception in 1975, it has been a key gathering place for innovators in the advanced computing systems community. The early days of meetings evolved into the two annual conferences, the USENIX Summer and Winter Conferences, which in 1995 merged into the single Annual Technical Conference that has continued to evolve and serve thousands of our constituents for 30 years.

USENIX recognizes the pivotal role that USENIX ATC has played in the shaping of the Association itself as well as the lives and careers of its many attendees and members. We also realize that change is inevitable, and all good things must come to an end:

The last ATC will include both a celebration of USENIX's 50th anniversary on the evening of Monday, July 7, and a tribute to USENIX ATC on the evening of Tuesday, July 8.

Comment Half correct (Score 2) 174

Hardware is more energy efficient than ever. And software could make better use of it. In the trilemma of "good-cheap-fast, pick any two", good is always the one that doesn't get picked. That also means that correct, safe, secure, and resilient are at best afterthoughts.

Software optimisation is not the problem. Modern languages, compiled and interpreted, are doing a great job at optimising.

Monolithic designs are also not the answer; they are part of the problem. Intuitively one might expect that one big silo hiding all the complexity would be easier to optimise, but that is not the case: The complexity doesn't go away, and hiding the combinatorial explosion means you lose insight and maintainability.

Microservices are also not the answer. You want small, specialised tools that each do one thing, and do it well; and you want to build systems from those inherently parallel, scalable tools. You don't want the overhead of long-running processes communicating over the network. You want small, short-lived processes running in parallel, managed by the operating system (or a VM efficiently using the IPC primitives of the OS). You want short, fast scripts using highly optimised specialised commands.
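As a sketch of that style (the file name and word-counting task are made up for illustration, and it assumes a Unix-like system with the usual tools installed): three small, specialised, short-lived processes wired together by the operating system, rather than one long-running service.

    import subprocess

    # Three specialised tools, each doing one thing, connected by OS pipes.
    grep = subprocess.Popen(["grep", "-oE", "[A-Za-z]+", "access.log"],
                            stdout=subprocess.PIPE)
    sort = subprocess.Popen(["sort"], stdin=grep.stdout, stdout=subprocess.PIPE)
    uniq = subprocess.Popen(["uniq", "-c"], stdin=sort.stdout, stdout=subprocess.PIPE)
    grep.stdout.close()   # let grep see a broken pipe if the consumers exit early
    sort.stdout.close()

    print(uniq.communicate()[0].decode())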

These are lessons that had to be learned over and over again. Artisans have learned them (good craftsmanship), mechanical engineers have learned them (respect the humble screw), electrical engineers have learned them (sockets and breadboards), and programmers have learned them (Unix philosophy).

For all the cruft in software, hardware also has room for improvement: Currently it is more cost effective to kluge together packages (and not in a modular way) than to design an elegant machine. We get processors that implement many ISAs in one and still stall for ALUs, while it requires legislation to make the battery replaceable.

But hardware has mass and volume, software has not. (Well, in a theoretical physics sense, it does, but practically that's immeasurable.) So software expands to fill all available space and saturate every processor cycle; because empty RAM is wasted, and idling CPUs are just space heaters. And what used to be accomplished in a few kilobytes now takes gigabytes, which take longer to load than the old solutions took to compute. What used to be a few lines of text is now still text, but spread over several files using different structuring syntaxes, compressed, indexed, and served by a daemon using no structured query language.

Because if things were easy to read and easy to edit and easy to process, it wouldn't feel technical. It would feel like anyone could do it; it would feel like magic. At the same time, no thought is wasted on grokking the magic, because the IDE, the language server, and the LLM tell you how to work around the limitations that prevent you from making mistakes, saving you the trouble of understanding the theory.

Anyone can add complexity, and anyone will. Keeping things simple is the real skill.

And we don't keep things simple by ignoring complexity, one way or another.

But modular designs make innovation easier.
