Yes. We need to take a hard look at network-transparent displays in the context of what we can actually do today, as well as where things are headed.
When I did this, 10BASE-T networks were common, and just a little slow for something like CAD. 100BASE-T was growing in popularity, and then we sort of jumped straight to gigabit (1000BASE-T).
Also during that time, I started on dialup, moved to DSL, and faster links kept coming.
Know what? The fiber connection I have in my home is fast enough to run X with few worries today.
And it's only going to get better. My 4G cell phone can run X too. Amazing!
Honestly, I miss the vision our early innovators had. In a way, the field was more open then, and people could build without so many legacy ties; the need to carry all of that forward into every next step is what holds people back now. That said, the legacy "screen scrapers" (remote-desktop tools that ship rendered pixels rather than drawing commands) do deserve attention. They are useful, and they do have advantages for application developers.
Network-transparent, multi-user, concurrent, multi-processor networked computing is the bar to cross, and if we don't make the most of it, we risk losing out on a lot of the potential.
Sad, really.
All I know is that I won those arguments back then, and I did it on UNIX while the dominant move was to Windows, the PC, and all that distributed software bullshit we face today. Won solid. No fucking contest.
The difference was really understanding how things worked and applying that, instead of following the cookie-cutter approaches we see so often today.
With X, one can distribute or centralize as needed!
Fonts served from one machine, the window manager on another, the application on a third, storage on yet another, the X server itself on yet another. Or even better, how about a few displays, each capable of serving its own user?
Or, pile it all on one box somebody can carry with them!
Doesn't matter with X. It's all trade-offs, and that leaves people free to structure things however makes the best sense to them. For some, having very strong local compute / storage / graphics / I/O is best. For others, centralizing all of that pays off best.
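To make that concrete, here is a minimal Xlib sketch in C. Treat it as an illustration, not a tutorial: nothing in the program knows or cares where the display lives, and the display name can come from the command line or from $DISPLAY.

    /* Minimal Xlib client: the same binary drives a local or a remote
       display; only the display string changes.
       Build with: cc -o hello hello.c -lX11 */
    #include <X11/Xlib.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        /* NULL means "use $DISPLAY"; pass e.g. "somehost:0" (a
           hypothetical machine) to open a display elsewhere. */
        Display *dpy = XOpenDisplay(argc > 1 ? argv[1] : NULL);
        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }

        int screen = DefaultScreen(dpy);
        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, screen),
                                         10, 10, 200, 100, 1,
                                         BlackPixel(dpy, screen),
                                         WhitePixel(dpy, screen));
        XSelectInput(dpy, win, ExposureMask | KeyPressMask);
        XMapWindow(dpy, win);

        for (;;) {
            XEvent ev;
            XNextEvent(dpy, &ev);   /* events arrive over the same wire */
            if (ev.type == KeyPress)
                break;
        }

        XCloseDisplay(dpy);
        return 0;
    }

Run ./hello for the local display, or ./hello somehost:0 and the window comes up across the network. Same code path either way; only the display string differs.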
Only X does this. Nothing else does, or has.
The screen scrapers are impressive, but they really aren't multi-user in the sense that X is, and faking it takes a lot of kludges, system resources, etc. to manage.
I remember the day I read about X in BYTE. It changed how I viewed computing, and when I got my chance, I went for it whole hog and it paid off very well.
Also, IMHO, part of this vision really should be providing developers with dead-simple tools to get things done. It is true that building an efficient network-aware application takes some work. SGI, BTW, did educate people: if you developed on IRIX, you got the tools to make it all happen, and you got the education and consulting of a vendor who knew their shit cold.
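To give a taste of what that education covered (a sketch of the general principle, not SGI's actual course material): the dominant cost on a network is the round trip, so you batch requests and keep synchronous calls out of hot paths. At 20 ms of latency, 1,000 round trips is 20 seconds of waiting; done right, you pay that latency roughly once.

    /* Sketch of round-trip-aware X drawing; assumes dpy, win, and gc
       were set up as in the example above. */
    #include <X11/Xlib.h>

    /* Chatty: one blocking round trip per rectangle.  Tolerable
       locally, painful over a WAN. */
    void draw_grid_chatty(Display *dpy, Window win, GC gc)
    {
        for (int i = 0; i < 1000; i++) {
            XFillRectangle(dpy, win, gc, (i % 40) * 5, (i / 40) * 5, 4, 4);
            XSync(dpy, False);  /* blocks until the server replies */
        }
    }

    /* Network-friendly: Xlib queues the requests into a few large
       writes; one flush at the end, and nothing waits on a reply. */
    void draw_grid_batched(Display *dpy, Window win, GC gc)
    {
        for (int i = 0; i < 1000; i++)
            XFillRectangle(dpy, win, gc, (i % 40) * 5, (i / 40) * 5, 4, 4);
        XFlush(dpy);
    }

That's the sort of thing a vendor's engineers would beat into you on day one.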
Today we don't have that kind of support around X, and it's hurting development pretty badly.
Back in the '90s, I was doing video conferencing, running things all over the place on lots of machines, melding Linux, IRIX, Windows, etc. together in powerful ways, often on machines rescued from a dumpster. No joke.
We've managed to cobble that together again, but it's a far cry from what could have been, and what could still be if people thought this stuff through like it was the first time.
IMHO, the other real problem is the one I've stated: we have a whole generation of people doing this stuff now who basically have no clue! They were never properly introduced to multi-user computing, never got to experience X as intended, etc.
When I explain some of this to people, I get comments like "sounds like Star Trek," "amazing," and "wish I were there..."
Yeah. I was. Many of us here were.