Comment Re:Babies first fork! (Score 1) 826
It's not Linux, it's the runtime system, so there is no need to fork... but I think it's about time for a distribution to actually take the Unix philosophy to heart and to throw out all that new crap.
All that crap dates back to the time when people who had no idea started to believe they could create something better than UNIX.
This created the mess we had in the 1990s, when it was considered OK to keep log files in binary formats you could only view through a small, non-resizable window. A time when you could display the owners of open files on your network share... but you couldn't do anything with that information except write it down and act on it manually. Of course, that data was also displayed in a small, non-resizable window.
There is a reason why normal init is based on shell scripts, and that reason is simply that there is no reason against it. Shell scripts are perfectly adequate for the job. Binaries on Unix, however, are much harder to deal with. If you want to change a binary you have to get the source code, edit it, make it compile, and then hope it'll run. It's even worse with dynamically linked binaries, since they depend on other files, which matters particularly for init.
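As a sketch of how approachable that is: a sysvinit-style script is little more than a case statement you can read and change with any text editor. The daemon name, paths, and messages below are made up for illustration, not taken from any real distribution.

```shell
#!/bin/sh
# Sketch of a sysvinit-style script for a hypothetical daemon "exampled";
# the daemon, paths, and messages are illustrative, not a real service.
DAEMON=/usr/sbin/exampled
PIDFILE=/var/run/exampled.pid

do_start() {
    echo "Starting exampled"
    # A real script would launch the daemon here, e.g.:
    # "$DAEMON" --pidfile "$PIDFILE"
}

do_stop() {
    echo "Stopping exampled"
    # A real script would signal the running process here, e.g.:
    # kill "$(cat "$PIDFILE")"
}

case "${1:-}" in
    start)   do_start ;;
    stop)    do_stop ;;
    restart) do_stop; do_start ;;
    *)       echo "Usage: $0 {start|stop|restart}" ;;
esac
```

Changing its behaviour is a matter of editing the text; no source tree, compiler, or dynamic linker is involved.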
Binaries are just a botched solution for somehow getting faster execution. The whole design of Unix doesn't require them. Unlike on Windows, you shouldn't need to link in a library to get some API; you should instead have a little program you can call. (Actually, Windows now has a little wrapper allowing you to make arbitrary calls to DLLs, since even Microsoft has recognized the problem.)
Dynamically linked libraries are just a botched solution to the problem of library bloat. You shouldn't need them: if you want some feature, you should just call the program implementing it. That's how bc worked originally. All the calculation functionality was in dc, and bc just reformatted its input into what dc expects. The reason this doesn't work well any more is the long startup time caused by library bloat.
Well, you could fix all the problems existing in current SoCs. For example, you could build an architecture which enables multiple SoCs to boot from the same image, just like the PC does. You could have basic hardware support without binary blobs.
In essence you could create a new portable platform where you could, for example, swap out the operating system on your mobile phone just by putting another OS onto your SD card. That way, even if your vendor doesn't support your device any more, you can still get the latest version of whatever operating system you want. In fact, since it would greatly simplify making distributions, you could even get operating systems specially tailored to your needs.
I hope the people doing this understand the chance they have.
Eventually commercial companies end up like Siemens. Decisions gradually take longer and longer, so more and more engineers are needed. That means more bad engineers enter the company, so more rules need to be put in place so they won't mess up too much. That frustrates the good engineers, so they leave.
What you end up with is a company whose good engineers constantly evaporate, while you pay ridiculous amounts of money just to keep the rest. Those people then feel like they actually know something, since there are no better people left in the company to learn from and they are paid huge amounts of money. This reinforces their Dunning-Kruger effect and makes them toxic.
They don't understand how things work and therefore believe their ridiculous ideas are actually good. Those ideas cause more work and more frustration for the few good people who drift into the company.
In the end you'll have a huge number of highly paid idiots bringing out inferior products. Since there rarely is competition in the real world, the company will stay in business. Should the company actually face competition, it is, by that point, already too big to fail and will therefore be saved by the government.
Yes I'd use simple SIP video phones. There's no need to go to Skype. You can either have your own SIP server, or use one of the many SIP providers which can even give you connectivity to the public telephone network at decent prices, if you want.
Ahh sorry, haven't seen that one yet.
Being a small stockholder is meaningless; you need a controlling share, or at least some noticeable share like 10%.
It doesn't matter what the consumer wants. What matters is what operators and manufacturers want. There is no way manufacturers are going to get feedback from consumers on such complex things. All they get is sales numbers, and they have no idea why a certain product sells or not. That's why BlackBerry added colour touch screens: they don't understand what the potential of their product is. They see Apple being successful with touchscreen phones, so they also try touchscreen phones.
Of course you can always use the democratic aspect of capitalism and just buy a mobile phone company, and make them build whatever device you want.
It's an attempt to get the most "bang for the buck". Essentially you write lots of small programs with limited, well-defined functionality, then hook them up any way you like. In fact, taken to the extreme (as with Plan 9), you can do anything with simple shell scripts.
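A trivial illustration of that style: finding the most frequent line in some input takes no dedicated program at all, just small tools hooked together (the fruit names are arbitrary sample data).

```shell
# Each stage does one well-defined job; the pipe composes them.
printf 'apple\nbanana\napple\ncherry\napple\n' |
    sort |       # bring identical lines together
    uniq -c |    # count each run of identical lines
    sort -rn |   # highest count first
    head -n 1    # keep only the winner
```

This prints the top line with its count ("3 apple"); swapping any stage, say `head` for `tail`, changes the behaviour without touching the other tools.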
BTW, there are simpler development environments out there with a decent feature set, without the complexity of a C(++) toolchain. Lazarus is just one example. Of course, you then lose flexibility: Lazarus, for example, is mostly suitable for GUI applications; writing a web server with it is hard. But it does GUIs decently well, allowing one codebase to compile for everything from bog-standard Linux (GTK) through Mac OS X and Android to even exotic platforms like Win32.
That's because those devices will, like "smart"-phones, cater to the lowest common denominator. In the end you'll get a device that's hard to program, preventing "casual programming", while letting malware in via some store.
So far the closest thing I've seen to a smart watch was the HP-01.
I mean those devices are sold as "business mobiles". Yet the keyboard lacks all important keys. For example there are no modifier keys and not even an "Escape" key.
How are you supposed to use, for example, Microsoft Word on such a thing?
I mean, it's obviously foolish not to get a proper education, and at companies you typically only learn how not to do things. A formal education can give you the inspiration and the time to become a decent programmer.
However, right now there is the rare chance of a second ".com" bubble. Companies are hiring just about anybody and paying them insane amounts of money. It's like that old documentary I saw about Netscape, where they all thought they'd be great... but if you look at the actual product, you'll find it's unacceptably bad by any standard except 1990s commercial software standards.
So, if you manage to keep your standard of living low, you can milk such a company for the money. Then, when it collapses in one or two years, you can get a proper education.
Well first of all the usual stuff. It needs to be completely open source and have an open bootloader so there is a chance of security. It also needs to have rather simple code so it can be checked, as well as decent battery life.
Then there is the whole issue of user interfaces which isn't even solved for mobile phones these days. What you need is a powerful interface that works on small devices. So far the best contestant in that area seems to be the HP-01 calculator watch.
http://www.led-forever.com/htm...
It allows you to start a stop watch, and then use the result in real time to do calculations on it.
Unfortunately it seems like "smartwatch" manufacturers will go the other route, making them rather useless. Just like they already did with the idea of a "smartphone" when they turned it from something like the Nokia Communicator to something like the iPhone.
There is, apart from some clouds, nothing in between. Those are ideal conditions. Considering that even the radio links of the moon missions had a few megabits of channel capacity, that's not very much. (Yes, those links were analog, but Shannon showed that you can still express the capacity of such a channel in bits, or shannons.)
So you are still looking through a letterbox. This may be acceptable in situations where you need lots of width.
It's a typical "market research" product. People put two screens next to each other and complain about the bezel; a company notices this and makes a "double-wide" monitor.
People don't put two screens next to each other because they want a wider screen. They do so because they want a larger screen. Putting screens on top of each other is, however, rather difficult, which is why they get put next to each other.
What people actually want is a large high definition screen. Ideally with more than 2000 pixels in height. That way you can put whole designs on your screen without having to constantly scroll and zoom around. Just imagine routing a wire on a board and being able to see where you're going.