Comment: Re:Is this Google's fault? (Score 1) 307

by ThePhilips (#49627133) Attached to: Google Can't Ignore the Android Update Problem Any Longer

Then make a point to push for a model where every major X. release gets X.Y minor updates and bug fixes.

AFAIK that's how it worked, until the 4.x series.

But with the 4.x series Google broke the pattern: almost every 4.x release was a major release with an incompatible interface.

Effectively, as of Android 4.x, the Android OS is on a rolling release.

Comment: Re:Is this Google's fault? (Score 1) 307

by ThePhilips (#49625205) Attached to: Google Can't Ignore the Android Update Problem Any Longer

How can Google make the updates mandatory if they keep bumping up the H/W requirements with every release?

And in what universe does a major OS overhaul still qualify as an "update"?

Some vendors are pretty active in Android development, but they simply can't expose themselves to the risks involved in supplanting a whole OS just to fix a few bugs. Important bugs, yes. But the risk is bricking the whole device, of which Google would bear no brunt, while the manufacturers are exposed 100%.

Google's stance on updates (and lots of other things) is simply lazy, arrogant and short-sighted. I can't even start guessing why some people (especially here) keep evangelizing them so much. (Though earning a shitload of money while being lazy and arrogant might explain it.)

Comment: Re:You still need creativity. (Score 2) 396

by ThePhilips (#49620585) Attached to: The Programming Talent Myth

I have worked ~5 years in maintenance and have intentionally omitted it.

In maintenance, for it to be at all effective, the problem is even worse: you have to be twice as creative to find solutions which have no side effects yet still fit the original specification/requirements.

I have seen many results of maintenance done by "generic" outsourced staff, and on average they were not spectacular; occasionally they were outright disastrous.

Comment: You still need creativity. (Score 4, Insightful) 396

by ThePhilips (#49619691) Attached to: The Programming Talent Myth

The truth is that programming isn't a passion or a talent, says Edge, it is just a bunch of skills that can be learned.

Why do people forget the creative side of programming?

Programming is indeed a bunch of skills. But if you do not have the right mindset, inquisitive and creative, your career in programming will be full of frustration.

The only software where one doesn't really need any creativity is the software that is already written, and there is literally no work there.

P.S. Of course there is a "flip side" to the creative side of programming: "monkey" coding and testing. But for most of this work you do not even need deep programming skills. Reading and comprehending documentation fully (an ability which is again easily forgotten by the sensational headline writers) is more useful and also much under-appreciated. And then there is also the writing of tech documentation...

Comment: Re:I'm not necessarily against the idea but... (Score 1) 321

by ThePhilips (#49599635) Attached to: Mozilla Begins To Move Towards HTTPS-Only Web

HTTPS is already designed with that kind of decoupling in mind. But it wouldn't make sense to offer encryption without identity verification to the end-user, because that would make the encryption useless, so any protocol that does encryption has to do both.

I know that. That's basic AAA.

Also note that for an effective MITM attack you would need to have new certificate for which you have got the private key. There are a number of things that will make this increasingly difficult in the future, like certificate pinning, increased willingness of browsers and OS vendors to blacklist CAs, and increased monitoring for rogue certificates which makes it easier to find rogue CAs.

I think you fail to realize the scale, the proportions, of the opposition the browsers face.

It's not some script kiddies who are the threat here.

It's countries covering close to half the planet's population. They might as well simply outlaw the browsers. In fact, they already outlaw some encryption software.

I personally would still argue that the CA system is the Achilles heel of HTTPS but the situation is getting better and it's a matter of time until we get a more distributed and robust way of certificate verification.

But that's another problem: you can't make the CA distributed. CAs are "single points of failure" which are allowed to be that, based on the promise that they will work hard not to fail. Making them distributed would basically nullify that promise, making the whole CA system vulnerable. IOW, nothing changes.

Comment: Re:I'm not necessarily against the idea but... (Score 1) 321

by ThePhilips (#49595403) Attached to: Mozilla Begins To Move Towards HTTPS-Only Web

Even with identity verification, encryption is not a guarantee against a MITM.

Because the man (the one in the middle) could have hijacked the certificate.

The oft-quoted example here is China injecting JS into unencrypted traffic. They probably do not even need to hack anything to hijack the certificate: they likely already have laws which force the CA to hand over certificates legally. And once that happens, you are back at the drawing board.
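
For reference, certificate pinning (mentioned elsewhere in this thread) is the usual counter-measure to exactly this scenario. A minimal sketch in Python, assuming the client ships a known-good SHA-256 digest of the server's leaf certificate; the host name and digest below are placeholders:

    import hashlib
    import socket
    import ssl

    # Hypothetical pinned fingerprint: the known-good SHA-256 digest of the
    # server's leaf certificate, shipped with the client ahead of time.
    PINNED_SHA256 = "replace-with-known-good-hex-digest"

    def leaf_cert_sha256(host: str, port: int = 443) -> str:
        """Fetch the server's leaf certificate and return its SHA-256 digest."""
        ctx = ssl.create_default_context()
        with socket.create_connection((host, port)) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                der = tls.getpeercert(binary_form=True)  # raw DER bytes
        return hashlib.sha256(der).hexdigest()

    if leaf_cert_sha256("example.com") != PINNED_SHA256:
        # A certificate legally extracted from a coerced CA still fails here:
        # pinning replaces "any CA I trust" with "this exact certificate".
        raise RuntimeError("certificate fingerprint mismatch - possible MITM")

The flip side is that every legitimate certificate rotation now requires a client update, which is why pinning is used sparingly.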

Decoupling at least allows the two technologies (A) to be developed independently and (B) to be replaced more easily.

Comment: Re: What is wrong with SCTP and DCCP? (Score 1) 84

by ThePhilips (#49509299) Attached to: Google To Propose QUIC As IETF Standard

[...] TLS on TCP is lots slower when there is any packet loss.

And how is an (almost) stateless protocol like QUIC supposed to handle packet loss any better?

The previous write-ups about the Google protocols were all, as one, based on the premise that packet loss is a very, very rare occurrence. That's why they effectively use a stateless transport: because they assume that errors are rare. In other words, they too are very bad at handling it.
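
For context, the speed claim in the quote is usually about head-of-line blocking: over a single ordered TCP byte stream, one lost segment stalls every multiplexed stream until the retransmission arrives, while per-stream ordering stalls only the affected stream. A toy model in Python (not a protocol implementation; the packet spacing and RTT are made-up numbers):

    RTT = 100     # ms; assumed retransmission delay for the lost packet
    SPACING = 10  # ms between back-to-back packets

    # Ten packets, multiplexed round-robin over two streams A and B.
    packets = [(i, "AB"[i % 2]) for i in range(10)]
    lost = {3}  # packet 3 (stream B) is dropped, retransmitted one RTT later

    arrival = {i: i * SPACING + (RTT if i in lost else 0) for i, _ in packets}

    # TCP-style: one ordered byte stream; nothing after a hole is delivered
    # to the application until the hole is filled, stalling BOTH streams.
    deliver_tcp, ready = {}, 0
    for i, _ in packets:
        ready = max(ready, arrival[i])
        deliver_tcp[i] = ready

    # QUIC-style: ordering is per stream, so the loss on stream B leaves
    # stream A untouched.
    deliver_quic, ready_stream = {}, {"A": 0, "B": 0}
    for i, s in packets:
        ready_stream[s] = max(ready_stream[s], arrival[i])
        deliver_quic[i] = ready_stream[s]

    for i, s in packets:
        print(f"pkt {i} ({s}): tcp={deliver_tcp[i]:3d} ms  quic={deliver_quic[i]:3d} ms")

Whether that win survives a genuinely bad link, rather than the occasional single drop, is exactly the question above.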

Coming from the old days of the IPX vs. TCP debates, I remember how the IPX proponents went abruptly silent in the face of a bad network connection: IPX couldn't transfer literally anything, while TCP slowly churned through the data, allowing you to download an OS update and fix the issue. It would be hilarious (and not unexpected) if (or rather when) Google steps into the same cowpie.

Comment: Re:I want to try it (Score 1) 229

by ThePhilips (#49491591) Attached to: GNU Hurd 0.6 Released

opengl driver

3D graphics is an outlier in driver development.

But for a usable desktop, 2D graphics is sufficient. 2D is commonly supported via the GPU's ROM, and as such implementing a 2D driver isn't hard.
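
To give an idea of how dumb such a 2D target is: a minimal sketch in Python, assuming a Linux-style framebuffer at /dev/fb0 with 32-bit BGRX pixels and a hard-coded geometry (a real program would query it with the FBIOGET_VSCREENINFO ioctl):

    import mmap
    import os

    # Assumed geometry; the kind of dumb 2D target a ROM/VESA-backed
    # driver exposes.
    WIDTH, HEIGHT, BPP = 1920, 1080, 4

    fb = os.open("/dev/fb0", os.O_RDWR)
    buf = mmap.mmap(fb, WIDTH * HEIGHT * BPP)

    def put_pixel(x: int, y: int, b: int, g: int, r: int) -> None:
        off = (y * WIDTH + x) * BPP
        buf[off:off + 4] = bytes((b, g, r, 0))

    # Fill a 100x100 red square in the corner - no acceleration involved.
    for y in range(100):
        for x in range(100):
            put_pixel(x, y, 0, 0, 255)

    buf.close()
    os.close(fb)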

Even 3D in itself isn't that hard. The problem is that games require (A) lots of edge cases to be optimized and (B) a huge number of acceleration features to be implemented.

Comment: Re:Anything unique? (Score 1) 223

by ThePhilips (#49411179) Attached to: Mono 4 Released, First Version To Adopt Microsoft Code

If you insist on your definition of RAD you'll likely run into limitations (any RAD system) and be disappointed.

No, I will not be. I used Borland Delphi for 5+ years in the past and am well aware of the limitations which come with the paradigm (rigid system libraries, "there is only one true way to do it", "if there is no button for it, it's impossible", and so on). (And yes, to this day, I deeply hate Borland Delphi.)

I'm interested in RAD for a specific purpose, so to say: to show that GUI development can be as easy as writing 10-20 lines of shell, but with the bonus of having a UI which is a little bit more than a text console. And, well, to introduce some GUI into the Linux part of the product.

I don't really see the point of full RAD to be honest.

I do not look at them as a programming language or programming environment.

I see them as a tool to quickly develop and deploy a simple GUI, when the text console doesn't cut it.
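
Something like this minimal Python/Tkinter sketch is what I have in mind; the "uptime" command is just a placeholder for whatever the shell script would have done:

    import subprocess
    import tkinter as tk

    # A complete GUI tool in roughly the line count of the shell script
    # it replaces: one button that runs a command and shows its output.
    root = tk.Tk()
    root.title("quick tool")

    out = tk.Label(root, text="press the button", anchor="w", justify="left")
    out.pack(fill="x", padx=8, pady=8)

    def run():
        result = subprocess.run(["uptime"], capture_output=True, text=True)
        out.config(text=result.stdout.strip() or result.stderr.strip())

    tk.Button(root, text="Run", command=run).pack(pady=(0, 8))
    root.mainloop()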

This is now. Later is later.
