
Comment: Re:Fix the clueless summary (Score 2) 96

by janoc (#49705501) Attached to: Russian Rocket Crashes In Siberia

Argh, correction. The video is actually misleading. The "sputter" and flameout just under 2 minutes into the flight is normal staging - the first stage separating and the second stage igniting - you can even see the second stage continuing on. The video has nothing to do with the accident.

According to the official press release, the accident occurred at 497 seconds of flight, with the third stage suffering an "anomaly" at 161km of altitude. The satellite and the upper stage almost completely burned up in the atmosphere.

Comment: Fix the clueless summary (Score 1) 96

by janoc (#49705419) Attached to: Russian Rocket Crashes In Siberia

That was obviously not the 3rd stage but the 1st one that failed. The incident occurred just under 2 minutes into the flight, right about the time when the first stage burns out and the second stage ignites. The article mentions that the satellite with the third stage would have crashed somewhere in Siberia, not that the 3rd stage malfunctioned. The 3rd stage motor is the one that provides the final boost and orbital corrections and probably hadn't even ignited yet in this accident.

Learn to read and use common sense, folks!

Comment: Re:Kickstarter (Score 1) 227

That "90fps needed for no motion sickness" is just a big red herring. Sorry. We have had VR for much longer than Oculus has existed, and there were ways to have usable VR even at 30fps. Sure, 90 looks better, but things can be done with less. Moreover, motion sickness is primarily a function of content, not framerate - you can have 120fps and you will still get sick if the camera is wildly gyrating around.

And if nobody has hardware actually capable of hitting those 90fps at those resolutions, then we won't even have 5-minute gimmicks but only demos that companies show off at tradeshows on hardware that mere mortals can't afford to buy. I am not sure that is any less gimmicky ...
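To put the framerate targets in perspective, the per-frame time budget is simple arithmetic - a quick sketch in plain Python, nothing here is specific to any VR SDK:

```python
# Per-frame rendering budget at the framerates discussed above.
# Pure arithmetic - no VR SDK involved.

def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render one frame at the given framerate."""
    return 1000.0 / fps

for fps in (30, 60, 90, 120):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):5.2f} ms per frame")
```

Going from 60 to 90fps cuts the budget from roughly 16.7ms to 11.1ms per frame, while stereo rendering roughly doubles the work - which is why "just render faster" is not a trivial ask of the hardware.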

Comment: Re:Kickstarter (Score 5, Informative) 227

Let's put this stupid never-ending meme to rest, shall we?

The 9500 Kickstarter backers got their DK1 for their money. Including me - I was one of the first. They delivered what they promised in the campaign, nothing less, nothing more.

Or do you really think that the development beyond the DK1 and the massive hiring that included people like Abrash and Carmack - which brought Oculus from a 3-person startup to a large company acquired by Facebook - was actually financed by the Kickstarter money? You need to get real; those 2.5 million Kickstarter dollars were long gone by then. Yes, the Kickstarter got it off the ground, but everything else was paid for by venture capital - and Facebook. So the Kickstarter backers really don't have any reason to be unhappy about what became of their money, nor does Oculus have anything to report to them anymore.

Now whether the direction Oculus is going in meshes with the ideals of "democratization of VR", cheap VR that everyone could enjoy, etc. - that is another discussion. Personally, I am not happy with what they are doing, because instead of making VR cheap and easily accessible, it is going to be a toy for rich kids only. The minimal PC requirements are actually the least of the issues, even though they are what the lay person is most likely to run into.

The much worse problem is that their SDK is becoming a more and more proprietary, closed binary blob that pretty much requires your 3D engine to be built around it; otherwise it is a nightmare to integrate. It is pretty telling that even Unreal Engine 4 *still* doesn't have a good DK2 integration, a year after the DK2 came out - it is that complex and that intrusive to do, and their heavily threaded and pipelined engine is not a good fit for the expectations the SDK has. I am afraid that with these crazy requirements, adoption by actual content producers - game studios, application developers, etc. - is going to be minimal.

The massive effort required to re-engineer games (both the engines and the content) to support the Rift will not pay off when only a small niche will actually be able to use it. Heck, current games are barely able to consistently hit 60fps at 1080p; here we are asking for double the resolution and, if we follow the recommendations from Oculus, targeting 90-120fps. Good luck with that ... Either the Oculus games will have massively reduced visual quality compared to the "normal" versions or they will require insane hardware. Most likely both. I just don't see game studios jumping on this bandwagon on a massive scale. I am afraid that what will most likely happen is that it ends up as yet another obscure and poorly supported gizmo, like the Razer Hydra, the Vuzix glasses, the various shutter 3D glasses that were sold for PCs over the years, etc. A pity and a massively wasted opportunity, really.

As for them stopping Linux and Mac support - I think it was obvious that this was only a matter of time. The writing was on the wall ever since they released the DK2 with the two-part SDK architecture (a closed-source binary blob runtime and an open library to talk to it). The Linux and Mac SDKs were much delayed, and when the SDK finally arrived, it wasn't full-featured - e.g. the "direct" mode never arrived on Linux (even though it is possible to make something like that work, probably with fewer bugs and glitches than the horrid driver hack they do on Windows).

The Mac SDK may eventually come back, but I don't have much hope - most Mac users have laptops, and most laptops with discrete GPUs don't actually render directly to the external output but into a framebuffer of the integrated ("slow") GPU, which then sends the image out. That is exactly the architecture Oculus explicitly doesn't support. The Linux SDK is very likely dead for good, even though they won't say so. It just doesn't make commercial sense to go there; the market is small. So it will likely languish in limbo forever - not officially cancelled, but never to be seen again.

In conclusion, I am not happy with this state of affairs, but I don't regret supporting Palmer when he launched his Kickstarter. I had quite a few exchanges with him by e-mail back when he was designing what became the infamous cardboard Rift prototype. He is a good guy who had a vision, a prototype and the skill to make it happen. From that point of view, Oculus is certainly already a phenomenal success.

However, I don't see myself queuing up to buy the commercial Rift version anymore. I will rather invest my time and skills in making VR work with projectors and maybe other technologies that may yet appear. The direction they have taken the Rift in is a technological dead end, in my opinion - engineering and business running amok, without any regard for the user.

Comment: Re:Hardware/Software Systems (Score 4, Interesting) 420

by janoc (#49655103) Attached to: Ask Slashdot: Moving To an Offshore-Proof Career?

Even better, specialize.

Generic Javascript/PHP/Java/C# "trained monkey" coders are a dime a dozen and most likely available for less than you are asking, especially if the work can be done by someone overseas with 1/10th of your living expenses.

On the other hand, if you are skilled in mathematics, computer graphics (algorithms, not Photoshop!), statistics or artificial intelligence, you are going to be in high demand. These are skills that are a lot harder to find, and they command a good price. The downside is that you have to spend a lot of time learning. That doesn't mean you must spend years and top $$$ on a university degree (it does help, though!), but you will need to invest significant time.

Basically, it is pretty much the same story as basic machinists working on lathes being replaced by CNC operators and robots - you need to bring some added value to the business. The low end - the basic programming - is pretty much a commodity today, especially for large companies who can afford to offshore/outsource. You are nobody special because you know Javascript or C# today.

The other option is to work locally - there will always be a market for small businesses/consultants catering to mom & pop businesses that need a website built, an accounting or customer management system created, perhaps some reporting beyond what Excel can do. Those fish are too small for the big guys like SAP to go after, and such businesses are too small to afford a team in India/Eastern Europe to manage their systems, not to mention that it would be really impractical. It is a large market - not everyone has to (or can) work for Facebook, Google or Microsoft today.

Comment: Re:Possibly bringing high grade slicers to everyon (Score 1) 81

From what I can piece together, they want to replace the CNC system, which has been around for over 60 years, with a new setup. I'm sceptical as to why ...

Easy. They cannot control a g-code based platform because there are way too many open alternatives around already. If they build something proprietary that others will have to work with due to their market power, then they get to be a gatekeeper and collect "tax". That's all. Good old "embrace & extend" strategy again.

Now, whether this will be successful or whether it even makes sense, I don't know. It could be a major flop, with a framework/tool that essentially nobody uses, because 3D printers are far from something an average consumer needs or will buy. Even the hypothetical, non-existent plug & play, "push a single button" replicator-like machine is not going to make this a ubiquitous piece of hardware on everyone's desk - most people simply don't need it and can get their trinkets and gizmos cheaper and ready-made elsewhere. However, the software could also end up becoming a de facto standard for consumer printers if they play their hand well ...

Comment: Re:only a year? (Score 4, Interesting) 125

Unfortunately, many people think that VR didn't exist before the Oculus Rift. Which is, of course, BS. Good quality VR was available before as well, but unless you worked for a large university, the military, NASA or some large aerospace/car company, you were unlikely to encounter it due to the cost of the equipment involved. Industry-grade HMDs still cost $15000; they used to start at $40k, but I guess after the cheap Oculus became available, that price point became untenable. CAVEs and large projection setups that are commonly used for both research and training can cost millions.

The worst part is that quite a few people from these companies - Oculus, Valve and the others jumping on the VR bandwagon now - tend to ignore the decades of existing research. Most people there are businessmen and (brilliant) engineers, not researchers (with a few exceptions). They tend to massively reinvent the wheel and rediscover things that have been known for many years, because they don't know where to look for them. If they didn't, they would know that increasing framerate and decreasing input latency is not going to fix motion sickness. Sure, a laggy, smeared image in the HMD will make people sick. However, you can and will get sick even at 120fps - it depends much more on what you are rendering than on how you are rendering it. A virtual rollercoaster will make people throw up even at 4k resolution rendered at 250fps with perfect head tracking. It would look awesome, though ...

The problem is mainly the content, not the technology - the content must work with the technology's idiosyncrasies (I won't call them limitations - that implies they could be overcome, but sometimes that would require changing the laws of physics, or it would cost so much that it just isn't practical), not ignore the specificities of the medium ("let's play COD with an Oculus Rift, that will be awesome!" *BARF*) or expect that "the technology will improve" and motion sickness won't be an issue, no matter what wild camera gyrations, cool fly-throughs and slow-motion cut scenes the game designer has put in. It is the same thing as film directors having to learn how to shoot in 3D - the "film language" (how you convey your message through camera work, lighting, etc.) changes quite significantly in 3D, and not every film director was/is comfortable with going there. Even the visually stunning Avatar had issues with stereoscopy here and there. That is why the worst 3D movies were the ones converted from regular 2D, where the specificities of the medium were ignored.

This is very much where we still are at the beginning - virtual reality as an entertainment and storytelling medium. It is not a question of technology anymore; it is more about finding sensible ways to do things in VR so that the experience is fun, pleasant and something people will actually want to return to. With careful design work, working with the medium and not against it, you can render even at 30fps and nobody will get sick.

In my 15 years of working with VR (involving both large projection setups and HMDs), I have never encountered anyone getting sick because of the frame rate. It was pretty much always because of poorly made content not suitable for the technology being used, poorly implemented navigation that didn't respect the specificities of the medium (a "teleporting" camera, forced/constrained camera movement, head bobbing, poorly synchronized/unsynchronized treadmills ...), poor camera work/tracking ("mouselook" in an FPS really is wrong for VR - you don't have your head bolted rigidly to your shoulders!) and similar issues. Then you have people feeling discomfort/headaches because of eye strain due to poor focus, moiré, a poorly set up stereo rendering, etc. That often gets incorrectly attributed to "cyber"/motion sickness, but it has nothing to do with it at all. Finally, there are people who will get sick and dizzy even from looking at a static image projected on a wall - the issues can be psychological/psychosomatic as well.

To conclude, I think it is time that we step back from chasing latencies and framerates and look at the actual underlying causes. Higher framerates and better technology in general are not the silver bullet here.

Comment: Statistics and R (Score 1) 144

by janoc (#49209483) Attached to: Go R, Young Man

I think the logic is that a business professional will benefit more from what a specialized language like R can offer than from general purpose stuff. A manager is not going to code a website or an accounting database (where the general purpose languages would be useful); however, they may need some sophisticated business analyses or reports that nobody else can do for them - and R is very good for that.

On the other hand, learning R without learning (and understanding) statistics is pretty much pointless, and that is a *much* harder task than learning the language. A lot of people buy SPSS (a tool similar to R, just with a nice UI) for a lot of money, then load random data and start pressing buttons following some sort of cookbook/cheatsheet. Random numbers come out, and then they wonder why their "analysis" doesn't match reality. Then they go and hire expensive business consultants - who do the same thing while spouting jargon, only they charge a lot more for it.

R is a very powerful tool, but without a solid background in statistics and data analysis it is like handing a scalpel to a hospital nurse and declaring her a brain surgeon ...
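A classic illustration of why blindly computed summary statistics mislead: two of the datasets from Anscombe's famous quartet have practically identical means, variances and correlations, yet one is roughly linear and the other is a clean parabola. A minimal sketch in plain Python (the same point applies to R or SPSS):

```python
import statistics

# Two of the four datasets from Anscombe's quartet (Anscombe, 1973).
x  = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
y1 = [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]  # roughly linear + noise
y2 = [9.14, 8.14, 8.74, 8.77, 9.26, 8.10, 6.13, 3.10, 9.13, 7.26, 4.74]   # a clean parabola

def pearson(a, b):
    """Pearson correlation coefficient, computed by hand for portability."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    var_a = sum((u - ma) ** 2 for u in a)
    var_b = sum((v - mb) ** 2 for v in b)
    return cov / (var_a * var_b) ** 0.5

for name, y in (("y1", y1), ("y2", y2)):
    print(name,
          round(statistics.mean(y), 2),     # ~7.5 for both
          round(statistics.variance(y), 1), # ~4.1 for both
          round(pearson(x, y), 2))          # ~0.82 for both
```

The numbers that come out of the black box are the same; only actually plotting the data (or understanding the model assumptions) reveals that one dataset fits a line and the other doesn't.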

Comment: They still don't get it (Score 3, Interesting) 445

by janoc (#49182797) Attached to: Microsoft Convinced That Windows 10 Will Be Its Smartphone Breakthrough

"... and provide an experience very much like the desktop"

Which is exactly what people don't want.

Microsoft should finally pull their collective head out of their backside and stop making everything into a PC with Windows. A phone isn't a PC, it isn't used in the same way, so a "desktop experience" is very counterproductive on a phone.

One would think that they would have learned something by now ...

Comment: Re:Hashes not useful (Score 1) 324

by janoc (#49160783) Attached to: Ask Slashdot: How Does One Verify Hard Drive Firmware?

The fact that this practice is widespread in the Linux world originates from the usage of insecure FTP mirrors run by volunteer admins. There it's possible for a mirror to get hacked independently of the origin web page.

Sorry, but that isn't how it works. The role of the MD5/SHA1 hash on the website is not security but the ability to quickly check whether the download was corrupted in transfer, so that you don't waste time burning a corrupted ISO image, for example.

The real security feature is the cryptographic signatures inside the packages themselves. Both the RPM and DEB formats allow the use of these, and most Linux distros use them. There is both a hash and a crypto signature to check that the package comes from who it claims to come from and that it wasn't tampered with.
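The difference is easy to see in practice. A checksum only tells you that the bytes you got match the bytes the site advertised - a quick integrity sketch in plain Python (the "download" here is stand-in data, not a real file):

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Hex digest of the downloaded bytes - detects corruption, not tampering."""
    return hashlib.sha256(data).hexdigest()

# Stand-in for a downloaded ISO: if the checksum matches the published one,
# the transfer wasn't corrupted. It says nothing about who produced the file -
# an attacker who replaced the image on a mirror can publish a matching
# checksum right next to it.
downloaded = b"hello"
published  = "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824"
print("transfer OK" if sha256_of(downloaded) == published else "corrupted download")
```

Verifying the *origin* requires a signature checked against a key you already trust - e.g. `rpm --checksig` on an RPM, or APT's signed Release files - which is exactly the two-part scheme described above.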

Comment: Classic DRM flaw ... (Score 2) 215

by janoc (#49048065) Attached to: New Encryption Method Fights Reverse Engineering

As this, by definition, requires that the encryption key be present in the clear on the machine where the decryption happens (the CPU cannot execute encrypted code), it can be trivially circumvented. Finding where the key is stashed is only a matter of time, and then the encrypted code can be conveniently decrypted off-line, repackaged without the stupid performance-impeding encryption (caching will suffer badly with it) and released on a torrent somewhere, as always ...

Fundamentally, this is no different from doing ROT13 on your code - it is code obfuscation.
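A toy sketch of why this class of scheme can't hold: whatever "decrypts" the code must ship with the program, so an attacker simply runs the same step off-line. ROT13 stands in for the real cipher here; the snippet is purely illustrative, not anyone's actual scheme:

```python
import codecs

# The 'protected' program ships both the encrypted payload and the means
# to decrypt it (here ROT13 stands in for a real cipher plus its key).
secret_source = "print('hello from protected code')"
shipped_blob = codecs.encode(secret_source, "rot13")

# What the legitimate loader does at runtime...
runtime_code = codecs.decode(shipped_blob, "rot13")
exec(runtime_code)

# ...is exactly what an attacker does once, off-line, to repackage the
# program with the encryption stripped out entirely.
recovered = codecs.decode(shipped_blob, "rot13")
assert recovered == secret_source
```

Swapping ROT13 for AES changes nothing structurally: the key and the decryption routine still have to be on the target machine for the CPU to ever see plaintext instructions.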

Comment: Re:How can someone think that this is a good idea (Score 2) 157

by janoc (#49006709) Attached to: Automakers Move Toward OTA Software Upgrades

Having cars reflashed at a dealership is something different - the mechanic will usually do at least some basic sanity tests that everything works before handing it over to the client.

Anyway, my point wasn't that reflashing firmware is bad - it may be even required and I am fine with that. It needs to be done safely and securely, though!

And yes, Toyota had a big software problem too, even though that wasn't why they lost the accelerator pedal lawsuit.

Comment: How exactly is this news ... (Score 2) 83

In particular, BMW has a history of similar cockups - just search YouTube for the various "iDrive problems", "check engine reset" issues, "engine stalling" issues, etc. Those software problems go back years. The first iDrive implementation from 2002, built on Windows CE, was a legendary lemon.

It isn't just BMW, though -

I had a Renault Clio, and Renault's unreliable electronics are legendary too, even though there it was more a matter of poor design than necessarily bad code. But you will never know - nobody has seen the source code of the firmware in many of the control units. Often not even the manufacturer has it - it is outsourced and subcontracted, even for critical systems like the ABS or the ECU.

And I am pretty sure that this is an industry-wide problem - the same control units are in many cars, especially today with all those shared platforms and alliances between manufacturers.

If someone is thinking about drive-by-wire cars (Nissan uses a safety clutch to be legal atm, but they have publicly announced a push to go fully by-wire) or the recent idea of OTA updates in this sort of cesspit of horrid and unaccountable code, they must be insane.
