
Comment Regular Online Shopping via MiniTel predates that (Score 1) 47

Others have mentioned it already - because it's so obvious - but allow me to chime in: MiniTel was deployed nationwide in France from the early eighties. That includes official public MiniTel booths, free home terminals with both a keyboard and a screen, billboard ad campaigns for MiniTel services and e-commerce, and even a large, widespread market for cybersex.

Comment What scares me here (Score 4, Insightful) 33

is that reading and exploiting data that's a mere 25 years old requires almost archaeological recovery and reconstruction techniques. Compare that to a thousand-year-old book, which is usually still perfectly readable today.

I think modern society is on a scary path towards massive amnesia in the not-so-long term...

Comment Re:If we're going systemd, we should go full throt (Score 2) 721

I disagree on the "full-throttle" part. That'd be fine on consumer desktops. But Linux is mostly about production servers. Yes, yes... I know... mainstream Linux on the desktop is "just around the corner" and all that. :)

The question here is: what are server hosts these days? They are either embedded/integrated or virtualised. No one screws around (literally) with hardware anymore - not in a time where SoC PCs cost less than $10 a pop. So there is no need to fumble with init at that level. I haven't touched init or even runlevels for just about 10 years now - and I do lots of server stuff.

These days I'm running all my services in VirtualBox, copying, booting and ditching entire VMs at my whim. Fiddling with init would be a waste of time.
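The clone-boot-ditch workflow described above can be sketched with VirtualBox's `VBoxManage` CLI - a rough outline, with all VM names made up for the example:

```shell
# Clone a prepared base VM into a disposable instance
# ("base-server" and "web-01" are hypothetical names).
VBoxManage clonevm "base-server" --name "web-01" --register

# Boot it headless and use it...
VBoxManage startvm "web-01" --type headless

# ...then ditch it when done: power off, unregister, delete its disks.
VBoxManage controlvm "web-01" poweroff
VBoxManage unregistervm "web-01" --delete
```

The point being: the whole server lifecycle happens above the init system, so the choice of init on the host barely matters.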

If you have a stripped-down server config that you have to distribute and scale, I doubt systemd will seriously get in your way. Yes, you have to hook your init stuff in somewhere, and yes, you'll have to read up on how systemd does things at this level, but on a dedicated server that might as well happen in userspace or somewhere late in systemd's boot. I'm sure systemd offers hooks for quick late-boot custom fiddling of some sort.
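For what it's worth, the hook guessed at above does exist: a oneshot service unit ordered late in boot. A minimal sketch - the unit name, targets and script path here are assumptions for illustration, not a recipe:

```ini
[Unit]
Description=Site-specific late-boot setup (hypothetical example)
After=network-online.target

[Service]
Type=oneshot
# Any custom script; this path is made up for the example.
ExecStart=/usr/local/bin/site-setup.sh

[Install]
WantedBy=multi-user.target
```

Dropped into /etc/systemd/system/ and enabled with `systemctl enable`, that's about as much "fiddling" as a dedicated server box should ever need.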

Bottom line:
If all the Linux proggers finally get on the same page, I'm all for it. If that page happens to be systemd, so be it. The benefits of everyone getting behind systemd alone will move Linux forward faster than ever - that's my newest prediction, anyway.

Comment If we're going systemd, we should go full throttle (Score 4, Interesting) 721

As we've all learned from Apple: no half-assed shit. Do or don't do. No place for in-between stuff.

systemd has downsides, but it also has upsides. We should stick with the upsides and patch the downsides until they're basically a non-issue.

I don't do much init fiddling, although I do like the text-based init/runlevel thing, and I would guess that plug-and-play - one of systemd's strong areas - should be a userspace problem. But that's just me not really knowing what's going on in the init process.

However, since all major distros have moved to systemd, it can't be as bad as some people make it out to be. I trust the Debian and Ubuntu crews to know what they are doing at init level.

If as a result the Linux community grows closer together and focuses more on consistency, I'm all for the move to systemd - even if that moves Linux away from the rest of the Unixes due to loss of POSIX compliance. Seriously, who cares? It's mostly Linux left, right and center these days anyway. The BSD people will be fine with whatever they choose as an init process, as usual, and no one gives a damn about other non-FOSS Unixes anymore anyway. Unix basically is Linux these days.

But, as I said at the beginning: there is no use going systemd only halfway. If the community gets behind systemd, it works, it is or becomes usable, and apps start relying on it being there - so what?

My 0.5 eurocents.

Comment Terrorism is a fallback solution. (Score 1) 490

I'd say it has more to do with being a male than with being an engineer.
The trait that makes a person a terrorist is more primal than the "engineer type".

Basically, terrorism is a fallback solution to changing/improving the world to fit your needs/desires.
Which is what many male humans and thus male engineers would want to do.

Tech experts are also prone to being smarter than average, narrow-minded, misunderstood and socially excluded by the people around them.
This in turn leads to frustration. And I'd say roughly 80% of all wars and conflicts trace back to simple male sexual frustration. Terrorism too.

Take a smart, outcast male youngster, and yes, he is indeed much closer to becoming a leader, innovator, bum, philosopher or, if the circumstances are right, a gun rampager or terrorist than regular people are.

I know that I am closer to being a warrior, leader or bum than a 'regular guy'. There is less of an in-between for me.

Let's keep in mind that the difference between terrorism and war is basically just the number of people you kill and the amount of comrades and long-term planning involved.
Looking at terrorism and the technical requirements for effective terrorism, these stats are no real surprise.

Comment Windows 2000 was my last version. Here's why: (Score 2, Interesting) 356

The same has been said by many people about Vista, Windows 7, Windows 8 and Windows 8.1. The truth is, at the end of the day, whenever MS has a screwup, small or large, the open-source crowd is so divided among itself that it can never seize the opportunity.

The last Windows I used was Windows 2000. They crossed the line with me with forced registration and remote-disabling "features".
Why anyone would use an OS that has this for anything mission-critical is totally beyond me.

In my book, Windows is a toy, an elaborate gaming BIOS, and the only reason to use it is if you're into frontline hardcore PC gaming or need a professional application that only runs on Windows - such as SolidWorks for engineers or something like that.

I've been riding Mac OS X since 2004 - for professional Flash development back then - and x86 Linux since 1999. Nowadays there is absolutely nothing, aside from perhaps some neat Photoshop plugins, that Linux and FOSS can't offer for my professional work (dev, software architect and consultant). I expect that to improve even more with GIMP 3.0. I've got no incentive to replace my broken Mac Mini now - although HW & SW integration at Apple is still top-of-the-line.

However, I *do* still use an MS product: the last iteration of the Xbox 360. The system is mature to the marrow and has dirt-cheap top game titles out of the bargain bin. Just added Diablo 3 to my collection for 20 euros last Saturday. Neat.

I hope Windows, the abysmally shitty Outlook groupware, Exchange, MS Office and all those ancient crappy MS monsters die in a fire and/or get squished by Google and Chrome OS like a bug. They would deserve it.

Google has its users on a leash too, but at least I get all their stuff and services for free, and they have an interest in keeping everything running and synced across my devices no matter what.
Which is why I recommend Chrome OS over Windows whenever a n00b asks me for advice.

My 2 cents.

Comment Not the first full recovery from space (Score 1) 121

SpaceShipOne touched space, and all elements were recovered and flew to space again.

BO's demonstration is more publicity than practical rocketry. It doesn't look like the aerodynamic elements of BO's current rocket are suitable for recovery after orbital injection - only after a straight up-and-down space tourism flight with no potential for orbit, just like SpaceShipOne (and Two). They can't put an object in space and have it stay in orbit. They can just take dudes up for a short, expensive view and a little time in zero gee.

It's going to be real history when SpaceX recovers a first stage after an orbital injection, because that will completely change the economics of getting to space and staying there.

Comment Yes, but they have to change their perspective. (Score 1) 169

I'm your Type-A '80s computer kid turned web dev in 2000. The line between a stable long-term occupation and freelancing has been blurry ever since. This comes with the profession and the times we live in.

I've been active in the industry for 15 years and now call myself a "Consultant & Software Architect" for FOSS and non-trivial web applications (a flashy name is required to be taken seriously as a senior). The software we use at my current employer is matured FOSS; most of the coding is done already. 15-20% of the work consists of slapping various pieces together into a whole project, adjusting prefab web designs with some CSS and jQuery hacks on the side, maintaining the deployment pipeline, doing a little helpdesk, patching IT, etc. The rest is office, partner and customer politics, writing important-sounding requirements analyses and covering the company's ass on the technical side when we prepare to take on a deal.

If I insisted on only doing coding, I'd be one of the freelancers we hire to do the work for a few weeks, two or three times a year. One is a freelance web guy; the other is a student who's good at Bootstrap and WordPress, is more into politics, and probably has other long-term plans than staying in web dev.

Since I'm important for deals and revenue, I've got a part-time fixed position. Which is just the right fit for me and the company.

If everything goes right, our jobs, like most others, will mostly be done by robots/software by the time we retire. Software is eating the world.
It's called progress, and you should prepare for it.

Comment SQL has no place in application persistence. (Score 2) 193

I always wonder why people - even professionals (albeit only the non-DB pros) - think SQL is a feasible means for an application to handle persistence. It isn't. In fact, it's a huge smelly turd for app persistence, and using it so broadly for this sort of work is a really harebrained and abysmally stupid idea.

That we have to deal with SQL injection problems is just one of the countless pieces of crap that stem from this technology decision.
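To illustrate the injection problem, here's a minimal sketch using Python's built-in sqlite3 module - the table, column names and input are all made up for the demo. A query built by string interpolation can be rewritten by its input; a parameterized query treats the same input purely as data:

```python
import sqlite3

# Hypothetical in-memory demo database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

# Classic injection payload: closes the string literal and adds a
# tautology, so the WHERE clause matches every row.
user_input = "alice' OR '1'='1"

# Vulnerable: the input becomes part of the query text itself.
query = f"SELECT role FROM users WHERE name = '{user_input}'"
print(conn.execute(query).fetchall())  # returns [('admin',)] despite the bogus name

# Safe: a parameterized query binds the input as a plain value.
rows = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()
print(rows)  # [] - no user is literally named "alice' OR '1'='1"
```

The fix has been known forever, which rather supports the parent's point: the problem only exists because apps pass strings in a query *language* around in the first place.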

SQL was meant as an end-user interface for interacting with relational databases - and for that it is absolutely perfect. End of story.

Using SQL as an intermediate for application persistence is one of the most annoying and stupidest things in the history of application development - for reasons too countless to list. DB designers are among those who time and time again shake their heads in disbelief when they see the mess devs make with SQL.
