
Comment Re:ask yourself *why* and do the right thing (Score 1) 294

Totally agree with this! Best is to introduce a classification of patches, e.g. "minor", "significant", "major" and "emergency". Get the "minor" patches pre-approved (those with minor risk and impact; what counts as "minor" to be defined in agreement with the CAB). Other patches, like service packs (significant?) and OS upgrades (major), should really go through a CAB, even if it is just to inform the other IT staff about what you are doing (and to give them a chance to point out that application X or Y can break). Finally, also agree up front who you can call at night to get carte blanche when an emergency patch needs to be deployed to fix some 0-day exploit.
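A minimal sketch of such a classification table (the class names, rules and examples here are just an illustration; the real list is whatever you agree with your CAB):

```python
# Illustrative patch classification; every name and rule below is an
# assumption to be negotiated with your own CAB, not a standard.
PATCH_CLASSES = {
    "minor":       {"cab_approval": False, "example": "routine security updates"},
    "significant": {"cab_approval": True,  "example": "service packs"},
    "major":       {"cab_approval": True,  "example": "OS upgrades"},
    "emergency":   {"cab_approval": False, "example": "0-day fix, call the on-call approver"},
}

def needs_cab(patch_class):
    """Return True when this class of change must pass the CAB first."""
    return PATCH_CLASSES[patch_class]["cab_approval"]

print(needs_cab("minor"))  # False: pre-approved
print(needs_cab("major"))  # True: goes to the CAB
```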

Comment Don't go there... (Score 1) 383

First ask yourself what the priority of your CEO is. Does he really care about the IT department, or does he see it merely as a cost rather than as something that adds value to the company? My experience is that in manufacturing it is mostly seen as a cost (as opposed to, e.g., the financial services vertical). Furthermore, have real IT problems been exposed to senior management, e.g. a big loss of data that caused the entire factory to come to a grinding halt? Or do you guys do a good job and keep everything running (with the necessary overtime, stress, cursing, ...) with just the occasional failure that never even surfaces because it has hardly any financial impact?

Unless you can convince the CEO that either his business will grow because of an additional FTE in IT, or that a huge risk of production loss (= financial loss) can be mitigated with additional staff, your chances of convincing him are nearly zero. I suppose your IT manager has gone through this exercise a few times as well...

Tip of the day: don't tell the CEO about the nitty-gritty techy projects you want to do, like "creating the solid infrastructure you want to have". They couldn't care less about what you want; they care only about the benefit for the company (which, once the yearly numbers have been published, turns into a benefit for them personally).

Comment Electricity consumption -- where does it go? (Score 2) 348

Disclaimer: no expert in this area! I remember hearing that for electricity generating companies, highway lighting was one way of consuming the excess production at night (a nuclear power plant does not have a big red lever to dial generation down overnight). Where will this electricity go now, just into the earth (all unused electricity is wasted!)? And who will pay for it: the UK consumers, who will see a rise in their electricity bills for more electricity wasted at night?

Comment Re:Private cloud (Score 4, Insightful) 141

And I don't understand why you get insightful for your comment :). There is a big difference between a traditional approach to IT, which involves fileservers, SAN, mailboxes, ... and a "private cloud" approach. What most techies do not comprehend is that cloud computing is not a technology but *a delivery model* for ICT services. Any existing service can be wrapped in a cloud coating if it is delivered in a different way, one that adheres to the fundamental characteristics of cloud computing (see for example the NIST definition). That is: you need to deliver your service anywhere, anytime, from any device (ubiquitous access); it needs to be in a self-service form; it needs to scale elastically (without waiting weeks for new servers to be delivered, ...); etc. Those are service characteristics that in the end will of course use technologies such as a SAN or fileserver or mailserver to deliver the service. It's just one logical layer above the technological layer. People who claim that cloud computing is "old stuff" have not understood what cloud computing is about.

Comment Bad choice of names? (Score 5, Informative) 160

For those interested, the preprint of the Nature article can be found at:

However, I don't really see what the fuss is about. What they are in fact demonstrating is a relationship between conditional von Neumann entropies, which they claim is a measure of "uncertainty" (it is, in one specific meaning of the word). However, there is a difference between von Neumann entropy and the variance of a physical observable as used in the Heisenberg uncertainty principle. On the other hand, if you label a physical quantity such as entropy "uncertainty" and demonstrate a relationship between those entropies, then you can indeed call that an "uncertainty relation", but that's just a cheap way of attracting attention.

Also, I am not sure it is possible to obtain the Heisenberg uncertainty relation from their equation. I would expect that, for example by entering pure, disentangled states into their equation, Heisenberg should be recoverable (because of course Heisenberg also applies to pure states). I don't immediately see how that can happen, since the von Neumann entropy of a pure state is zero. Perhaps I am just missing something and perhaps my QM is a bit rusty :).
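To make the pure-state point concrete, here is a tiny sketch of the von Neumann entropy computed from a density matrix's eigenvalues (just the textbook definition, nothing from the paper itself):

```python
import math

def von_neumann_entropy(eigenvalues):
    """S = -sum_i p_i ln p_i over the density-matrix spectrum (0 ln 0 := 0)."""
    s = 0.0
    for p in eigenvalues:
        if p > 0:
            s -= p * math.log(p)
    return s

# A pure state has spectrum (1, 0, ...): its von Neumann entropy is zero,
# even though the Heisenberg variances of non-commuting observables stay finite.
print(von_neumann_entropy([1.0, 0.0]))   # 0.0
# A maximally mixed qubit, spectrum (1/2, 1/2): entropy ln 2.
print(von_neumann_entropy([0.5, 0.5]))   # ln 2
```

This is exactly the tension in the comment: an entropic "uncertainty" vanishes on every pure state, while Heisenberg's variance-based bound does not.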

Comment Re:Gartner is shilling (Score 1) 1213

XP extended support will end in April 2014. I've been guiding companies through Windows 7 migrations and I can tell you that for some of them (the ones that are really not well organized), 2014 will be a very hard date to hit. Yes, migrating 10,000 PCs (which is still nothing compared to the bigger companies of this world) can take over four years if you want to do it properly. Then again, testing 1,500 applications for compatibility, redeveloping your own messy internal applications, setting up a distribution platform for OS and software... all of that takes time! So no, it's not just a matter of XP still being supported; it's a matter of planning ahead. That is precisely why Gartner is telling you to urgently start thinking about it.
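A back-of-envelope version of that timeline argument (every throughput number below is an illustrative assumption, not data from any real migration):

```python
# Assumed figures: 1500 apps to test, 10,000 PCs to migrate, and
# hypothetical team throughputs. Only the order of magnitude matters.
apps_to_test = 1500
apps_tested_per_week = 15        # assumed packaging/testing throughput
pcs_to_migrate = 10_000
pcs_migrated_per_week = 150      # assumed rollout rate once apps are cleared

test_weeks = apps_to_test / apps_tested_per_week        # 100 weeks
rollout_weeks = pcs_to_migrate / pcs_migrated_per_week  # ~67 weeks

print(f"app testing: ~{test_weeks / 52:.1f} years")
print(f"rollout:     ~{rollout_weeks / 52:.1f} years")
```

Even if testing and rollout partially overlap, you land in multi-year territory, which is why starting "sometime before 2014" is already late.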

Comment Re:Comments from Lubos Motl (Score 1) 650

From his blogpost: "At the level of the dimensional analysis, it had to work, of course. However, all the detailed justifications, special qualitative assumptions, and numerical factors seem to be either unjustified or downright wrong which creates some doubts about the chance to make this argument serious."

That's a bit stronger than his polite remark "I remain undecided" at the end ;).

I quit reading at the point where the original author claims that the entropy of an open system is given by Boltzmann's formula S = ln W. That is simply incorrect: the derivation of this formula clearly assumes a closed system (not somebody pulling on a polymer in a heat bath, which is an open system). For open systems there are numerous other ways of defining entropy (all of which, by the way, you can derive from the Boltzmann entropy by modelling the heat bath explicitly).
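The closed-system assumption can be made concrete with a toy microstate count (units chosen so that k_B = 1; the spin example is just an illustration):

```python
import math

def boltzmann_entropy(W, k_B=1.0):
    """S = k_B ln W, where W counts the microstates of an *isolated*
    system at fixed energy. This is where the derivation lives; it does
    not apply as-is to a system exchanging energy with a heat bath."""
    return k_B * math.log(W)

# Toy isolated system: 10 two-level spins with exactly 5 excited.
N, n = 10, 5
W = math.comb(N, n)          # number of microstates = C(10, 5) = 252
S = boltzmann_entropy(W)
print(W, S)

# For an open system coupled to a bath one instead derives, e.g., the
# Gibbs entropy -k_B * sum_i p_i ln p_i by applying the formula above
# to the combined (closed) system-plus-bath and tracing out the bath.
```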

There might be some merit in turning the reasoning of most physicists around: from Boltzmann until today, most physicists have tried to explain entropy and the second law of thermodynamics as originating from a more microscopic description (and there are some very convincing arguments for this; see J. Bricmont et al.). However, I doubt that by mixing concepts from information theory and statistical mechanics, and then turning everything upside down, you'll get to a point where you discover the new all-encompassing force that underlies everything ("information"??).

Comment CapEx vs OpEx (Score 5, Insightful) 349

Don't forget that the biggest cost of a client is not necessarily purchasing the hardware (which is obviously the most visible cost). Various studies (Gartner, IDC, ...) indicate that a PC purchased for $500 (one-time cost) in fact costs somewhere between $1500 and $4500 per year (!) to manage. These hidden costs sit mainly in the backend infrastructure supporting these PCs in corporate environments, the people managing them, deploying software on them, ... Google for "desktop TCO" and you'll find plenty of information. Sure, you might disagree with the exact numbers from Gartner/IDC/Forrester, but at least they give an indication.

For thin clients (and desktop virtualization, for that matter), this is also where the cost savings are. No serious VDI vendor will tell you that the CapEx (investment in hardware, licenses, ...) is lower with thin clients and virtual desktops: you need to buy additional licenses, you're going to run desktops on server hardware (ok, 100 at a time on the same box), and I haven't even started on the licensing galore (Microsoft VECD, Citrix XenDesktop or VMware View or...). The real cost savings come from the environment being much easier to manage, and from letting your very expensive system administrators do something other than troubleshooting a desktop (which costs you twice: once for the end-user downtime and once for the sysadmin's time).

The same goes for thin clients: the up-front investment is larger, but they are very easy to manage (plug one into the network and the thing autoconfigures itself, pointing you to your virtual desktop, which means fewer expensive on-site sysadmin interventions for replacing hardware!), and they live longer than traditional desktops. Desktops used to have three-year lifecycles whereas thin clients typically have a five-year lifecycle; roughly speaking, you'll need to buy two traditional desktops for every thin client over a 5-year span. I'll concede that with the current economic situation you'll see prolonged lifetimes for both thin clients and desktops, but the idea remains the same even if the numbers differ today.
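A rough sketch of that 5-year comparison (all figures below are illustrative assumptions in the spirit of the Gartner/IDC ranges mentioned above, not real quotes):

```python
import math

def five_year_cost(unit_price, lifecycle_years, yearly_mgmt_cost, horizon=5):
    """Hardware replacements over the horizon plus yearly management cost."""
    units_needed = math.ceil(horizon / lifecycle_years)
    return units_needed * unit_price + horizon * yearly_mgmt_cost

# Assumed: $500 desktop on a 3-year cycle vs. $350 thin client on a
# 5-year cycle, with a lower assumed management cost for the thin client.
fat_client = five_year_cost(unit_price=500, lifecycle_years=3, yearly_mgmt_cost=1500)
thin_client = five_year_cost(unit_price=350, lifecycle_years=5, yearly_mgmt_cost=900)
print(fat_client, thin_client)  # 8500 4850
```

The point of the sketch: the management term dominates the hardware term, so the higher thin-client sticker price barely matters in the total.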

So is the thin client cheaper? In most situations, looking at the total picture, it is, even despite the higher up-front investment. The real question is not the price of a thin client but whether your applications and IT environment support thin clients/server-based computing (TS/Citrix/VDI).

Sidenote: I work for a consulting firm where I do a lot of work with VDI and Server Based Computing in general; we strive to be as independent as possible (nuancing vendor claims as much as we can for our clients), but that may mean I am a bit biased towards using SBC where it works ;)

Comment OPTIONAL is the keyword here (Score 2, Informative) 413

Notice that you are not forced to use XP Mode; in fact, early reports mention that you have to install it explicitly as an add-on. This means companies have the CHOICE either to go for a full Windows 7 compatibility track (yes, they should) OR to support two operating systems until a legacy application fades out.

This is just Microsoft trying to take application compatibility away as an argument against Win7 migrations, so that IT admins are not forced to implement dreaded MED-V-like solutions, Terminal Services, Remote Desktop to an XP box, or VDI, just to keep that darn ol' app running. Those approaches also require maintaining multiple operating systems, and in fact just as many as there are instances of incompatible apps.

Comment Re:Not very realistic (Score 1) 276

Samba 4 is not really production ready yet. That is why it is labeled as an alpha version.

I acknowledge your point on Samba 4 not being production ready. I was merely using the example as an indication of "core functionality" that appeared to be missing.

I have seen countless problems restoring AD after a DC failure. I created a mock scenario with a Samba 4 DC wherein the entire database was wiped. I simply used Samba's own LDB toolset and had it up and running again in seconds.

Glad to see that they are providing a toolset for this. I do wonder how FSMO role recovery, global catalog recovery and GPO recovery will be done; I hope with the same ease, and especially in a fully Microsoft-supported manner.

However... the trickiest part of an AD disaster recovery (as you know, since you speak from experience) is not getting the database running again, but verifying its integrity. Again, I wonder whether tools similar to NTDSUTIL will be ported to a Samba equivalent.

You're missing the point. It isn't about cost at all. The point of having an open source replacement for AD is to make it easier for software developers to take advantage of the largely undocumented protocols. This is designed to facilitate interoperability. Even Microsoft, in the light of the anti-trust lawsuit it lost, extended an olive branch to the Samba team by assisting with documentation. Plus, the work that Samba does stands to benefit Microsoft as well, because they might see where the Samba team has had some really good ideas and legally incorporate them into mainstream AD.

+1 Karma score for being the first to provide a good answer to the reason for Samba 4's existence ;). Yet I do wonder where the lack of documentation was. Direct interfacing with AD is done through LDAP, which is documented as a standard, or through ADSI, which IMHO (from limited developer experience in the past) is also decently documented on MSDN/TechNet. Microsoft documents its own extensions to LDAP in a whitepaper. On top of that, messing around with replication, sites, FSMO master roles or other low-level Directory Services parameters in the Configuration naming context of a forest is something I wouldn't recommend anyway.
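To illustrate how standard that LDAP interfacing is: building an AD user lookup is just an RFC 4515 filter string that any compliant client library can send. A minimal sketch (the account name is hypothetical; only the filter syntax and the `sAMAccountName`/`objectClass` attribute names come from the standard AD schema):

```python
def escape_ldap(value):
    """Escape the characters that RFC 4515 reserves in filter strings."""
    for ch, repl in (("\\", r"\5c"), ("*", r"\2a"),
                     ("(", r"\28"), (")", r"\29"), ("\0", r"\00")):
        value = value.replace(ch, repl)
    return value

def user_filter(sam_account_name):
    """Standard LDAP filter matching one AD user account."""
    return f"(&(objectClass=user)(sAMAccountName={escape_ldap(sam_account_name)}))"

print(user_filter("jdoe"))  # (&(objectClass=user)(sAMAccountName=jdoe))
```

Nothing here is Samba- or Microsoft-specific, which is exactly the point: the day-to-day developer surface of AD is plain, documented LDAP.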

I do agree that it can be a good trigger to force Microsoft to document parts of AD that are scarcely documented (garbage collection, tombstone processing, ...). And, before you express such confidence, I would try using Samba 4 myself; some parts of the code are very mature and work well.
I have used Samba 3 in the past and was very pleased with its stability. It is a very decent product, and I believe Samba 3 adds real value in providing (a limited form of) Windows Domain services.

For me, the step up to an Active Directory environment is merely an academic exercise to study the Microsoft closed-source internals in more detail. Interesting, yet of little practical value in a commercial or educational environment (given the low costs... which brings us back to that).

Comment Re:Not very realistic (Score 1) 276

I was not talking about the stability of Samba. I have used version 3 for quite some time and I was very happy with its stability. I am simply questioning the motives for attempting to duplicate a product when the original is cheaper to implement, has direct support from the original manufacturer and has nine years of development maturity behind it.

BTW: who are the Samba team targeting as users, then? I cannot imagine many home users need an Active Directory environment, so naturally they would be targeting small businesses, where Microsoft also has a competitive offering with SBS & EBS.
