
Comment Re:clearly the truckers are right (Score 1) 331

I just wrote some grammars for a language, so I might be a bit sensitive to the way it is set up. I agree that the truckers are wrong. Looking at it from a lexical/parser perspective, we have the sentence:

The canning, processing, preserving, freezing, drying, marketing, storing, packing for shipment or distribution of:

With the lexical tokens:

The (1 - article) | canning (2 - identifier) | , (3 - separator) | processing (2) | , (3) | preserving (2) | , (3) | freezing (2) | , (3) | drying (2) | , (3) | marketing (2) | , (3) | storing (2) | , (3) | packing for shipment (2) | or (4 - operator) | distribution of (2)

In English this would be a logical list. And that is usually set up as:

logicalList : identifierList OR_OPERATOR identifier
            | identifierList AND_OPERATOR identifier

identifierList : identifier
               | identifierList COMMA identifier

Long story short, the OR in the wording above is the operator that tells the parser (human) what type of list it is. In this case the OR operator indicates that the list means one of the items in it. If the last identifier were 'packing for shipment or distribution of' then there would be no way to say what type of list it is.
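For illustration, here's a minimal sketch of that "the operator tells you the list type" idea in C. The token array and the names are invented for this example:

```c
#include <string.h>

/* Classify a flat list of tokens by the operator that appears just before
 * the final identifier -- the same job "or" does in the statute's wording.
 * Everything here is a made-up sketch, not a real parser. */
typedef enum { LIST_UNKNOWN, LIST_ONE_OF, LIST_ALL_OF } list_type;

static list_type classify_list(const char *tokens[], int count) {
    if (count < 2)
        return LIST_UNKNOWN;
    const char *op = tokens[count - 2];   /* operator sits before the last item */
    if (strcmp(op, "or") == 0)
        return LIST_ONE_OF;               /* pick one of the items */
    if (strcmp(op, "and") == 0)
        return LIST_ALL_OF;               /* all of the items together */
    return LIST_UNKNOWN;                  /* no operator: ambiguous */
}
```

Without that trailing operator token, the function has no way to decide, which is exactly the ambiguity argued about above.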

I also think it's bullshit that people can get forced to work unpaid overtime. That includes C-level executives. If you're going to work the hours, you should be paid for them. The only time I would disagree is if you signed a very specific, properly negotiated contract that says 'I will get X, Y and Z done by T and I will be paid $ for the work'; then I can see overtime going uncompensated. But that's for very specific, mostly deterministic tasks. Driving a truck to make deliveries doesn't fall under this; there are too many uncontrolled variables (traffic, the people receiving being slow, etc).

Comment Re:Password rules insanity (Score 1) 498

I've worked for similar companies. The one I worked for had terrible rules.

- Passwords need to be changed every month
- Must be a minimum of 8 characters long
- Must have a mix of upper and lower case, more than one digit, and symbols
- Cannot have more than 3 characters in the same sequence as any of the last 12 passwords

They did have people from HR go around looking for sticky notes on monitors and removing them, of course writing down who was violating policy. So we just started to hide the sticky notes in our log books. And this was at a startup, spun off of a large company, that ultimately failed; it had no government, financial or similar contracts. The idiot who came up with these policies was fired 3 months later when everyone threatened to quit, though what really did it was his stupid 8am - 6pm core business hours policy.

Comment Look at FACE of Amazon (Score 3, Informative) 392

This seems to be very common at Amazon. The FACE site shows a clear pattern of abuse, and I'd be surprised if this hasn't happened before.

Granted, the FACE site is posted to by those who are usually pissed at Amazon, but with so many postings, so often, it shows that there is a clear pattern of employee abuse.

Comment Re:Paper... (Score 2) 209

Even if their code was open source, you still couldn't trust them. Especially if the people rigging the machines are the people who own the machines.

Who is going to verify all the lines of code? Even with a million programmers looking at it, something will probably still slip through; after all, there are contests every year for writing code that looks legit but is actually nefarious.

Who makes the compiler? Can you trust them? Has the code for the compiler been audited? There's a legend (real or not) that when AT&T was going to commercialize UNIX, they asked the programmers if there were any obvious security holes. Ken Thompson spoke up about a backdoor he'd made in the C compiler: if it noticed it was compiling the login program, it would automatically insert code for his username and password so that he'd always have root access. This was not in the login code, but in the compiler itself. So you can't trust the compiler.

Are you using signed binaries? Well, who signs the binaries and calculates the hash? (See the point about the compiler.)

What about what downloads the code to the voting machine? Can you trust that?

And that's just the voting machine itself, what about the thing that collects all the results from the voting machines and gives you the final results? Who's checked all that? Do you trust the people doing that? Do all the interested parties trust that?

There are so many points of failure and compromise with this that it's scary. Especially when they want to go paperless, with no paper backup, trusting it all to the machines. Some electronic voting machines are still this way.

The only voting machines that I see being anywhere close to secure are the ones with a cardstock ballot where the voter fills in a line with a black marker to indicate who they are voting for. That can be machine-counted for quick results. But to certify the election, each ballot should be counted by a human official with the concerned parties watching. That way, if the vote is called into question, the ballots can be looked at.

Machine-counted for initial (fast) results, human-counted with observers for certified results. In the case of US elections that would be at least 3 people counting each ballot: one independent election official, one Republican and one Democrat.

Comment Re:It would, actually (Score 1) 73

Mainly for latency reasons. From what I can find from a simple Google search, most optical cables transmit light 31% slower than in a vacuum. This means that for every 1000 km you add to the length of the cable, you add about 4.8 ms of latency (if I did my math right).

4.8 ms might not seem like a lot, but when you're talking about needing speed, it is one of the factors that matters: trading, online games, etc. I'm not sure how much distance you would add by running it up to Alaska and then over. If you want to run it by land to Alaska and then across to Japan, it could be a fair amount; it might save a bunch on initial cost, but it will hurt latency. Plus there's Canada in the way, so you'd have to work through all that red tape too, and that might cost more (time == money) than just running it straight across the ocean.
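As a sanity check on that figure, here's the arithmetic, assuming light in fiber travels at 69% of its vacuum speed (per the 31% number above):

```c
/* Rough check of the latency math above: light in typical fiber is
 * about 31% slower than in a vacuum, so for each 1000 km of cable you
 * pay roughly 4.8 ms one way. */
static double fiber_latency_ms(double distance_km) {
    const double c_vacuum_km_s = 299792.458;           /* vacuum speed of light */
    const double c_fiber_km_s  = c_vacuum_km_s * 0.69; /* 31% slower in fiber */
    return distance_km / c_fiber_km_s * 1000.0;        /* one-way delay in ms */
}
/* fiber_latency_ms(1000.0) works out to about 4.8 ms, matching the
 * figure quoted above. */
```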

Comment Re:The old talent doesn't understand the new stuff (Score 1) 229

I agree with you. For my work I would like to use classes, polymorphism, overloading (functions, not operators), and pure virtual functions. Anything else on an embedded system would probably lead to bloat fast, so no templates (and no STL). I would also avoid a lot of the new features like lambda functions.

We're already using classes, and pure virtual functions. It just looks ugly because we're doing it in pure C.

The arguments I usually get into are with people who think that any sort of C++ code automatically bloats the binary by hundreds of percent and slows it down by 50%. The big argument I'm having right now is that when we change the file extension from .c to .cpp and use the C++ compiler, the object file is suddenly larger, so it must be larger on the target and must be slower. I usually fix that with extern "C" in the header files, but I still have the same argument over and over again.
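For reference, the extern "C" fix is the standard header guard that lets the same declarations be used from both .c and .cpp files without the C++ compiler name-mangling the symbols. The file and function names here are hypothetical:

```c
/* driver.h -- hypothetical header shared between C and C++ translation
 * units. The __cplusplus guard wraps the declarations in extern "C" only
 * when a C++ compiler sees them, so objects built from .c and .cpp files
 * link against the same unmangled symbols. */
#ifndef DRIVER_H
#define DRIVER_H

#ifdef __cplusplus
extern "C" {
#endif

int driver_init(void);
int driver_read(unsigned char *buf, int len);

#ifdef __cplusplus
}
#endif

#endif /* DRIVER_H */
```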

Comment Re:The old talent doesn't understand the new stuff (Score 1) 229

I constantly have the C vs C++ argument at work. We use C because our 'customers are afraid of C++'. But we're also building embedded frameworks where we want to easily swap drivers in and out that follow a specific interface, and what was chosen was a structure of function pointers for the interface. Others got upset when I started to label my structures with vTable for the interfaces I was creating.

I'll fully admit that C++ has its disadvantages, but when you're writing object-oriented code in C anyway, why not use C++ and limit which aspects are legal?
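A minimal sketch of that struct-of-function-pointers interface in C; the driver and all the names are invented for illustration:

```c
/* A driver "vtable" in plain C: the framework only sees the interface
 * struct, so concrete drivers can be swapped in and out freely.
 * All names here are made up for this sketch. */
typedef struct driver_vtable {
    int  (*init)(void);
    int  (*read)(unsigned char *buf, int len);
    void (*shutdown)(void);
} driver_vtable;

/* One concrete driver implementing the interface. */
static int  uart_init(void)                      { return 0; }
static int  uart_read(unsigned char *buf, int n) { (void)buf; return n; }
static void uart_shutdown(void)                  { }

static const driver_vtable uart_driver = {
    .init     = uart_init,
    .read     = uart_read,
    .shutdown = uart_shutdown,
};

/* Framework code: works with any driver that fills in the vtable. */
static int run_driver(const driver_vtable *d) {
    unsigned char buf[16];
    if (d->init() != 0)
        return -1;
    int got = d->read(buf, sizeof buf);
    d->shutdown();
    return got;
}
```

This is essentially what a C++ compiler generates for virtual functions, just written out by hand.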

Comment Re:Huh? (Score 1) 219

I wonder if I had the same teacher. I remember being told you couldn't subtract a big number from a small number, and then getting detention because I used the calculator (which that same teacher had given me as a prize for being the most improved in math) to show you could. Apparently the calculator was malfunctioning. That completely messed me up when my family moved, and I got laughed at when I insisted you couldn't subtract a large number from a small one.

Or maybe it's just the elementary school way of teaching math at the time.

Comment Re:CVS or Subversion (Score 1) 325

I'm fighting that fight right now, moving from SVN to Git/Bitbucket Server, so I'll take a shot at explaining why Git is harder on newcomers than SVN or CVS. I also work for a hardware company that isn't very good on software discipline, but we have a much larger project than what was described above.

To start with, there's Git's complexity. One of our customers (who pulls more weight with management than the engineers do) recommends that we not move away from SVN to Git because 'Git is too complex for embedded engineers'. There is definitely something to this. The client/server architecture of SVN is easier to understand for people who only deal with the basics of networking. It works very much like a filing cabinet that hardware-centric engineers can wrap their heads around: you basically have a copy of what's on the server.

Beginners look at the power that any VCS can provide and then go crazy with it. They store things that shouldn't be in there, because it's convenient: binary artifacts, for instance (.a files, installers, PDFs, etc). With a new team just discovering the joys of a VCS, your repository is going to explode with these things, because they won't understand how it affects things. With a distributed system like Git, this bloats the repo that everyone needs to clone, and people will not understand why it's slower than SVN, at least on the initial clone/checkout. I'm not saying there aren't solutions to this (I'm deploying git-lfs backed by JFrog Artifactory to handle my binaries), but you have to make sure you have everything sorted out before you go to Git.

Branches are another problem; some engineers can't get their heads around them. In SVN a branch looks like just another copy in the repository, almost like another directory. With Git it's completely different: branches in Git are wonderful and probably its killer feature over SVN, but they add complexity, and newbies won't appreciate them or really know how to deal with them.

Git also gives you enough rope to hang yourself, and then gives you plenty more. You can treat it like a normal VCS system, but that removes a lot of its power.

Newbies also haven't felt the pain of a malfunctioning VCS, or the pain of when things start to go wrong. With a centralized repository, SVN can conceal that pain better. Git rips off all the band-aids.

Git can be merciless when it comes to the power it provides. A lot of the complexity can be trained away. The Pro Git book is a good place to start, but how are you going to make your developers read it and understand it? For my own migration I'll be training the entire team on how to use Git in our project along with JIRA, Bitbucket Server and Bamboo. I think the training will last for 2 days or so. Do you have enough time to put together the training for your own team, and then conduct it? If you don't need Git's features, SVN's simplicity is definitely easier to train for.

I hate to say it, but engineers and developers are stupider than they think when it comes to things outside their direct experience. Giving them SVN is like giving them a bike with training wheels. It's good for them to learn, and maybe you can take off the training wheels (using the CLI instead of a GUI), but you wouldn't trust them out in the street. Giving them Git is like giving your 3-year-old the keys to your SUV, even if you limit them to a GUI. You want them to learn a bit before they get that power.

I'm moving my team from SVN to Git to get a better workflow together. With 2/3 of the team in India, I have a very hard time keeping the builds clean. Running with SVN right now, the developers have been instructed to commit all their changes at the end of the day. I want to find the guy who said this and shoot him, because we have people breaking the build, then running off home for the weekend, leaving the people 12 timezones away stuck. With Git, Bitbucket, and JIRA workflows set up (and permissions on who can actually commit to the MASTER branch and how it is done), I hope to isolate people's work and keep it from breaking the build. I also hope that I'll be able to keep MASTER clean and always releasable. I can't do that easily with SVN, especially since IS has forbidden us from locking any files in SVN (because it slows the server down). My team has now had enough pain from build breaks that they're willing to learn Git, the power it has, and how it will make their lives better. (For one thing, I won't be yelling at them.) A new team won't have this built-up pain, and won't have the fear of the release manager yelling at them.

Comment Re:First... (Score 1) 167

I don't know for sure, but I don't think that Weis & Hickman own the copyright on Dragonlance. I know that both were employed by TSR when the books were written for TSR. Since the books were published by TSR while Weis and Hickman were employed there, they'd be considered 'works for hire', which means that TSR -> WotC -> Hasbro owns the copyright on the Dragonlance novels and world.

There may have been a supplemental agreement that gave them ownership of Dragonlance, but if that doesn't exist, then Hasbro owns it and can do whatever they want with it.

Comment Re:"Other types of electromagnetic radiation" (Score 1) 529

This actually happened to my father when he was into ham radio, back in the day when you assembled your own equipment. He put up the monopole antenna/tower first and ran the coax back to the house. Industry Canada (or whatever it was called then) came by about 2 weeks later with complaints about him knocking out people's TVs, and they had to inspect his equipment.

They did so, found it only half assembled, with the coax from the antenna sitting bare without even the connector put on yet. The inspectors rolled their eyes, filed the complaint as groundless NIMBY whining, and left.

It's definitely the sight of the antennas that causes the bulk of the issue.

Comment Re:Summary is rather misleading (Score 4, Interesting) 193

You're assuming that the games are all strictly in a high-level language like C, C++ or C#. A lot of game programmers will drop down to assembly to do some things as fast as possible. When I was in the industry I had to do that a few times, though never for the Xbox 360, so it may not be as big an issue.

Game programmers also use a lot of intrinsics, which are basically C macros around assembly calls, and these are very tied to the CPU architecture. They also do a lot of things based on cache line sizes: making sure that structures, or multiples of structures, fit inside cache lines, or playing around with a structure of arrays instead of an array of structures (or vice versa). It all depends on what turns out to be faster given the architecture, the CPU's multi-threaded loading, or the cosmic rays hitting the box at the time. If a game team has a good set of optimizers on it, they'll beat anything a compiler will do, and that ties the performance of the game to the CPU and ensures you can't just recompile. Recompiling will just throw error after error.
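The array-of-structures vs. structure-of-arrays trade-off mentioned above can be sketched like this. The field names are invented, and which layout wins depends entirely on the access pattern and the target's cache line size:

```c
#include <stddef.h>

/* Array of structures (AoS): each entity's fields sit together in memory.
 * Good when you touch every field of one entity at a time. */
typedef struct { float x, y, z, health; } entity_aos;
static entity_aos entities_aos[1024];

/* Structure of arrays (SoA): each field is contiguous across all entities.
 * Good when a loop touches one field for many entities (and for SIMD). */
typedef struct {
    float x[1024], y[1024], z[1024];
    float health[1024];
} entity_soa;
static entity_soa entities_soa;

/* Summing one field streams through contiguous memory in the SoA layout,
 * instead of striding past three unused floats per entity in the AoS one. */
static float total_health_soa(const entity_soa *e) {
    float sum = 0.0f;
    for (size_t i = 0; i < 1024; i++)
        sum += e->health[i];
    return sum;
}
```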

The CPU architecture is completely different: pipeline depths, branch prediction, and it uses SSE for its vector unit instead of the one in the Xbox 360. And that's all fairly custom code, almost at the assembly level, to force the use of the vector units. The GPU is different too, though I think they were both AMD GPUs, so it shouldn't be too bad for the code to run on it, and it should be using Direct3D 9.0c as the API, so it shouldn't matter what the GPU is.

Microsoft also loves to change their APIs between SDKs; something that compiles against the June 2010 XDK may not compile against June 2012. The only thing they guarantee is that something compiled on the June 2010 XDK will run on the June 2012 version of the flash, and only on the production boxes. I remember a few times where older games compiled for launch did not run on the latest flash on the dev kits. The dev kit flash was filled with lots of things to make development easy, so they stripped out deprecated functionality. They also stripped it out to ensure that people didn't use it, because game developers would find a way to get at it if they really needed to; if it was in the flash, they'd find it.

Also, MS may only have source code for Microsoft Studios' games; they don't have the source code for any of the third-party games. When submitting for certification and publishing, all they cared about for the Xbox 360 was the ISO image. They may not even have the source code from their own studios available, especially for the early games; the Xbox 360 has been around longer than most companies store data. The company I worked for only kept source code around for 5 years, which would put the earliest game with surviving source at around 2010. Microsoft may not go back that far for their compatibility work, but it does cut out the earliest games.

I think they've finally got an Xbox 360 PPC emulator that is fast enough to emulate what the Xbox 360 could do without dropping too much in the way of performance. That wasn't ready at the launch of the Xbone.

Comment Re:Do It, it worked in AZ (Score 2) 886

It really depends on who you ask. When I sent out a novel for editing, I used the rules you stated, but the editor came back and said that's no longer proper English, and that I should change all of those pronouns where the gender was not specified to they, them, and their.

The one the editor complained the most about was that the crew on the space ship I was writing about referred to the ship as 'she' and 'her'. I personally thought the objection was stupid and rejected that change, especially since there are reasons for calling a ship by the female pronouns. Historical reasons may be nullified by political correctness, but spiritual reasons not so much; the female pronouns refer to the soul of a ship, as any sailor will tell you, and if you treat 'her' right, 'she'll' see you home safe.

I sometimes feel like a stick in the mud, and I realize that language evolves all the time. To me it will always be he, him or his unless you know it's female. They, them and their are plural.

Comment Re:Why not have devices get their time from GPS? (Score 2) 166

This works nicely for self-driving cars, which need GPS anyway; I have no idea why self-driving cars were listed as a problem. For the times it can't get a GPS signal, the internal clock shouldn't drift that much. Unless the self-driving car is 100% underground, it should be able to find a GPS signal to time-sync to often enough.

Things inside a building might be harder, but there are devices that take a GPS signal and put an NTP server on the network. All you need is one of these and you're fine for the local network. I used them 10 years ago when working on base stations, and they provided a very stable 10 MHz reference clock too. And they're not that expensive; I was looking at one for home because I'm a little anal about time some days.
