Comment: Re:So live underground (Score 2) 131

by Ronin Developer (#49149235) Attached to: Adjusting To a Martian Day More Difficult Than Expected

This is no different than what submariners experience - with no natural light, they move to an 18-hour day (6 on, 12 off). Contrast this with crossing the ocean in a ship and traversing the various time zones. They would adjust things on the ship to try to minimize the effect. However, it still sucked.

BTW, the moon is also tidally locked with the Earth, with its rotation period and orbital period matching almost exactly 1:1. That's why the moon never seems to rotate from our perspective.

Comment: Re:Same error, repeated (Score 1) 300

by Ronin Developer (#49137329) Attached to: Moxie Marlinspike: GPG Has Run Its Course

True. You can't stop spammy content from being inserted into an email. However, being able to identify whether the email comes from a trusted source makes it pretty easy to identify and classify potentially unwanted email. Other techniques would still need to be applied to messages that pass the first round of filtering to determine the likelihood that they are or are not real spam.
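A minimal sketch of that two-stage idea. Everything here is invented for illustration: in a real system the "trusted" check would be a digital-signature verification, not a set lookup, and the content heuristic would be far richer.

```python
# Toy two-stage filter: first check whether the sender is trusted
# (standing in for signature verification), then fall back to a
# crude content heuristic for everything else.

TRUSTED_SENDERS = {"alice@example.com", "bob@example.com"}  # hypothetical keyring
SPAMMY_WORDS = {"viagra", "lottery", "prince"}              # toy heuristic

def classify(sender: str, body: str) -> str:
    if sender in TRUSTED_SENDERS:
        return "trusted"            # first round: source is known good
    words = set(body.lower().split())
    if words & SPAMMY_WORDS:
        return "likely-spam"        # second round: content heuristics
    return "unknown"                # needs further analysis

print(classify("alice@example.com", "lunch?"))          # trusted
print(classify("rnd@spam.biz", "win the lottery now"))  # likely-spam
```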

Comment: Re:Same error, repeated (Score 2) 300

by Ronin Developer (#49126851) Attached to: Moxie Marlinspike: GPG Has Run Its Course

Original poster stated, "... it'd be kind of a shame to finally get there with 1990's cryptography."

The RSA encryption algorithm has been around a lot longer than the 1990s; in fact, it was published in 1977. Still, the technology and algorithm continue to work. However, due to advances in computing and hardware, the encryption keys have had to be lengthened. So, there is nothing wrong with the older technology.
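To see why the algorithm itself hasn't changed, here is textbook RSA with the classic toy primes 61 and 53. The math is exactly the 1977 scheme; only the size of the numbers has grown (2048-bit moduli and up today), precisely because hardware caught up with small keys.

```python
# Textbook RSA with tiny primes, purely to illustrate the algorithm.
# Keys this small are trivially breakable; modern keys differ only
# in magnitude, not in the underlying math.

p, q = 61, 53                 # toy primes (hopelessly small in practice)
n = p * q                     # public modulus, 3233
phi = (p - 1) * (q - 1)       # Euler's totient, 3120
e = 17                        # public exponent, coprime with phi
d = pow(e, -1, phi)           # private exponent via modular inverse (Python 3.8+)

m = 42                        # a message, encoded as a number < n
c = pow(m, e, n)              # encrypt: c = m^e mod n
assert pow(c, d, n) == m      # decrypt: m = c^d mod n
```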

When my brother and I started a business in 1994 to provide a secure communications platform for the masses, RSA and the related PKI infrastructure were all the rage. At that time, we had DES and Triple DES - AES didn't exist and the legal status of PGP was up in the air. RSA Laboratories had a great licensing deal for BSafe and TIPEM that made it possible for a small startup to develop some really cool products without breaking the bank. But, we soon discovered we were up against both Microsoft and Netscape who were releasing secure email solutions. And, the gov't sponsored Clipper chip was at the forefront. There was a lot of uncertainty in the secure communications market back then.

Despite our product being built from the ground up to provide encryption, digital signatures, anti-spam, secure file transfer and secure FAX facilities in an easy to use package (initially for Windows 3.1 and Mac System 7), we ultimately felt we couldn't compete with FREE and never released our product.

While we set out to make PKI a manageable process (no easy feat), the biggest barriers were trying to convince the general public why it was important to protect one's privacy and why people should want to pay for our commercial product (to be sold in CompUSA stores everywhere!). We shut down our business in 1996, having never gotten the product to market.

GPG and Enigmail provide the privacy and authentication features while still being bound to existing, clunky mail clients. However, the web of trust never really took off, and the PKI infrastructure is a real bear. I don't know how many people use the tool with private keyrings vs. the WOT.

On the anti-spam front, Yahoo! developed DKIM, which relies on digital signatures. Sadly, it has limitations and isn't the end-all-be-all cure for spam it was hoped to be. I still believe a product developed from the ground up with privacy and authentication in mind, not a bolt-on to another system, could solve a lot of our woes. And I am sure that brilliant folks could probably come up with a way to anonymize the traffic so that metadata analysis would be nearly, if not entirely, impossible.
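As a rough illustration of what DKIM bolts onto a message, here is a sketch that parses the tag=value pairs of a DKIM-Signature header (the header values below are made up). Actual verification, omitted here, fetches the signer's public key from DNS at `<selector>._domainkey.<domain>` and checks the RSA signature in the b= tag over the canonicalized headers.

```python
# Parse the tag=value pairs of a DKIM-Signature header (per RFC 6376).
# This shows only the structure of the signature header, not the
# DNS lookup or the cryptographic verification itself.

def parse_dkim(header: str) -> dict:
    tags = {}
    for part in header.split(";"):
        if "=" in part:
            k, v = part.split("=", 1)
            tags[k.strip()] = "".join(v.split())  # drop folding whitespace
    return tags

# Hypothetical signature header for illustration.
sig = "v=1; a=rsa-sha256; d=example.com; s=mail2014; h=from:to:subject; b=dGVzdA=="
tags = parse_dkim(sig)
print(tags["d"], tags["a"])   # example.com rsa-sha256
```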

Comment: Damned if he does...Damned if he doesn't (Score 4, Interesting) 220

by Ronin Developer (#49077373) Attached to: Obama Says He's 'A Strong Believer In Strong Encryption'

It sounds like he's caught between a rock and a hard place. He might personally believe in strong encryption and privacy. But the series of events since 9/11 has made any stance that limits the collection of information aimed at preventing another attack a difficult one to sell to the public.

Strong encryption can protect secrets and privacy. The secrets and privacy of the common man are worth protecting. The same technology can also enable our enemies to operate in stealth. Should we have another 9/11 and the suspected perpetrators turn out to have used strong encryption to protect their plans, the public will scream that not enough was done to prevent the attack. How should the president respond?

I am an advocate of strong encryption, having started a business in the '90s to provide secure email and file transfer. I also remember the advent of the Clipper chip and the reasons behind its subsequent defeat. We liked to believe our privacy was not being infringed, and then Snowden revealed how our intelligence community was violating our rights. At the same time, we haven't had another terrorist attack on our soil, lending credence to their methods (valid or not). Snowden, however, also released information on other data and intelligence collection methods. That disclosure allowed our enemies to operate with more impunity, using strong encryption and adjusting their methods to avoid detection.

Sadly, the protection strong encryption provides for our privacy and rights now becomes a marker of potential threats, with other intel methods compromised. Weakened encryption, or strong encryption with a backdoor, would theoretically permit the gov't to pierce the veil in cases where other intel might have put the focus on an innocent citizen, while users of strong encryption would be marked as threats.

We, as a nation, have allowed the events of 9/11 to reshape our society, and we live in a world where our belief in privacy through ignorance was shattered by Snowden's revelations.

The revelations Snowden provided on the intelligence collection programs aimed at our own citizens, supposedly for our protection, were necessary. However, the disclosures of other techniques and operations on the international front have given our enemies insight and tactics to circumvent critical intelligence collection methods. In that regard, he has done tremendous harm. And, with the shutdown of those programs, the fight now is over when and how strong encryption will be permitted.

Comment: Re:now if HR would *write* good postings (Score 1) 55

by Ronin Developer (#48989037) Attached to: Using Machine Learning To Find a Better Job

This wouldn't stop someone from "tweaking" a job description that was carefully drafted in the first place. I see this all the time: job descriptions from multiple, usually offshore, agencies and recruiters with minor differences. The tweaks and grammar are so bad, and their grasp of US geography so poor (CA or WA is thousands of miles from my home; without a company jet, it's not exactly commuting distance), that I simply laugh at them.

I want to receive honest job descriptions that match MY criteria from a job site. Then I want a list of agencies (job-seeker ranked) authorized to represent that position, with contact info, rate info, etc., so I can decide who I want to work with in my search. Sorta like "Angie's List" for job hunters.

I would enjoy building a business around such a system.

Comment: This is just part of the equation (Score 1) 55

by Ronin Developer (#48988861) Attached to: Using Machine Learning To Find a Better Job

and just one step in the right direction. But it does mean being locked into a single job site or agency.

For me, a bigger issue is getting all the job descriptions (via email) that sound similar... but are being represented by multiple, usually offshore, agencies. There are minor differences in the job descriptions that make it hard to know for sure whether they represent the same position. And they don't like to tell you who the potential employer is UNTIL you sell them your first born and drink demon blood. They ALL want the right to represent you. Please be careful who you bargain with. If you go with another agency for the same position (maybe one offering a better rate) after the first has submitted your resume to the potential employer, you will either not get the position or be locked in with the one you gave your resume to first. Hiring companies don't want to deal with the hassle of sorting out who gets the commission. In those situations, you lose.

Unscrupulous agencies will blast out every resume in their database that might fit a position in the hopes of being first in the door. You can be screwed without even knowing it if they have obtained your resume and submitted it without your consent. They hope the employer sees one they like and calls them; then they reach out to you. It's one reason automation is now used to sift through CVs and job applications, with many finding their way to the bitbucket without ever being seen by a human. Employers get hundreds (maybe thousands) of resumes submitted electronically. You need to stand out. Being unique is one way to make it past that first cut. If not, you lose.

I have come to view recruiters with a critical mindset after having dealt with some of the ilk that's out there. As a result, I keep a list of recruiters and agencies, gathered over the years, whom I trust to represent me well. There ARE good, reputable recruiters and agencies out there. Yes, they are in the business of making money. And I am very happy to help them do that if they work for me and help me gain employment on terms I prefer. This strategy, hopefully, keeps my CV from being rejected without human action first.

Comment: Re:planned? (Score 1) 577

I am aware that a certain midwestern police department used license plate readers while driving through mall parking lots. The plates were run through NCIC and their DMV looking for scofflaws. This was pre-2005. I won't name the department or state.

In Philadelphia, PA, the Parking Authority added this specific capability to their vehicles as well to identify scofflaws and vehicles to boot.

In the former case, the vehicle license plates were also run against the vehicle table of the department's RMS, where an officer could run additional checks, such as identifying previous incidents associated with a vehicle and/or known associates (and relationships), in addition to running the checks against the various law enforcement databases without submitting to the national databases. Each check against NCIC, theoretically, required probable cause, or the officer faced discipline. Checks against the DMV, at the time, didn't require that level of scrutiny, nor did a check across the shared database used by other departments, as these were local or state level systems. But, depending upon the hits returned from the other systems, a police officer could justify the check against NCIC in most cases.

The fact that they are doing this at gun shows... well... it's really no surprise. Knowing and recording that someone attended a gun show, and associating it with a vehicle, could theoretically save an officer's life, given the likelihood that the vehicle may have someone in it with a weapon (licensed or unlicensed). Of course, raising the officer's awareness could also result in an officer approaching a vehicle more cautiously and, perhaps, being more likely to use their own weapon. A double-edged sword, to say the least.

No, I don't work in this industry any longer and don't keep abreast of latest changes. Just relaying what was happening a decade ago.

Comment: Source of Future Data (Score 1) 220

by Ronin Developer (#48927451) Attached to: Anonymous No More: Your Coding Style Can Give You Away

I guess we can expect that source code repositories will be scanned and processed. And, for code written by multiple authors, the modified code (from commits) will be scanned and indexed as well.
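A toy sketch of the kind of surface features such a de-anonymization pass might extract from each file. The actual research uses much richer features (e.g., abstract-syntax-tree node frequencies); everything below, including the feature names, is purely illustrative.

```python
# Extract a few crude stylistic "fingerprint" features from source
# text: indentation style, mean identifier length, comment density.
# A classifier trained on many samples per author would consume
# vectors like these to guess who wrote a given file.

import re

def style_features(code: str) -> dict:
    lines = code.splitlines()
    tabs = sum(1 for ln in lines if ln.startswith("\t"))
    spaces = sum(1 for ln in lines if ln.startswith(" "))
    idents = re.findall(r"[A-Za-z_][A-Za-z0-9_]*", code)
    comments = sum(1 for ln in lines if ln.lstrip().startswith("#"))
    return {
        "tab_indent_ratio": tabs / max(1, tabs + spaces),
        "mean_ident_len": sum(map(len, idents)) / max(1, len(idents)),
        "comment_density": comments / max(1, len(lines)),
    }

sample = "def f(x):\n\treturn x + 1  # bump\n"
print(style_features(sample)["tab_indent_ratio"])  # 1.0
```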

But, I bet they will never figure out who writes the malware recently attributed to the three letter agencies. They should, however, be able to figure out which agency writes the stuff if they get a copy of the source code or maybe even from decompiling the binary.

Additionally, if it's written in .NET, the bytecode that runs on the CLR can be decompiled back to VB, C#, or another .NET language to recover a close approximation of the source code.

Comment: Re:A call for Write Protect (Score 1) 95

by Ronin Developer (#48914127) Attached to: Researchers Tie Regin Malware To NSA, Five Eyes Intel Agencies

Yup. Changing a BIOS required physically taking the old EPROM out and popping in a new one. At 17, I doubt the NSA cared about my original IBM PC, which came with a cassette tape drive (I couldn't afford single-sided floppy drives until a little later, let alone a hard drive until I was 18).

Comment: Re:i++ (Score 1) 492

by Ronin Developer (#48903831) Attached to: Ask Slashdot: Is Pascal Underrated?

++i is not the same as i++.

I have been working in Object Pascal, C, Obj-C, Java, and numerous other languages for many years. I still prefer Object Pascal, having worked with it since 1995. It's my preferred language for my own development. But at work, I use what is most appropriate to meet our clients' needs. Delphi has yet to be the correct choice. Our clients are either PHP/Drupal, Java, or C#/.NET shops. Delphi doesn't have a place in web app development.

There was an argument earlier about why we need another language, another tool chain, etc. Hmmm. I seem to recall that Pascal and Object Pascal were around long before most of today's "modern" languages. I think we have the right to ask the same question of those languages.

The biggest argument I hear against Pascal and Object Pascal is their verbosity. While I would agree that all the begin/end blocks can seem annoying at times, nobody I know mistakes the begin/end keywords for variables. And single-line statements don't need a begin/end block. I have heard arguments about the excessive use of whitespace or the need for a semicolon. Giving whitespace significance in code is an awful idea. Nothing could possibly go wrong there, could it?

The primary reason the language has suffered is what transpired at Borland/Inprise/Embarcadero after Delphi 7 was released. Quality was sacrificed in pursuit of profit, and developers migrated to cheaper or free solutions. I can't say I necessarily blame them; the product IS expensive. The migration of developers away resulted in a significant downsizing of the tool and component vendor community as well. At the same time, OSS became very popular. Still, Delphi continues to evolve. The tool permits rapid development for Windows/Mac and, on the mobile side, iOS/Android, in a true cross-platform manner. While some might bitch about the FireMonkey framework, it does get the job done for cross-platform UXs. There will always be things it can't readily do, such as accessing certain frameworks out of the box. But many frameworks have been translated. You can still access Java code on Android once you run the bridging tool and add the bridge units. Or you could still write native iOS apps. But if I am going that route, I would use the native tool chain rather than translate the headers (that's the vendor's job). That's just me.

I don't know if Delphi will survive. But to speak against it without knowing what it can do is ignorant. Ignoring other languages and tools "just because" is just as ignorant.
