
Comment Segmentation of the Users themselves (Score 4, Interesting) 57

Any security tech worth their salt will tell you the same thing. The network needs to be protected from the users themselves. They are the primary way bad things enter the environment. To that end you need to do several things.
1. Segment off the entire gamut of user PCs and apply the same access restriction methodology you do to the Internet feed. Use a white list approach. Yes, they can reach more services internally. No, they cannot obtain administrative access. The user in front of the PC has no bearing on the PC's access.
2. Remove the ability to administer anything directly. Create a set of 'jump' or 'hop' boxes which employ some form of two-factor authentication, from which all administrative functions originate. And this includes everything from networking gear to application administration. No PC should be able to obtain any form of administrative access to anything, anywhere.
3. Use end node segmentation. Every server and network device must have a separate, non-routable management interface. The primary IP address, the one with the configured default gateway, is the one used to provide services. The management interface has a disjoint IP address, as in it can't be derived from the schema used to create the primary addresses. It has no routing capability, as in it can't communicate outside of its configured subnet. The Hop-box through which it is managed is housed on the same subnet. Hop-boxes provide the service of 'management' to the environment and employ the same addressing and routing scheme. In this way remote, or off-site administration is accomplished through normal routing to the hop-box, not to the device's management interface.
4. Management applications use a VDI methodology housed on the hop box. This includes even SSH clients to the networking devices. They only display on the PC; they don't run in its memory space. As a best practice, all of your applications similarly run as VDI services for the same reason. The end PC becomes much closer to a 'terminal' or portal to the applications, and its memory space and CPU are used only to draw the screen and communicate with the VDI service. There is a financial advantage as well to loading software only onto VDI servers, instead of a fleet of desktops. This also aids in writing the firewall rules for user PCs, as the only services they need are Internet access and the VDI protocol itself. This is a thin-client kind of design without using actual thin-client hardware.
5. Eliminate the use of local storage. This includes thumb drives but is really focused on documents. For the most part laptop hard drives are not part of any backup process, and at some point some middle manager will complain about a key spreadsheet they lost because the only copy was on their laptop hard drive that just went belly up. Avoid that. Put everything onto a file server which has access controls and a backup schedule. If you need transfer capabilities, use any number of secured file transfer methods. Yes, you will require a network connection to access your files. No, this isn't really a problem anymore, and why would you be updating your business-critical spreadsheet held on a thumb drive you can lose?
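The white-list idea in steps 1 and 4 can be sketched as a default-deny policy check. This is a toy sketch in Python; the subnets, the VDI broker address, and the port numbers are all hypothetical, not drawn from any real deployment:

```python
import ipaddress

# Hypothetical segments and services; a real deployment would load these
# from the firewall's own configuration.
USER_SUBNET = ipaddress.ip_network("10.50.0.0/16")   # user PC segment
ALLOWED_DEST_PORTS = {80, 443}                        # Internet browsing
VDI_GATEWAY = ipaddress.ip_address("10.10.0.5")       # VDI broker
VDI_PORT = 8443                                       # display protocol

def is_allowed(src, dst, dport):
    """White-list check for traffic leaving the user PC segment.
    Everything not explicitly permitted is dropped."""
    src = ipaddress.ip_address(src)
    dst = ipaddress.ip_address(dst)
    if src not in USER_SUBNET:
        return False                 # policy only covers user PCs
    if dst == VDI_GATEWAY and dport == VDI_PORT:
        return True                  # VDI display protocol to the broker
    if not dst.is_private and dport in ALLOWED_DEST_PORTS:
        return True                  # general Internet access
    return False                     # default deny: no internal admin access
```

A real firewall expresses this as ordered rules rather than a function, but the shape is the same: enumerate the few permitted flows and drop everything else.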

Among other things this alleviates the need for draconian Internet filtering policies. Let the users browse Facebook or even dark web sites. The PCs are treated as the security cesspool they are, and they cannot achieve a secure stance no matter what is running on them.

Another thing this eliminates is the need to control local admin rights on the PCs. Let anyone load whatever software they like. Heck, let the web link load malware. It won't accomplish anything. You can keylog all you want; it won't get you any access.

The final advantage this has is more operational in nature. Given that there is nothing critical contained on the PC, then any PC will do. If one goes belly up or is compromised by malware, then simply replace it with another from spares and the user continues on their way. Mean Time To Resolution becomes the time it takes to dispatch a replacement, and the failed/corrupted device can be examined offline and without impact to the user.

Comment 3rd Party (Score 1) 993

I have long been saying we need a viable third party, if not a fourth and fifth. I will finally put my vote where my mouth is this November and vote Green. Given that we're f*cked if either of the front-runners makes it to the White House, it is now time to do everything we can to let someone else have a chance. All any party needs to be 'real' is 5% of the popular vote. At that point they can spend less money on just getting onto the ballot, and spend more time and effort on getting their message out.
I strongly suggest that each and every Slashdot reader do the same. Vote for "None of the Above" and cast your vote for whichever alternative party you find attractive. I don't really care which; just don't vote R or D. Nevertheless, vote. Staying at home is abdication, and you deserve what you get.

Comment I only play 1 game, and it's through Steam (Score 1) 412

Age of Empires, which is a Microsoft game no longer in distribution. My old CDs for it are long gone. I've noticed over the past few months that running the game has become problematic. Since the machine is a Win10 box I built from components, I suspect the premise of the article is correct - MSFT is compelling my machine not to run Steam well. It might be time to convert the whole thing to Ubuntu or Fedora.

Comment The Rise of Automation (Score 2) 474

This story is nothing more than the natural progression of something that started in the early 20th century. We used to have people who held the title of 'machinist'. Now we have machines called CNCs which perform the same job with better precision and produce identical parts. Being a machinist was an art form. Since the invention of 'machine tools' we have slowly moved away from the art to a repeatable process. Eventually factories will employ no one, or essentially no one. Stock will be dropped off and finished product will be picked up without ever encountering a human being. No lights, no breaks, no vacations, no unions, no variance. Perhaps a team of maintenance workers, but there would be no reason to house them at a single plant. This is the future of manufacturing.
Similarly we are automating the office. I am old enough to remember six-part forms and hallways filled with file cabinets. Now the same information can be housed on a single drive. I remember call centers which employed thousands of agents. Now there is a computer program which can get you through at least the first few interactions. As we continue along this line of reasoning, there are a number of jobs which will fall into oblivion just as the machinist has. The basic premise is that if a human being is following a script, a decision tree, or a detailed process, you don't need a human being for that. Humans are needed for exceptions, not rote processing.
There is of course an impact to this move towards automation. We don't need unskilled workers who can absorb the necessary training through OJT. This then eliminates the need for a vast number of now middle class workers. They move into the poverty class and the societal divide widens. Not everything intended for good is limited to positive consequences.
If you are a factory worker now, how do you ensure employability? Learn how to repair robots.
If you are a low level office employee now what do you do? Learn how to automate your own processes.
For something a little closer to my own profession, if you are a Route/Switch engineer (networking IT professional), what should you prepare for? Learn how to program. Your job is nearly obviated now. It's called 'Software Defined Networking'. The days of troubleshooting OSPF/EIGRP are nearly at a close.
Automation is the natural outflow of specialization and advancement. As you work towards making your job more repeatable and predictable, you work towards ending your employment.

Comment Failures in the Scientific Method (Score 1) 387

Not everything will succumb to the Scientific Method. Case in point: history. Not everything can be evidenced by experimentation, or in this case through "empirical validation." Think of it this way: how can you measure a "string?" What would you use to sense its properties? It would be similar to measuring the force applied by an ant's leg using a truck suspension spring. Actually, it's not even that close. That isn't to say there is anything wrong with the Scientific Method, just that it too has a scope of applicability. Once we leave the realm of particles, we have no more tools. In fact most of the "empirical validation" done to discover things like the Higgs boson is based on missing energy. So we wrap a particle around a hole in the experiment. Not exactly what I would call "empirical," but those who call for the dismissal of mathematically based theories use such techniques themselves repeatedly.

Comment Pass phrase (Score 1) 637

Include spaces. 0x20 is a remarkably unusual character in a password. Use full sentences, perhaps a favorite quote (although maybe not quite exact, since the exact quote would be predictable). Include your common misspellings and it's better still. Long is good too, so aim for more than a single phrase. "Now is the time..." or "Better to remain silent..." are good examples, but don't use overly popular ones. What is the phrase your mother/father/grandparent always said to you? What words of wisdom do you live by? These are good passwords and easily remembered.
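To see why a long phrase with spaces beats a short "complex" password, compare rough entropy estimates. A minimal sketch; the alphabet size and word-list size are illustrative assumptions, and a well-known quote is worth less than these upper bounds because it isn't chosen at random:

```python
import math

def charset_entropy_bits(length, alphabet_size):
    """Entropy of a string drawn uniformly at random from an alphabet."""
    return length * math.log2(alphabet_size)

def wordlist_entropy_bits(words, vocabulary=7776):
    """Entropy of a phrase of words chosen at random from a
    Diceware-sized list (7776 words is the classic list size)."""
    return words * math.log2(vocabulary)

# An 8-character password over ~95 printable ASCII characters (incl. space)
# comes to roughly 53 bits; a six-word random phrase reaches roughly 78.
short = charset_entropy_bits(8, 95)
phrase = wordlist_entropy_bits(6)
```

The phrase wins on length alone, and it stays memorable, which is the point of the advice above.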

Comment Another suggestion (Score 4, Insightful) 255

Find a problem and solve it. Your first (few) programs don't have to be full-on applications or games. They just have to solve a simple problem you or someone else is having. Case in point: I am a networking professional (CCIE #12981). I run into things like, "what is the current inventory of devices on the network?" So I wrote a 'script' which does what I would do manually. It logs into a 'seed' device using provided credentials, downloads information like serial numbers and addressing, and then figures out connected devices from there. Newly discovered devices are then submitted to the same task. Problem solved. In another effort, I was working for a firm that sold a particular service, and our back-office documentation to fulfill the order was so cumbersome that error was a statistical certainty. So I wrote some VBA inside a spreadsheet that took the metrics from the customer and produced the requisite documents directly.
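The discovery script described above is, at its core, a breadth-first search over device neighbors. A minimal sketch of that loop, with the actual device logins stubbed out (`get_facts` and `get_neighbors` are hypothetical stand-ins for whatever SSH/SNMP calls the real script would make):

```python
from collections import deque

def get_facts(device):
    """Stub: would log in and pull serial numbers, addresses, etc."""
    return {"name": device}

def get_neighbors(device, topology):
    """Stub: would read CDP/LLDP neighbor tables; here it reads a map."""
    return topology.get(device, [])

def discover(seed, topology):
    """Breadth-first walk from a seed device. Each newly found device
    is queued and interrogated exactly once; the result is the inventory."""
    inventory = []
    seen = {seed}
    queue = deque([seed])
    while queue:
        device = queue.popleft()
        inventory.append(get_facts(device))
        for neighbor in get_neighbors(device, topology):
            if neighbor not in seen:     # submit new devices to the same task
                seen.add(neighbor)
                queue.append(neighbor)
    return inventory
```

The `seen` set is what keeps the walk from looping forever on redundant links, which any real network has.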
Find a problem. Solve it; simply, directly, efficiently. That will give you experience AND provide a useful output.

Comment Uhm, most of you haven't been paying attention (Score 1) 381

Nissan and Google have had actual working examples on the roads for a while now. If you still believe that 'it can't be done', then you are just fooling yourself. If you think 'a few sensors' and a microprocessor can't outperform a human, may I remind you of all the deaths that occur every year due to drunk drivers. The adoption of autonomous vehicles is limited only by schedules and regulation at this point. It is proven.
As to job loss, cars will have little to no impact in that realm. Yes, I think taxi services will be impacted, but that business was already under attack by Uber/Lyft, so it was going to have to change anyway. The real impact to jobs will come when we get an autonomous truck. And I mean 18-wheeler, not F150. That is likely what is still 20 years away. The sheer physics of backing such a truck up to mate with a loading dock hasn't even been studied as far as I know. Nevertheless, consider the impact of overland deliveries that drive essentially nonstop from source to destination. No lay-overs, no limits on time on the road other than fuel capacity, no side trips, no dependence on a clock. Keep in mind also that autonomous vehicles inherently report where they are, what speed they are traveling, and the surrounding conditions; once these trucks are folded into the current logistics systems, think how simple it would be to COORDINATE such data. Then think about all the truck drivers that no longer have a profession. Then think about all the truck stops, hotels and other supporting infrastructure that is no longer needed. Heck, you could even do things like refueling in motion. We do that with planes already; I can't imagine that getting it to work with a truck would be more difficult.
Finally, as to adoption, let's be realistic. No one over the age of about 40 right now will buy one of these. It's called inertia. Those in that age group may have witnessed the technological revolution, but they didn't grow up with it. They don't trust it. They will always believe themselves to be the better driver, despite evidence to the contrary. We already have laws which mandate equipment in cars after repeated DUI convictions; I assume that at some point enough DUIs will require you to own an autonomous vehicle. As in, no driver's license for you. This will be how the autonomous vehicle takes its final hold of American culture - when we are forced to adopt it. Until then it will be our taxi service, our delivery service, and we'll have a few friends with 'those'.
As an aside, and completely fictitious situation, think of what you could do once the overwhelming majority of cars are autonomous. You can eliminate traffic signals and laws. You won't need them. Each vehicle could 'reserve' its passage time and directionality through the next intersection and all the vehicles would just meander around each other as needed - which is exactly what happens when you walk. Think of the fuel savings when nothing ever idles waiting for a light to turn. Think of the pollution savings. You could also increase speed limits, or at least make them sane. No longer would a road need to change its limit five times in the span of three miles. The vehicles would react in real time to the conditions, and (hopefully) communicate among themselves to disseminate such.
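The reservation idea can be sketched as a time-slot allocator per intersection: a vehicle asks for a crossing window, and a conflicting window is refused so the vehicle adjusts speed and asks again. A toy model, not any real V2X protocol (a real one would also track lanes and directionality):

```python
class Intersection:
    """Grants non-overlapping crossing windows on a first-come basis."""

    def __init__(self):
        self.reservations = []   # list of (start, end) time windows

    def request(self, start, end):
        """Reserve the half-open window [start, end) if it overlaps
        no existing reservation; otherwise refuse, and the vehicle
        retimes its approach and asks again."""
        for s, e in self.reservations:
            if start < e and s < end:    # the two intervals overlap
                return False
        self.reservations.append((start, end))
        return True
```

With every vehicle holding a guaranteed window, nothing ever idles at a light; refusals just become small speed adjustments upstream.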

Comment Systemic Failure (Score 2) 186

Anyone (else) remember how we used to write programs (for the mainframe)? The Chinese didn't invent anything; they simply followed the IBM red book. Although the advent of personal computers has certainly changed everything, that very shift eliminated the thing now being touted. Giving the power to process data (write code) to the end user will of necessity remove any impetus for code review.
There are other issues as well, engendered by the forces driving software development itself. First and foremost is the inclusion of inexperienced programmers: ones whose only experience is writing GUI routines, who are then promoted to creating systemic code. The two have completely different security needs. Similarly, the move to methodologies such as Agile, where code production is valued over code correctness, has led to a plethora of routines which have only positive testing and no review. Finally, the creation of tertiary languages, ones that have to be translated twice before they arrive at machine code, and the rampant use of tools which eliminate the need to actually write code in lieu of dragging and dropping functional blocks, make code review nearly impossible. You aren't reviewing the code itself but rather larger collections of routines. You'll never find the backdoor because it isn't in the code you are reviewing.
What I'd like to see, and it won't happen, is a return to the bad old days, when a program update took between six months and several years due to review and rewrite schedules. You can approach the same endpoint with well-constructed negative testing, but I have yet to encounter a software firm which performed exhaustive negative testing. Usually, if it is done at all, it is simply a session using random data. No stress testing. No deliberate failure induction. No code review.
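Deliberate failure induction looks less like a session of random data and more like targeted invalid inputs, each one asserting that the code fails cleanly. A minimal sketch against a hypothetical port-number parser:

```python
def parse_port(text):
    """Parse a TCP/UDP port number, rejecting anything out of range
    or non-numeric rather than passing garbage downstream."""
    if not text.strip().isdigit():
        raise ValueError(f"not a number: {text!r}")
    port = int(text)
    if not 1 <= port <= 65535:
        raise ValueError(f"out of range: {port}")
    return port

def test_negative_cases():
    """Negative tests: every malformed input must raise ValueError;
    silently accepting any of them is the failure we are inducing."""
    for bad in ["", "abc", "-1", "0", "65536", "80; rm -rf /"]:
        try:
            parse_port(bad)
        except ValueError:
            continue                  # failed cleanly, as required
        raise AssertionError(f"accepted invalid input: {bad!r}")
```

Each case is chosen deliberately: empty input, wrong type, boundary values just outside the valid range, and an injection attempt. Random data rarely hits all of those.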
Why do we want to move all of our things to being Internet-connected (IoT) when we can't even write a decent firewall?

Comment Re:BASIC programming skills (Score 1) 214

You completely missed the point. I could have just as easily said "past a compiler" but since I used VB as my example I went with interpreter. Next time you may try reading the entire post rather than jumping off a single clause.

My post had nothing to do with one language/framework/tool being better than another. The point that you missed is that the skills needed to be a programmer have nothing to do with proficiency in a language. They have everything to do with math. Let's teach more math and then perhaps if you happen to enjoy coding you can be a programmer.

Another aspect I didn't mention has to do with tools. So long as we rely on tools which essentially build an application by linking interfaces from a library, you don't need a programmer. You need one to build the library but any mildly intelligent human can then build the app like linking together legos. That has been the main focus of software development for some time, which of course removes incentives to hire qualified developers. Thus the more 'intelligent' your tools, the less you need intelligent developers.

Software development is well on the path that Networking took about a decade ago. Good luck finding a job that pays enough to support a family when every 10th grader is building apps.

Comment BASIC programming skills (Score 4, Interesting) 214

I taught myself how to program using the BASIC books located in the Radio Shack stores and typed them into the (new) TRS-80s they had out. (Yes, I am THAT old.) I then moved on to Assembly (Z80). At the moment I can code in 23 languages, and I think in C, so there can be a progression.

Although I completely agree that one needs an introductory language to bridge the gap between language arts and programming, the last time I checked Dice there were no openings for Wolfram programmers. I do however remember all the hype around the various instances of BASIC and I can attest to a large number of VB apps that were written (very poorly) by non-programmers. Coding past an interpreter syntax does not qualify you as a programmer.

I see this entire discussion, including the various calls for CS education in the public schools, as yet another instance of what killed my profession: the incorporation of unskilled labor. I am CCIE #12981, and there was a time when having that certification meant I could pull down a well-paying job nearly anywhere. Now it almost doesn't matter, because so few organizations need highly qualified networking resources. They have farmed out networking to a third party, or they have a few slightly skilled resources who keep the lights on. I see the same thing happening to software development, and we as a culture will continue to suffer under the risks of running poorly written applications, because corporations don't see the need to hire highly skilled developers. Shoving all students into the pot via mandatory CS education, or promoting BASIC-like languages such as Wolfram, will only make that worse.

If you want a programmer you don't start with a language. You start with math, and specifically with logic. The language used is a mere vehicle for the expression of concepts, and as such learning its syntax is secondary. Rather, teach principles, such as "Always check your inputs, and your return values," which hold true in any language.
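That principle translates the same way into any language. A small Python illustration; the functions are hypothetical examples, not from any particular codebase:

```python
def average(values):
    """Check your inputs: reject the empty case explicitly
    instead of letting a division by zero surface downstream."""
    if not values:
        raise ValueError("average() of an empty sequence is undefined")
    return sum(values) / len(values)

def lookup_price(catalog, item):
    """Check your return values: dict.get signals absence with None,
    so handle that case rather than assuming the lookup succeeded."""
    price = catalog.get(item)        # may be None
    if price is None:
        raise KeyError(f"unknown item: {item!r}")
    return price
```

The syntax changes from language to language; the habit of validating what comes in and what comes back does not.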
