Comment Re:Chess (Score 1) 274

The AI in master-level chess computers essentially tries as many move combinations as possible to find the best outcome. However, it still takes way too long to process all possible games, so there is a degree of trimming involved, meaning not all combinations are in play. That introduces a certain degree of randomness based on which future lines we choose to examine and which we don't. The trimming works roughly like the sketch below.
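
A minimal sketch of that pruning idea in Python, using alpha-beta pruning over a toy game tree. The node structure here is made up for illustration; a real chess engine would generate positions and evaluate them properly:

    # Alpha-beta pruning: skip branches that provably cannot change
    # the final choice. The tree below is a hypothetical stand-in.
    def alphabeta(node, depth, alpha, beta, maximizing):
        """Search `depth` plies ahead, pruning hopeless branches."""
        if depth == 0 or not node.get("children"):
            return node["value"]          # static evaluation of the position
        if maximizing:
            best = float("-inf")
            for child in node["children"]:
                best = max(best, alphabeta(child, depth - 1, alpha, beta, False))
                alpha = max(alpha, best)
                if alpha >= beta:         # opponent already has a better option:
                    break                 # prune the rest of this branch
            return best
        else:
            best = float("inf")
            for child in node["children"]:
                best = min(best, alphabeta(child, depth - 1, alpha, beta, True))
                beta = min(beta, best)
                if alpha >= beta:
                    break
            return best

    # Toy tree: leaf values are evaluations from the maximizing player's view.
    tree = {"children": [
        {"value": 3}, {"value": 5},
        {"value": 0, "children": [{"value": -2}, {"value": 9}]},
    ]}
    print(alphabeta(tree, 2, float("-inf"), float("inf"), True))  # -> 5

Note how the third subtree gets cut off after its first leaf: once the minimizer can force -2 there, it can never beat the 5 already found.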

I personally stink at chess and can barely see two moves ahead.

Comment Re:Really there's no excuse (Score 1) 98

There is the State University of New York at Albany example.

The campus was designed by an architect for a desert location, rumor has it Saudi Arabia or Phoenix, Arizona. It was designed to channel the winds to keep the campus cool on those hot desert days.

However, SUNY Albany, to save taxpayer money, went out and bought those designs and put them in upstate NY, where the bulk of the school year falls during the cold winter months, giving the campus a bitter wind chill all winter.

The architect did a fine job; it was the stupid politicians who cheaped out and put a good design in a bad location.

Comment Re:Why would this surprise? (Score 1) 254

It comes down to economics 101. Supply and demand.

The reason why there is inequality is that not all people are equal. Some people have skills and attributes that are needed or wanted more than others'.

So the sports hero, who is physically superior to most people, is rare compared to the average person, so he will be more desired and placed in a higher status than people who are not.

The CEO or politician is willing and able to deal with a lot of crap and take risks that most of us do not want to take. So they get paid more too.

We engineers and IT guys tend to get paid a little better than the average guy because we have skills that are in more demand.

So that is the supply side.

But if you are the smartest person in the world in a field that no one really cares about or that isn't in much demand, say an expert on some obscure author of the 1800s, or a top performer on an instrument that no one uses, then the demand side of the equation kicks in. That person, while smart, isn't going to get much pay or status.

Likewise, in the 1990s during the tech bubble, there was a glut of "web developers", AKA some guy who knows how to use FrontPage, who were getting paid a lot of money because there was such demand for web pages.

No matter what sort of economic system you have, supply and demand kicks in and forces inequality.

Comment Re:Cheaper drives (Score 1) 183

Well, there will be a point where SSDs are cheap enough for people to decide to pay a little extra to get them.

As magnetic drives get cheaper per unit of storage, they are sold at around the same price but with more storage. It isn't uncommon for someone to get a PC built with a few terabytes on a magnetic drive, or, for the same price, an SSD measured in hundreds of gigabytes.

At a particular point, the faster SSDs will be affordable enough to offer the space people need at a cost they want to spend.

Comment Re:A rather simplistic hardware-centric view (Score 0) 145

Well, beyond hardware, software reliability over the past few decades has shot right up.

Even Windows is very stable and secure. Over the past decade I have actually seen more kernel panics from Linux than BSODs. We can keep servers running for months or years without a reboot. Our desktops, laptops, and even mobile devices now perform without crashing all the time, and we work without feeling the need to save to the hard drive and then back up to a floppy or removable media every time.

What changes have happened since then on the software level?
1. Server-grade OSes for desktop and mobile devices. Windows from XP onward uses the NT kernel, Macs and iOS use a Unix derivative, and Android and GNU/Linux are Linux-based systems. All of these OSes were designed for server use, with proper memory management and multitasking as well as SMP support, making a lot of those silly crashes of yesterday a thing of the past.

2. Understanding and prevention of buffer overflows. We knew about buffer overflows for a long time, but they were recognized as a security issue in the late 1990s. So newer languages, and updates to existing compilers, are designed to try to prevent them. Plus the OS now randomizes memory layout (ASLR) to help reduce the risk. (See the bounds-checking sketch after this list.)

3. The "Try" "Catch/Except" commands. It is nearly impossible to try to break a complex program during testing. The Try/Catch idea in modern languages while many old schoolers claim leads to sloppy code, it does attempt to deal with the fact that the world isn't perfect and people make mistakes, and allows a clean way exit your program or procedure on error conditions. Meaning your program will still run when things are not perfect, as well once it quits your data is in a clean state to prevent further data corruption.

4. The rise of server-based programming. We go through cycles of who should do the work, the server or the end-user device. How often do you actually need to download a program anymore? I bet most of you don't remember software stores back in the 1990s, where you had to buy a program to do everything. They were cheap $10 programs, or you could get a collection of shareware, but in general everything you needed to do on your computer needed a program. If you wanted an electronic encyclopedia, you needed to get one and store it on your computer; if you wanted a program with some forms and calculations, you needed a program. This created a situation where you had a lot of programs on your computer, with conflicting DLL/shared-library versions. Today a lot of these small programs are done over the Web, on the server, with JavaScript to make the UI clean. That means there is less stuff running on your desktop that could conflict.

5. The rise of virtualized systems. If you had a server, you needed to put all your stuff on it, on the same OS, with a complex set of settings, which made it more prone to mistakes. Virtualization lets you have a bunch of custom environments, each designed to do one job and do it well, versus one server that does many things.

6. The rise of interpreted and virtual-machine languages. Most work today doesn't require compiling straight to machine code; it compiles to virtual-machine code (Java/.NET) or is interpreted at run time (Python/Ruby/PHP). That gives the developer separation from the hardware. Yes, it slows things down, but it also prevents a lot of the oddities that happen when you build a program on a 32-bit vs. a 64-bit OS, or even the minor differences between a Core i5 and a Core i7.

7. The fall of easy-to-use languages. Back in the old days there were a lot of languages, like FoxPro and Visual Basic (not the .NET one), designed to be used by non-programmers. These types of languages are not used as much anymore, which means programming is being done by people who know how to program, not by people who just know how to use a computer. Programs written in harder languages (hard meaning there is some design and thought behind them) tend to be designed better and not just cobbled together.

8. Inexpensive databases. Back in the old days, a relational database cost thousands of dollars or more, too much for most people's uses. Now, with MySQL, Microsoft SQL Server, PostgreSQL, and the others, adding a relational database to your program isn't going to break the bank, and you have a tool that allows better collection, organization, and retrieval of data. (See the database sketch after this list.)

9. Google search and widely available broadband internet. Back in the old days of coding, you had to use your reference book or the help built into the software. But most programmers write programs that do something different from what the compiler makers had in mind, so you needed to figure out a workaround. Today, if you hit something that seems overly complex, you do a Google search and can usually find a better way to do it. The old way, you just did it the bad way; it worked, so you kept it.

10. Fast processors, lots of RAM. In the old days everything needed to be optimized. That added more time to your programming and opened your code up to a lot of bugs, because you needed to strip out as many safety checks as you could get away with. You also needed to keep an eye on the RAM you were using and write methods to store data and retrieve it over and over again. Today you wouldn't think twice about holding a few megs of data in RAM, processing it, and then just saving the output to disk. Easier code, fewer errors. (See the in-memory sketch after this list.)
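
For point 2, a minimal Python sketch of what language-level bounds checking buys you: an out-of-range write becomes an immediate, catchable error instead of silently scribbling past the buffer. The buffer size and payload here are made up for illustration:

    buf = bytearray(8)                      # fixed 8-byte buffer
    payload = b"AAAAAAAAAAAAAAAA"           # 16 bytes: too big for buf
    try:
        for i, byte in enumerate(payload):
            buf[i] = byte                   # raises IndexError at i == 8
    except IndexError:
        print("overflow stopped at the language level")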
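
For point 3, a minimal try/except sketch. The file name is hypothetical; the point is the clean exit path on an error condition:

    import sys

    def load_config(path):
        try:
            with open(path) as f:           # may be missing or unreadable
                return f.read()
        except OSError as err:
            # exit cleanly on a known path instead of an uncontrolled crash
            sys.exit(f"could not read {path}: {err}")

    print(load_config("settings.conf"))     # "settings.conf" is made up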
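
For point 8, a database sketch using sqlite3, which ships free with Python's standard library; MySQL or PostgreSQL would look similar through their own drivers. The table and rows are made up:

    import sqlite3

    con = sqlite3.connect(":memory:")       # throwaway in-memory database
    con.execute("CREATE TABLE users (name TEXT, score INTEGER)")
    con.executemany("INSERT INTO users VALUES (?, ?)",
                    [("alice", 10), ("bob", 7)])
    for row in con.execute("SELECT name FROM users WHERE score > ?", (8,)):
        print(row)                          # -> ('alice',)
    con.close()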
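
And for point 10, the modern no-hand-optimization approach: read the whole file into RAM, process it, write the result, with none of the old spill-to-disk bookkeeping. The file names are hypothetical:

    with open("input.txt") as f:
        lines = f.read().splitlines()       # a few megs in RAM is nothing now

    results = sorted(set(line.lower() for line in lines))

    with open("output.txt", "w") as f:
        f.write("\n".join(results))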

Comment Re:Never let the truth (Score 1) 391

We like the idea of the IQ score as a measurement. It is a number that says I am better than someone else.
However, people are complex, and IQ is only part of the overall person. Very successful people can have average or even below-average IQs; they compensate with physical abilities, a strong, influential personality, or just knowing whom to ask for answers and good guidance for better decisions.
A person with a high IQ who knows about it may use it as a crutch to feel superior to others, while actually inconveniencing himself by disregarding advice from people with experience and skills he has not acquired.

Comment Microsoft is tied to the desktop. (Score 1) 337

Microsoft's dominance was on the desktop. In other areas, mobile, video games, servers, etc., Microsoft was just a major player, perhaps #1 in the market, but not by the huge margins it had on the desktop.

With the traditional desktop/laptop market moving to personal computers in smaller, tablet-like form factors, combined with the rise of development in platform-agnostic technologies, HTML5, JavaScript, Java, and server-side processing, a fairer playing field is being created for consumers to choose a platform.

Comment Re:Really? (Score 2) 118

Can you trust anyone with a zero-day exploit?

If you just tell the company and not anyone else, chances are they will thank you, or arrest you, and then not put the time or money into fixing the problem.

If you tell the public, or any other group, there will be some bad apples who will use the information for their own misdeeds.

If you tell the government, they will use it to their advantage as well.

Comment Abstract laws that already exist. (Score 2) 134

Let's say someone had little security, akin to not locking the door, and someone gets into the system and steals data. That is the same as if someone just walked in, made photocopies of all the data, and left the building.

If they needed to break in, leaving the computers in a more compromised state, then it is breaking and entering.

Comment Re:Read the source code (Score 2) 430

Reading the source isn't documentation.
You can see what it is doing, but you don't know why it is doing it, or what it is trying to accomplish.

Much like the idea that if it is open source it is also open specification, which isn't true.
The source is the instructions for the computer to follow; the documentation and specifications are for the people, to explain what the product is supposed to do.

Often software will have a glitch that doesn't get fixed because there is no documentation or specification to compare it against. So the bug either gets worked around or just ignored while the program stays faulty.

Comment Software Documentation is bad everywhere (Score 3, Insightful) 430

The problem is that most software is complex, and documentation is an attempt to simplify the workflow. But documentation, if complete, would probably be just as large as the code, if not larger, and just as complex to read.

What I find makes good documentation is in the FAQ, or a quick note where you know a particular area of the UI is kind of clunky, or just a list of the features you can use and what they do; not a formal write-up in a 1000-page book. The appendix and the list of tables are usually enough.
