
Comment Not from what I've seen (Score 3, Insightful) 248

It's not the fact that WiFi routers degrade that I'm questioning; you are totally right about that. It's the idea that people will replace them. I'm amazed at how shitty someone's Internet can be while they keep an "Oh well, whatever" attitude about it.

A good example near and dear to me is my parents. They moved into their current place about 7 years ago and got a cheapass Linksys router to handle their NAT and WiFi. It has been giving them enough grief for me to hear about it for at least 3 years. They are not poor, and a new router is not a big deal, yet they didn't get one. So I got tired of it, and also had an easy solution: when they were visiting me this June, I upgraded my WAP to a new 802.11ac one and gave them my old one, which was working great.

They still haven't installed it. It's not like they don't have time: mom is retired and dad is semi-retired. It's not like it is hard, either; it is much simpler to set up than their old model, and they can always call me. They just haven't bothered. Their router acts up, they go reset it, and they don't bother to replace it.

Another somewhat related example is a friend of mine. He's a young guy, under 30, and quite technically savvy. He complained to me that the Internet at his house was not meeting advertised speeds, coming in well below them. Strange, since we are both on the same ISP and live only a couple miles from each other, and my experience is that speeds are always right around the max. I inquired a bit more and found out he still has a DOCSIS 2.0 modem. Ahh, OK, that is probably the issue. Though his connection is of a speed a single DOCSIS channel can handle (25 Mbps), that modem has only one channel to work with, and it could well be loaded down by other people on the segment. So my recommendation was to get a DOCSIS 3.0 modem. A compatible 8x4 modem can be had for like $80. That should solve any speed issues, since there would be a bunch of channels to choose from, and it will remain compatible when they bump the speeds in the future.
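To put rough numbers on the channel math (a sketch: the ~38 Mbps figure is the approximate usable rate of one 6 MHz, 256-QAM downstream channel, and real throughput varies with segment load):

```python
# Approximate usable downstream rate of a single DOCSIS channel
# (6 MHz wide, 256-QAM modulation).
PER_CHANNEL_MBPS = 38

def max_downstream_mbps(bonded_channels: int) -> int:
    """Aggregate capacity when a modem bonds several downstream channels."""
    return bonded_channels * PER_CHANNEL_MBPS

# A DOCSIS 2.0 modem is stuck on one downstream channel,
# shared with everyone else on the segment.
print(max_downstream_mbps(1))   # 38

# An 8x4 DOCSIS 3.0 modem bonds 8 downstream channels.
print(max_downstream_mbps(8))   # 304
```

With eight channels bonded, one busy channel no longer caps the whole connection, which is the point of the upgrade.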

He didn't want to spend the money, and so just complains occasionally about the speed.

For whatever reason, there are more than a few people who will just use old, failing technology and bitch about it rather than fix the issue.

Comment And how does IPv6 solve this issue? (Score 1) 248

This is a real question: do you know what IPv6 does instead of BGP? Because as far as I know, IPv6 still uses BGP, and BGP is what this problem is with. In fact, I can only see IPv6 making things worse in that regard, because tons more address space means more AS assignments would be easy to do.

So if it really does offer a solution, please enlighten me; I'd be very interested. If this is just an example of trying to use a problem to push a favoured agenda, then please knock it off.

Comment Re:A rather simplistic hardware-centric view (Score 0) 145

Going well beyond hardware, software reliability has shot right up over the past few decades.

Even Windows is very stable and secure. Over the past decade, I have actually seen more kernel panics from Linux than BSODs from Windows. We can keep servers running for months or years without a reboot. Our desktops, laptops, and even mobile devices now perform without crashing all the time, and we work without feeling the need to save to the hard drive and then back up to a floppy or other removable media every time.

What changes have happened since then at the software level?
1. Server-grade OSes on desktop and mobile devices. Windows from XP onward uses the NT kernel, Macs and iOS use a Unix derivative, and Android and GNU/Linux are Linux-based systems. All of these OSes were designed for server use, with proper memory management and multitasking as well as SMP support, making a lot of those silly crashes of yesterday a thing of the past.

2. Understanding and prevention of buffer overflows. We have known about buffer overflows for a long time, but they only became recognized as a serious security issue in the late 1990s. Newer languages and updates to existing compilers are designed to try to prevent them, and the OS now randomizes memory layout (ASLR) to help reduce the risk.

3. The "Try" "Catch/Except" commands. It is nearly impossible to try to break a complex program during testing. The Try/Catch idea in modern languages while many old schoolers claim leads to sloppy code, it does attempt to deal with the fact that the world isn't perfect and people make mistakes, and allows a clean way exit your program or procedure on error conditions. Meaning your program will still run when things are not perfect, as well once it quits your data is in a clean state to prevent further data corruption.

4. The rise of server-based programming. We go through cycles of who should do the work: the server or the end-user device. How often do you actually need to download a program any more? I bet most of you don't remember software stores back in the 1990s, where you had to buy a program for everything. They were cheap $10 programs, or you could get a collection of shareware, but in general everything you wanted to do on your computer needed a program. If you wanted an electronic encyclopedia, you needed to buy one and store it on your computer. If you wanted a program that had some forms and did some calculations, you needed a program. This created a situation where you had a lot of programs on your computer, with conflicting DLL/shared-library versions. Today a lot of these small programs run over the Web, on the server, with JavaScript to make the UI clean. That means there is less stuff running on your desktop that could conflict.

5. The rise of virtualized systems. If you had a server, you used to have to put all your stuff on it, on the same OS, with a complex set of settings that made it more prone to mistakes. Virtualization lets you run a bunch of custom systems, each designed to do one job and do it well, versus one server doing many things.

6. The rise of interpreted and virtual-machine languages. Most code written today isn't compiled straight to machine code; it is compiled to bytecode for a virtual machine (Java/.NET) or interpreted at run time (Python/Ruby/PHP). That gives the developer separation from the hardware. Yes, it slows things down, but it also prevents a lot of the oddities that happen when you build a program on a 32-bit vs. a 64-bit OS, or even the minor differences between a Core i5 and a Core i7.

7. The fall of easy-to-use languages. Back in the old days, there were a lot of languages like FoxPro and Visual Basic (not the .NET one) designed to be used by non-programmers. These languages are not used as much any more, which means the programming is being done by people who know how to program, not just people who know how to use a computer. Programs written in these harder languages (hard meaning there is some design and thought behind them) are designed better and less cobbled together.

8. Inexpensive databases. Back in the old days, a relational database cost thousands of dollars or more, too much for most uses. Now, with MySQL, Microsoft SQL Server, PostgreSQL, and the others, adding a relational database to your program isn't going to break the bank, and you have a tool that allows better collection, organization, and retrieval of data.

9. Google Search and widely available broadband. Back in the old days of coding, you had to use your reference book or the help built into the software. But most programmers write programs that do something different from what the compiler makers had in mind, so you had to figure out a workaround. Today, if you hit something that seems overly complex, you do a Google search and can usually find a better way to do it. The old way, you just did it the bad way; it worked, so you kept it.

10. Fast processors, lots of RAM. In the old days, everything needed to be optimized. That added time to your programming and opened your code to a lot of bugs, because you stripped out as many checks as you could get away with. You also had to keep an eye on the RAM you were using and write methods to store and retrieve data over and over again. Today you wouldn't think twice about holding a few megs of data in RAM, processing it, then just saving the output to disk. Easier code, fewer errors.
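On point 2 above, a minimal sketch of what language-level prevention looks like: in a memory-safe language, an out-of-bounds write raises an error instead of silently scribbling over adjacent memory, which is the classic overflow.

```python
buf = bytearray(8)  # an 8-byte buffer

try:
    buf[16] = 0xFF              # attempt to write past the end
except IndexError as err:
    # The runtime bounds check stops the write before any
    # neighbouring memory can be corrupted.
    print("blocked:", err)
```

In C, the same write would simply clobber whatever happened to live 16 bytes in; here the runtime refuses it outright.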
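And on point 3, a small sketch of the clean-exit idea in Python's try/except form: the error path returns a sentinel instead of crashing, leaving data in a known state.

```python
def safe_divide(a: float, b: float):
    """Return a / b, or None when the inputs turn out to be imperfect."""
    try:
        return a / b
    except ZeroDivisionError:
        # Handle the error condition cleanly instead of letting the
        # whole program die; callers can check for None and carry on.
        return None

print(safe_divide(10, 4))   # 2.5
print(safe_divide(10, 0))   # None: the program keeps running
```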
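On point 8, the zero-cost end of that spectrum today is SQLite, which ships in Python's standard library. A sketch of a relational store that once would have required an expensive product (the table and data are just illustrative):

```python
import sqlite3

# An in-memory relational database: no license, no server, no cost.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE parts (name TEXT, qty INTEGER)")
con.executemany("INSERT INTO parts VALUES (?, ?)",
                [("router", 2), ("modem", 1), ("switch", 4)])

# Organized retrieval via a declarative query instead of hand-rolled code.
total = con.execute("SELECT SUM(qty) FROM parts").fetchone()[0]
print(total)  # 7
con.close()
```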

Comment That's a problem we have (Score 5, Insightful) 561

I do IT work at a state university. As you'd expect with government institutions, we are really big on the EEOC rules and such. However, we can't force people to apply, and for IT stuff you get mostly men. Last round, it was all men. I don't mean we chose to interview all men; I mean no women applied, or if they did apply, HR filtered them out (HR does a basic "resume vs. qualifications" check). Our IT group (we are only one of many IT groups on campus; there are women in other groups) is all male at present. We had a female webmaster, but her fiance got a job in New York, so they moved there and of course she quit.

What, precisely, are we supposed to do to be more diverse? There are just not many women who seem to have the skills and wish to apply. We can't go and force people to apply, nor can we (legally or practically) say we'll waive the requirements for the job if you are a woman.

You can't hire those that don't apply.

So in terms of all this fluff-up over Silicon Valley and diversity, I'd ask: how do their workforce numbers compare to their applicant numbers? If in general they are the same, meaning say 30% of applicants are female and 30% of employees are female, 9% of applicants are black and 8% of employees are black, then there probably isn't any discrimination going on. The fact that the numbers do not reflect demographics doesn't imply any discrimination on their part if they are simply not getting the applicants.

Also with regards to race, I'm not seeing why the 55% white number is problematic. According to Wikipedia, 72% of the US is white. If you count being hispanic as not being white (remember hispanic is an ethnicity, not a race) then the number is 64%. So per overall breakdown of the population, white people would be underrepresented in Apple by a fair bit.

That is also something I think people forget: The US does not have an even balance of all groups. Male/female has about a 50/50 split, but racial/ethnic groups are not nearly so even. It is still a nation dominated by fair skinned people of European ancestry, aka "white". The amount varies by state, of course, but it is quite a consistent majority.

Comment Re:Never let the truth (Score 1) 391

We like the idea of the IQ score as a measurement. It is a number that says I am better than someone else.
However, people are complex, and IQ is only part of the overall person. Very successful people can have average or even below-average IQs. They compensate with physical abilities, a strong influential personality, or just knowing whom to ask for answers and good guidance for better decisions.
A person with a high IQ who knows about it may use it as a crutch to feel superior to others, while actually inconveniencing themselves by disregarding advice from people with experience and skills that they have not acquired.

Comment Re:Someone who reads random gun stuff on the net (Score 1) 219

You may have qualified on shooting a rifle, but you apparently didn't qualify on reading, since you are criticizing me by repeating things I said, like the fact that the M4/16 is very accurate at long ranges and that larger rounds are in use for longer ranges. So perhaps spend more time reading and comprehending, and less time pulling out your (alleged) credentials and repeating what was already said as though it is something new.

As for fragmentation: first off, you act as though it is a bad thing when talking about a target. Quite the opposite. A round that fragments, expands, or tumbles in a person does much more damage and thus has a higher probability of stopping the target in a single shot.

In terms of fragmentation on other barriers: try it. Shoot through a window, a couple sheets of drywall, etc., and put a paper target a bit behind it so you can see what happens. At short range (less than 100m), the round will usually fragment on account of its high velocity. It depends on the round's composition, of course; 62gr M855 will fragment less than a 75gr BTHP round. They don't explode into tiny specks, if that is what you are thinking, but they do break apart.

Comment Microsoft is tied to the desktop. (Score 1) 337

Microsoft's dominance was on the desktop. In other areas (mobile, video games, servers, etc.) Microsoft was just a major player, perhaps #1 in the market, but not by the huge margins it had on the desktop.

With the traditional desktop/laptop market moving to personal computers in a smaller, tablet-like form factor, combined with the rise of development in platform-agnostic technologies (HTML5, JavaScript, Java) and server-side processing, a fairer playing field is emerging for consumers to choose a platform.

Comment Someone who reads random gun stuff on the net (Score 5, Informative) 219

It is amazing how much misinformation flies around about guns. One of the common ones is "OMG the M4/16 is such crap, the AK is so much bettar!"

You are quite correct about the range. The AR-15 platform weapons are much more accurate. Anyone who has ever fired both can easily tell that.

The issue that people like the grandparent conflate is the lethality of the 5.56x45mm round at longer ranges. Though the M16 can easily hit a target at long range (with a skilled marksman operating it), because of the small size and low mass of the round, it is often not as effective as you would want. If the bullet does not fragment or tumble, it can go right through someone and the small hole does little damage.

That is the issue it has at range, not accuracy or ability to reach that range.

Also, this isn't some completely unknown, or unsolvable, thing. The military also has weapons that use 7.62x51mm rounds, which are larger rifle bullets and have much greater range, mass, and kinetic energy. For longer engagements still, things like 8.58x70mm and 12.7x99mm are used.

Of course, as you move up in caliber and amount of propellant, weapons become bigger and heavier and have more recoil to deal with. It is always a tradeoff, and that is one reason why the standard personal weapons use 5.56.

In terms of 5.56x45mm vs. 7.62x39mm (which is what the AK uses; it is not the same as the larger NATO round), the real issues come up at medium range (100-300m) and with barrier penetration. The light, high-velocity 5.56 round tends to be fantastically lethal below 100m because the high velocity results in fragmentation when it hits the target. However, since military rounds may not be specifically designed to fragment or expand (the Hague Convention prohibits it; civilian and police rounds that do are available), as they slow down at greater ranges they lose that ability and are not as damaging. Also, because of their low mass and tendency to fragment, they are poor performers when shooting through barriers like windshields, doors, and so on.

THAT is the issue these rounds have in general use vs. 7.62x39mm rounds, not long ranges. While they aren't super effective beyond 300m, they are at least reasonably accurate, which is not the case with the 7.62 rounds. In a long-range engagement, an M4 would be at a decided advantage over an AK-47.

However neither was designed for long range use. They are carbines, made for medium range and below. They trade overall power and range for smaller size, lower weight, and better portability. As their widespread use in many conflicts around the world indicates, they do well in that arena.

Comment Re:Homeschooling is... (Score 1) 421

Hopefully.
Something they learn.
Is how to make proper paragraphs.

In all seriousness though, you need to get down off your high horse before you fall and break your neck. I've heard this BS of "Oh, our homeschooled kids are SO much better than public school kids!" before. However, I work at a university, and our admissions don't seem to bear that out. Homeschooled kids often end up stuck in remedial classes, particularly English, because their skills are not up to the level required. To me that is particularly shocking, since I consider our entrance requirements to be pretty damn lax.

The problem I think is in part attitudes like yours: You seem to be very caught up in how smart your kids are, and how great you are for teaching them yourself. You are not looking at the situation through a lens of objectivity and thus are likely missing deficiencies in what you teach and what they learn. These will be laid bare if they choose to go to university, because they don't give a shit how special you think your snowflakes are, they will be required to meet certain standards like everyone else.

None of this even touches on the social learning aspects of public school. Just remember: some day your kids will have to go out into the wider world and will no longer be accountable to you. If you've shielded them and controlled their lives, well, they may go way more wild than you ever thought possible.

Comment Re:A Different Approach (Score 2) 421

I don't agree with cutting taxes for schools, but I do agree school administrators need to be held to account. I remember when we passed an increase for schools and the money was specifically provisioned for various things: teacher salary increases, labs for students, and so on. So what happened? The administrators gave themselves nice raises and had to get sued over it.

The answer in my opinion is not to reduce school funding, but to increase administrator accountability.

Comment And sometimes you need to (Score 1) 35

I have a BP machine at home. Why? Because I have what my doctor calls "white coat hypertension." What that means is I get nervous when I go into the doctor's office and my BP goes up. Measured at home, my BP is on the high side of normal, but fine. At the doctor's office it is at the high side of prehypertension or the low side of hypertension. It's not a difference in the machines; they had me bring mine in to check the calibration.

OK, well, that means they can't keep an accurate record from their own measurements. So they need me to measure it myself, which I do, and then let them see the results. These days such a thing is very feasible, since electronics technology means we can produce quite accurate automated systems that don't cost that much.

For that matter, a large part of your physical can be, and is, automated: the blood test. You need a skilled person to draw the blood, but after that, a computerized system usually does all the analysis, and it can be done by a lab separate from your doctor.

You still need to see them in person for plenty of things, but there is plenty of stuff that can be reported to them remotely and they can just look at the results. I don't see this as a bad thing, personally.
