
Comment Cargo Cult Metrics without science (Score 1) 232

The Road to Performance Is Littered with Dirty Code Bombs

Unexpected encounters with dirty code will make it very difficult to make a sane prediction.

Dirty code is defined as 'overly complex or highly coupled.' As a programmer you are expected to deliver X number of features by Y date. Unless one of those features is 'simple and loosely coupled code,' what does that have to do with predicting anything? For performance you don't predict. Experiments are the only thing that works: test and change and re-test and un-change and re-test, endlessly. Anything else is voodoo programming, no insult intended to the practitioners of Santeria, Vodou or Hoodoo.
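
To make that concrete, here is a minimal sketch of the measure-change-remeasure loop using Python's timeit; the two string-building functions are made-up stand-ins, not anything from the article:

    # Minimal measure-change-remeasure loop: benchmark two candidate
    # implementations instead of predicting which is faster.
    import timeit

    def concat_loop(n):
        s = ""
        for i in range(n):
            s += str(i)
        return s

    def concat_join(n):
        return "".join(str(i) for i in range(n))

    for fn in (concat_loop, concat_join):
        t = timeit.timeit(lambda: fn(10_000), number=100)
        print("%s: %.3f s" % (fn.__name__, t))

Whichever wins on your workload is the answer. Arguing about it from first principles is the voodoo part.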

How about predicting the schedule? I recall Steve McConnell once joked that to get better at estimating, we need to get better at estimating. (This may have been someone else.) Greg Wilson showed we can do this in programming, and in Computer Science in general. We only have to do scientific experimentation with various methods and throw away what doesn't work (instead of writing pulpy business books to bilk people out of money). But you'll still have to run a lot of tests to do that, too.

It is not uncommon to see "quick" refactorings eventually taking several months to complete. In these instances, the damage to the credibility and political capital of the responsible team will range from severe to terminal. If only we had a tool to help us identify and measure this risk.

It is my opinion that any refactoring that cannot be done by an automatic program isn't refactoring. The original definition of refactoring is just 'factoring', re-organizing the code. It is not a re-write, as in a 'several months' effort.

Misuse of a sexy, trendy name from the 90s does not change this. All re-writing suffers the risk of second-system syndrome, and not in the throw-one-away sense of prototyping. Do you have a button to press in your IDE to make the change? Do you have in mind a short sed statement, a simple awk program, Emacs macros or an on-hand shell scriptlet to do the transformation? If not, then you cannot get away from re-thinking the problem. That will require re-design of the solution and re-implementation of the feature. Each of these carries time risk at least as high as the original work.
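
To make 'automatable' concrete, a minimal sketch of a mechanical, project-wide rename; the symbol names and the src/ directory are hypothetical placeholders:

    # A mechanical, automatable refactoring: rename one symbol across
    # a project. OLD, NEW and the src/ tree are hypothetical examples.
    import pathlib
    import re

    OLD, NEW = "fetchRecord", "fetch_record"  # placeholder symbols
    pattern = re.compile(r"\b%s\b" % re.escape(OLD))

    for path in pathlib.Path("src").rglob("*.py"):
        text = path.read_text()
        if pattern.search(text):
            path.write_text(pattern.sub(NEW, text))
            print("rewrote", path)

If the change you have in mind cannot be reduced to something of roughly this shape, it is a re-design, not a refactoring.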

What if the problem itself is overly complex or highly coupled? The code may merely be an expression of that. In this case only a paradigm or perspective change by the customer, developer or user can untangle the problem. The computer cannot help you do anything but automate making a mess if the problem is a mess. Changing perspective is often an unbounded-in-time problem for human beings. Good luck estimating completion dates for that.

In fact, we have many ways of measuring and controlling the degree and depth of coupling and complexity of our code. Software metrics can be used to count the occurrences of specific features in our code. The values of these counts do correlate with code quality.

In fact, Greg Wilson showed in his presentation that almost every metric on the market, when analyzed, had no better (and usually equal) predictive power compared to a simple count of Lines of Code.

The situation in programming is almost as if more code means more bugs and less code means fewer bugs.

This seems obvious and trivial, but it is quite real and has serious implications. One of them is the increasing spread of syntactic sugar in programming languages. Another is the proliferation of VM models that take over more features, like threading and memory management, over time. This enables less skilled programmers to do things that once required lots of skill, training and thought to implement. It also imposes certain performance expectations on applications, e.g. the arguably fictitious idea that the Java Virtual Machine is bloated and slow, so all Java applications must be bloated and slow.

One downside to software metrics is that the huge array of numbers that metrics tools produce can be intimidating to the uninitiated. That said, software metrics can be a powerful tool in our fight for clean code.

But if they are no better than simple counts of Lines of Code, why should the uninitiated bother? If you know that the more you write, the more bugs you will have, why not seek to write less instead?
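
And the Lines of Code baseline the expensive tools fail to beat fits in a dozen lines. A minimal sketch, assuming a hypothetical src/ tree of Python files:

    # Minimal Lines-of-Code baseline: count non-blank, non-comment
    # lines per file. The src/ directory is a placeholder example.
    import pathlib

    def loc(path):
        count = 0
        for line in path.read_text(errors="ignore").splitlines():
            stripped = line.strip()
            if stripped and not stripped.startswith("#"):
                count += 1
        return count

    for path in sorted(pathlib.Path("src").rglob("*.py")):
        print(loc(path), path)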

They can help us to identify and eliminate dirty code bombs before they are a serious risk to a performance tuning exercise.

The fastest code is code that never runs. The only code without (implementation) bugs is code that doesn't exist. Why is quicksort so quick? Because it does less work than other sorting algorithms.
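
There is also very little of it to get wrong. A minimal (non-in-place) sketch:

    # Minimal quicksort: the whole algorithm is "partition and
    # recurse"; there is very little code for bugs to hide in.
    def quicksort(items):
        if len(items) <= 1:
            return items
        pivot = items[len(items) // 2]
        less = [x for x in items if x < pivot]
        equal = [x for x in items if x == pivot]
        greater = [x for x in items if x > pivot]
        return quicksort(less) + equal + quicksort(greater)

    print(quicksort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]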

This is also, I think, why a lot of great programmers are known for writing either some major tool or a programming language. Rephrase the original problem in a well-matched language or a tool's command interface and you only have to write a small amount of code. Writing parsers is so well-known a task that many tools exist to turn a description of a language into a compiler automatically. The real trick is realizing you need to do this the first time around, not the second.
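
A tiny illustration of describing the language instead of hand-coding the loop: a regex-driven tokenizer for a toy expression grammar. The grammar is a made-up example, not any real tool's input format:

    # Describe the token grammar as data and let the regex engine do
    # the scanning, instead of hand-writing a character loop.
    import re

    TOKENS = [
        ("NUMBER", r"\d+"),
        ("IDENT",  r"[A-Za-z_]\w*"),
        ("OP",     r"[+\-*/=()]"),
        ("SKIP",   r"\s+"),
    ]
    scanner = re.compile("|".join("(?P<%s>%s)" % pair for pair in TOKENS))

    def tokenize(text):
        for match in scanner.finditer(text):
            if match.lastgroup != "SKIP":
                yield match.lastgroup, match.group()

    print(list(tokenize("x = 2 * (y + 40)")))

Change the TOKENS table and you have a different scanner; no code changes required.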

This is well covered by other advice in the Contributions, like Unix Tools, Simplicity and Automate.

Comment Re:It's not the highway infrastructure (Score 1) 469

It is funny to note that one of the original - and never met - goals of the President Eisenhower Federal Highway System was to replace badly planned city roads and reduce congestion. The irony that the system increases congestion by creating choke points to get on and off it is lost on many.

The real root of the problem is that people are either unwilling or unable to live within a short distance to their workplace. Many large cities were not designed to handle the volume of commuters that we have had for at least 20 years. People live in the suburbs (for a variety of reasons; some due to economics, others due to a desire to live in areas with lower population density), and commute to the city centers to work.

The highway system in the United States is rather unusual. Most countries would design such a system to maximize utility: lots of high-density living near high-density employment, plus walking, cycling and mass transit, with problems like traffic jams minimized by roundabouts and parallel routes. Instead, the United States highway system was built for the military. It was created by the Federal-Aid Highway Act of 1956, popularly known as the National Interstate and Defense Highways Act (Public Law 84-627).

Originally this system was to get raw material from one side of the country to the other to manufacture planes, guns, ammo and ships. It was also envisioned as a great lever on the economy. But the immediate social cost of this high-speed bypass was the destruction of the little towns that grew up on existing roads like Highway 66, a road that already crossed the entire country.

But the system was funded at a time when nuclear war was the next big thing just around the corner. One intention, or at least one clear effect, was spreading living out into the new suburbs and exurbs to reduce the impact of a nuclear strike on the core of a city. In fact the roads around every major city aren't designed to avoid traffic jams but instead to ensure:

the importance of the Interstate System to evacuation of cities in time of national emergency.

-- the Clay Commission.

This was the time when everyone was told on their brand new TVs that success meant 'a steady job, a home out of town, a car, two kids and a husband and wife.' That is, when they weren't practicing duck and cover.

Where did these yahoos intend to put the people fleeing the burning inner cities during a war? On the imaginary copious amounts of farmland that planners thought should be able to support them. Yes, this was during a time when farming was already well on its way to consolidating into agribusiness.

No, people didn't suddenly decide that the suburbs were the peak of civilization (even if we parody that in the movies). The citizens of the United States bought a big pile of propaganda. The sad fact is that the people who wrote that propaganda actually believed it would help them.

The problem can only be solved by reducing the need for people to commute. There are a lot of ways to do this:

Tell that to three generations of management that believe in face-to-face time. Google and other stack-ranking 'Internet native' companies design their HR systems to terminate remote or flex workers as fast as they can hire them. After sixty years of white flight, black flight, Mexican-ization, gentrification, urban blight, drug wars, gang wars, the real estate collapse and protectionist NIMBY laws, the problems haven't been solved by staying at home. In places like Irvine, California, that are built on the Internet, things got much worse. The demographics keep changing but the work culture and the laws didn't.

And the roads? The roads pretty much stayed the same. Literally. As in until you couldn't really drive on them anymore.

Infrastructure's expensive. Someone's gotta pay to make it then someone's gotta pay to keep it up.

Comment Money problems; money solutions (Score 1) 86

Is there a charity that goes to at-risk places like these mining villages and towns and then pays families to put their children into school?

Something Like:

But where I can directly 'employ' a child to go to school and get a report on how well they are doing, plus a transparency report on what portion of my money is making it to the child versus overhead?

If there isn't, I think there should be. Can you offer a family more money, food and opportunity to put their child into a small village school than the local mine or child-labor market can?

If so, then you can effectively buy happiness for these kids, or at least a shot at a childhood, while raising the pay of the miners, whose "tiny slave labor" market now has to compete with the charity.

I think there's a missed marketing opportunity here for Apple. All they are doing is pulling their money out of a toxic situation like child labor, which hurts their reputation with people who buy luxury electronics in various shades of grey and white. They could be touting how some of the money from your iThing is being spent on teaching children who would otherwise have slaved away building your toy.

Comment Re:I don't like it (Score 1) 185

The free-to-play, web-browser-based game Kingdom of Loathing had a web-based IRC chat system long before Slack, Matrix, Gitter, Mattermost, et cetera.

Access to KoL chat requires passing a basic English exam. Several questions are aimed at common grammatical errors (to vs. too; their, there and they're).

The result is less low-quality trolling and a lot less bot spam.

But even with a basic language test you will still have worthless discourse. The spelling might be a bit better, though.

Comment Re:They forgot compilers (Score 1) 115

I think the reason they didn't mention compilers, and OSes for that matter, is that they limited themselves to things that are actually useful for the end user, not what lies behind it.

At one point compilers and OSes were the things used by the end-user. The classic definition of an operating system is a kernel, a standard library and a compiler. By that definition, for most of its history Microsoft did not actually sell a computer operating system. But for many users their computer is just their favorite application. To your accountant the computer is just a means to access email, QuickBooks and irs.gov. To your kids the computer is the thing that provides access to disney.com.

The biggest change has been in the users, not so much in what is provided. The typical target user has not been an academic or a geek in decades. Applications are targeted at children with no technical skills, busy parents with no technical skills and professionals with absolutely no technical skills. They interface with the computer in their pocket through rote, learned, application-centric tasks, like thumb-pressing a share button to tweet a picture of their cat.

Video games are a major component of the history of computing and it is important to include something to represent this industry.

The popular media may want to whitewash history, but major improvements in computing like operating systems, networking and personal computers followed two very end-user-focused applications of processing power. One is pornography. The other is video games. Ken Thompson developed a little project called UNIX based on a system for playing a game called Space Travel on a PDP-7. That design seems to have done pretty well. The success of AOL hinged upon its dominance of the online "dating" scene, not so much its free coasters. Modern machine learning algorithms are designed with kernels that run efficiently on PC video cards, the same cards whose expensive research and development was paid for by at-home video game enthusiasts craving a few more pixels or a few more FPS.

But to your system administrator you are all equally end-users. Compiler in hand or not.

"The pillars of your bright new world were built by people whose minds are so arcane and alien to you that you will never be able to comprehend exactly how much you rely on the hobbies of dead legends."
-- Lesrahpem "LINUX INSIDE!" (paraphrased) 2009 September 22 03:44 AM

Comment Re:you mean capitalism works? (Score 1) 372

Let's remember that the drug wasn't there before. That's the price the society pays for a dynamic drug market.

No, the EpiPen was cheaper before Mylan CEO Heather Bresch decided to jack the price to over $600, of which the company claims to make only $50, a really nice profit for something some people need. And that is ignoring the insanity of the $550 of overhead. An EpiPen is a single-use stick needle. It delivers a $5 dose of a drug needed to stop anaphylactic shock. Outside the United States these pens sell for under $10.

You invent something; it's prohibitively expensive for a bit, then the price drops.

Nice theory, but reality is different. The dark side of supply and demand is that if you need something, you don't have a choice about buying it. Whatever price I choose to sell it at is the price you have to pay. You want to stay healthy, so you need medicine. Since medicine is something you need, you'll pay whatever the price is, or suffer. If I can make enough profit I can even afford to make sure nobody else competes with me: either I create a premium brand like the iPhone, or I just break the kneecaps of anybody who competes with me, like solar roofs versus the local power monopoly.

The best business is to charge people for nothing, like sham medicine. The second best business is to take something that is cheap and already exists, then resell it at really high margins.

And because of the first problem, the FDA regulates markets like medical products very carefully. You may have to pay more, since providing something real is more expensive than just cheating you out of your money, but you shouldn't be getting sham products.

The FDA doesn't regulate the cost to consumers, though. That would require a different, non-existent government organization in the USA, something like a single-payer medicine program.

Comment Puppet versus Ansible? (Score 3, Interesting) 167

Where do you see the configuration management market going in the next year or two?

Orchestration is the hot topic for automation right now, versus last year's configuration management tools. Ansible is more orchestration than configuration management. Puppet and Chef require tools like MCollective to pick up the orchestration piece. Red Hat now runs Tower, and Tower now ships as part of the Red Hat Ceph Storage product. Red Hat's Satellite product is based on Foreman, which includes Salt, Puppet, Chef and Ansible support.

But where is this market heading? Are we likely to see consolidation? Integrations? Or even a flood of config-management-tied products from vendors?

Comment Re:Train them as poorly as possible (Score 1) 531

You sir, sound like an idiot. If you were 'so talented' you'd have had no problem finding a job. In fact your story smells like such bullshit I had to check my shoes to make sure I didn't walk in anything before I sat down.

Then you need to check your eyesight. You missed the cowardly brain matter leaking from your anonymous ears.

The story is so common and well-known in the United States that it even has a name: the hard-luck story.

The skills for doing a job and getting a job are different for everyone but a corporate recruiter.

Thus RubberDogBone was probably busy doing the job while working and not dedicating large amounts of time to finding the next one. Deep experts tend to be like this by definition: they gave up other time and tasks to dedicate to learning and performing one thing. It's also why going to conferences and user groups is an important part of professional work.

The skills for doing a job are tied to the application(s) and industry you work in. The skills for getting such a job are those for establishing and maintaining a large network of people. These people get you job referrals and job offers by getting you past the HR filter. Where you are well known, they can even create jobs to get your limited skills for themselves. At the least they connect available jobs with available potential employees.

This is exactly like dating. There is a hidden-information problem with lots of questions. Can you do the job? Can you fit in with the existing team, or deal with the family? Are you willing to work for the money available? The tools for resolving the problem are limited to writing about, talking to and meeting people. All of these fall into the trap of trust and reliability. Was this person just lucky at their last job or relationship? Are they bullshitting about their ability? Is this person just a presidential-class conman or con-woman?

In both cases lots of new tools have been developed to work around the problem. You have dating sites, prostitution and churches on one side. On the other you have LinkedIn, personal consulting and outsourcing firms like Capgemini.

However, large layoffs like this are different from simply losing a job the way RubberDogBone did. In large layoffs the employment vultures circle. The most desirable employees get picked off early. The rest are filtered through, so those with the most connections get hired out. Stereotypically in IT, a lot of employees have limited social networks outside of work, and now those networks are gone. With a sudden glut of potential employees, the local market saturates for a while. The suddenly unemployed and underemployed won't have the resources to go to conferences or spend time networking with peers. With that network gone, their spells of unemployment will be long as they compete on even ground with every conman and crook in the general labor market to get past HR.

Company unions aren't the solution to this. They start out fine, but because humans must run them they devolve into just another kind of business you have to get hired into. Unions "solve" the hiring problem with a worse old-boys network than the original company. Taken to an extreme, you cannot find work in some industries unless you are either already skilled or related to someone who does the work. Trade guilds are slightly better, being industry-wide, but again depend on corruptible, fallible and limited humans to do the work. Maybe in the future machine-run guilds could prevent this, but I don't trust the people programming the machines. They are still human.

Comment Re:Go measure (Score 5, Interesting) 147

With dslreports and other aggregation tests, the bloat for download and upload may not be symmetric, so the resulting score might not be as good as it looks.

Paying for a commercial connection? Test for this kind of performance daily and scream as soon as it drops. Otherwise, why bother paying so much?
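
A minimal sketch of such a daily check, suitable for cron; the reference host and threshold are placeholder assumptions, and the ping flags assume Linux or macOS:

    # Daily latency check: ping a reference host and fail loudly if
    # the median round-trip time exceeds a threshold.
    import re
    import statistics
    import subprocess
    import sys

    HOST = "8.8.8.8"      # assumed reference host
    THRESHOLD_MS = 100.0  # assumed acceptable median RTT

    out = subprocess.run(["ping", "-c", "10", HOST],
                         capture_output=True, text=True).stdout
    rtts = [float(t) for t in re.findall(r"time=([\d.]+)", out)]

    if not rtts or statistics.median(rtts) > THRESHOLD_MS:
        sys.exit("latency check failed: %s" % (rtts or "no replies"))
    print("ok, median RTT %.1f ms" % statistics.median(rtts))

Run it again while saturating your uplink and compare; the difference is your upload bufferbloat.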

In the United States and other jurisdictions, a home 'customer' is not expected to run a "server" on their paid-for Internet connection. Downloads may be finely tuned for low bloat, but uploads may have significant bufferbloat, caps and gradual dropout. For financial reasons, of course.

This upload problem may get much worse in the future. More and more services push data from "client" devices in the home or office. Camera-phone videos, Twitch streams, shared Google Docs and your home-automation spyware upend the upload/download assumptions of last-hop telcos. P2P is impacted now. The highly asymmetric buffering of uploads is detectable using protocols like BitTorrent that don't have a client-server separation.

Comment Fueling the fueler (Score 1) 38

Well, we do need a good OTV (Orbital Transfer Vehicle). You could use it to move stuff from orbit to orbit as needed.

So, how much fuel is this robot going to have on board? How or why would you refuel it?

The reason you put tiny fuel tanks on satellites is that it costs a lot to launch anything on a rocket. If it didn't, the engineers would put huge tanks on things sitting in orbit, tanks designed to last as long as the next part expected to fail.

As it is, there aren't that many kinds of propellant in use, but you'd still be out of luck if you had something using hydrazine while the only thing left on the repair 'bot is nitrogen.

Orbital transfers aren't free or cheap (ask any Kerbal Space Program fan). It will be interesting to see what propulsion system is proposed. There's interest in tethers for 'propellantless' station keeping or orbital transfers.
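
For a sense of the cost, a back-of-the-envelope Hohmann transfer budget from the vis-viva equation, assuming circular, coplanar orbits and using LEO-to-geostationary as the example:

    # Hohmann transfer delta-v between two circular, coplanar Earth
    # orbits, via the vis-viva equation v^2 = mu * (2/r - 1/a).
    import math

    MU = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
    R_EARTH = 6_371_000.0  # mean Earth radius, m

    def hohmann_dv(alt1_m, alt2_m):
        r1, r2 = R_EARTH + alt1_m, R_EARTH + alt2_m
        a = (r1 + r2) / 2  # semi-major axis of the transfer ellipse
        dv1 = math.sqrt(MU * (2/r1 - 1/a)) - math.sqrt(MU / r1)
        dv2 = math.sqrt(MU / r2) - math.sqrt(MU * (2/r2 - 1/a))
        return abs(dv1) + abs(dv2)

    # LEO (400 km) to geostationary (35,786 km)
    print("%.0f m/s" % hohmann_dv(400e3, 35_786e3))

Roughly 3.9 km/s, a large fraction of what it took to reach orbit in the first place, so a servicing robot can't just wander the sky for free.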

Would you send up refueling cans for the robot? Would you de-orbit the robot once it ran out of fuel? Or could you recover the robot to save costs?

Except for the Hubble Space Telescope, most satellites are not designed to be serviced. What can a hypothetical servicing robot do about dead batteries, shorted-out control systems or holed solar arrays on the existing fleet in orbit?

Finally, while space is pretty big, sending something on a 'soft' collision course with a dead satellite in the prime geosync orbit sounds like a great way to create more debris just where you don't want it. But it's Loral; they will have the best people Congressional pork spending can buy on staff to ask, and answer, these questions.

Comment Re:Amount of gravity needed? (Score 1) 77

It would also be nice to have a long-term study of humans in rotating space habitats to see if there are any issues not detectable by ground models. Theory says the vestibular system shouldn't be impacted by long durations in a fast "inverse" rotating frame; it evolved on a large rotating planet, after all. But Yogi Berra and any astronomer will tell you that in theory, theory and practice are the same, while in practice they are different.
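
For scale, the spin rate needed for 1 g falls straight out of a = omega^2 * r. A quick sketch with illustrative radii:

    # Spin rate for simulated gravity via centripetal acceleration:
    # a = omega^2 * r. The radii below are illustrative.
    import math

    G0 = 9.81  # target acceleration, m/s^2

    def rpm_for_gravity(radius_m, accel=G0):
        omega = math.sqrt(accel / radius_m)  # angular rate, rad/s
        return omega * 60 / (2 * math.pi)    # revolutions per minute

    for radius in (10, 100, 1000):  # meters
        print("r = %4d m -> %.2f rpm" % (radius, rpm_for_gravity(radius)))

Small habitats have to spin fast (about 9.5 rpm at a 10 m radius, about 3 rpm at 100 m), which is exactly where the vestibular question gets interesting.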

We have lots of experience with spacecraft that shuttle things up from and down to the ground. There needs to be operational experience with vehicles that are designed to remain in space permanently. If you build your space stations strong enough and big enough, you only need to attach a big engine to turn them into spaceships.

Comment Re:Color Me Skeptical (Score 1) 428

Also, how hard is it to cut through an existing solar roof to add things like plumbing vents, or to move a flue for a stove in a major kitchen remodel?

One advantage of tar shingles, a very popular option in America, is that adding a roof vent is an hour-long affair. Punch a nail up from underneath so you know you missed the rafters, then just pull back the shingles, cut a hole and apply the fascia kit for your vent. The tar shingles get layered right back on.

I presume these will be more like a terracotta roof, but much less friendly to modification, particularly when the shingle is generating power while exposed to light.

Still, if this is at least as durable as a class 4 "hurricane/tornado" shingle, it might qualify for the common homeowner insurance discounts on top of the price.

The homeowner game is a market of long-term thinking. If you are only interested in the next quarter, or uncomfortable with a 5-year break-even on your investments, just keep renting. From someone who owns a house.

Comment Re:All linked in /usr ? (Score 2) 58

I am pretty sure they also forgot that the 'S' in sbin stands for static und not superuser.

I beg to differ: http://www.linfo.org/sbin.html

The files in /sbin were system binaries. That is why /sbin directories are usually not on the default path for users.

Now, /usr/sbin, that one is confusing unless you know the sordid history of /usr as a shared NFS mount. Files in /bin and /sbin may be statically linked or not, even on real UNIX. For boot time on a Linux like Debian, static linking is for stuff in your initrd, rescue images or really, really badly written software (*cough* Zabbix *cough*).
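
If you're curious what your own system actually does, a quick sketch that asks file(1) about everything in /sbin; it assumes a typical Linux layout with file installed:

    # Report which binaries in /sbin are statically vs. dynamically
    # linked, according to file(1). Assumes a typical Linux layout.
    import pathlib
    import subprocess

    for binary in sorted(pathlib.Path("/sbin").iterdir()):
        info = subprocess.run(["file", "-b", str(binary)],
                              capture_output=True, text=True).stdout
        if "linked" in info:
            kind = "static" if "statically linked" in info else "dynamic"
            print("%-24s %s" % (binary.name, kind))

On most modern distributions you'll find almost everything is dynamic, static being the exception, not the rule the 'S' myth implies.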

The changes directly impact two groups. Power users are going to need to know about /bin, /sbin, /usr, etc., as they are going to mess with their systems directly. Package maintainers are going to have another thing to pull their hair out over when converting the raw sewage seeping out of poor developers into functional things they can ship to end-users.

Until this impacts regular users, or Joe X Windows who runs SteamOS, it's like the mechanic changing the brand of shocks in your car. Someone who knows better will be using the correct tools to do the correct thing. Or everyone will hang them out to dry when your transmission drops out of the car on the highway.

Comment Re:Energy budget (Score 1) 151

So what am I missing? What is the actual benefit to separating heavy industry and people?

That it is really really easy to get things down into a gravity well.

In orbit? Just toss the package out the back fast enough and it comes down all on its own. Take care not to hit anything on the way down.
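
A rough sketch of how small that toss is from low orbit, using the vis-viva equation; the altitudes are illustrative:

    # Deorbit budget from a 400 km circular orbit: one retrograde burn
    # lowers perigee into the atmosphere. Vis-viva: v^2 = mu*(2/r - 1/a).
    import math

    MU = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
    R = 6_371_000.0      # mean Earth radius, m

    r_orbit = R + 400e3            # starting circular orbit
    r_perigee = R + 60e3           # target perigee, deep in the atmosphere
    a = (r_orbit + r_perigee) / 2  # semi-major axis after the burn

    v_circular = math.sqrt(MU / r_orbit)
    v_after = math.sqrt(MU * (2 / r_orbit - 1 / a))
    print("deorbit burn: about %.0f m/s" % (v_circular - v_after))

On the order of 100 m/s, against nearly 8,000 m/s of orbital velocity: coming down really is the easy part.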

Also, real space colonization will be subject to huge limitations. Suppose you manufacture stuff in orbit and have the technology to ship it down to the ground. The landing process is the same technology as dropping a bomb anywhere, on minutes' notice, from an effectively unreachable location. You don't even need bombs. Rods from God are a thing.

Governments have a long-term interest in ensuring that colonization - not industrial development - is slow, limited and guaranteed to align with their purposes. Space is the ultimate anti-government, anti-anyone position: literally the high ground. Otherwise you'll get the plot of Heinlein's book The Moon Is a Harsh Mistress.

But if you put heavy industry in space and most people still live on the ground, it takes an incredible amount of energy to get the raw resources into orbit and bring the finish products back down

Lifting anything into space just to bring it back down is a fool's errand. Look at how much of the Saturn V that went to the Moon came back: almost none of it. Plenty of resources already exist in space to mine locally.

And without on-site captive customers... err, colonists, the economics dominate the situation. Industrialization is most likely to happen around the time that industrial jobs finish being taken over by robots. That way you don't even need to ship messy old people with their huge life-support systems. With enough resource scarcity to make market-wide recycling economic, this would only be done for selected items anyway. Anything that can't be automated would be done by telepresence, keeping your workers and citizens safely within reach of the police and military.

You would always build wooden stuff on Earth. But if I could drop 10,000 custom-yet-completely-prefab concrete and aluminum houses, white goods and all, anywhere in 10 minutes (with clearance from traffic control), that could be a game changer for disaster relief, or interesting for urban development.

But as with any factory in a new area, the problem is getting the first one up. Then you have the infrastructure to build many more, much cheaper and quicker. Just look at how industrialization happened everywhere on Earth.

Comment Re:"This is the one you want to protect" (Score 3, Interesting) 151

Biosphere II was a poorly planned theme-park garden, now owned by the University of Arizona.

Want to see what can be done if you really understand ecology and not just theme-park construction? Look at Ascension Island. Joseph Hooker, with the aid of Charles Darwin and Kew Gardens, built the ecosystem on the island out of completely foreign species. This cloud rainforest was built whole-cloth on a bare lump of clinker sticking out of the ocean, long before electrification.

The key difference is ocean.

Biosphere II was designed with almost no significant bodies of water containing phytoplankton, which produce up to 85% of all the oxygen. The facility has a glorified wave pool that would fit in a large city's water park. The planners put in 50% more grassland than synthetic ocean, and much of that 850 m "ocean" is dedicated to a coral reef. Unsurprisingly, the oxygen levels crashed soon after they closed the doors. Both times.

If one thing was unrealistic about O'Neill colonies, it was the sheer lack of mixing oceans in all the designs. Water is one of the most abundant substances outside the frost line in the Solar System. It's also a good radiation shield and has a high thermal mass. The giant magic space windows that somehow didn't let in vast amounts of cosmic radiation were more realistic.

O'Neill also wrote about Bernal spheres. These are slightly better, but have their own engineering challenges. Artists still show the interiors as if they were a cutout of a heavily populated Italian riverside. More realistic would be 70-80% ocean with islands or peninsulas. But in Bezos's case it's probably a matter of go big or go home, and the Island Three plans are certainly Big Homes.
