Comment Re:Hell no (Score 1) 232

I have to disagree; programming is easy. The only time programming is difficult is when someone makes it more difficult than it needs to be. I will agree that nearly all of the yak-shaving has been removed, making "programming" easier, but that goes back to my first assertion.

Real programming happens in your head or on a whiteboard with pseudo-code. Pseudo-code is written exactly the same way today as it was 100 years ago: one logical step at a time. If Knuth has made pseudo-code easier to write, then I will agree with you that he has made programming easier.

Step deeper into the world and you'll be truly amazed at how deep it is ... and likely staggered that it works as well as it does.

You make it sound like some mystical, complex thing. I do think it is amazing how far computers have come, but I refuse to believe that logic, the cornerstone of programming, is some "deep and mystical" thing. The only things that surprise me are how difficult logic is for most programmers and that the state of programming isn't worse than it is.

Comment Re:Too much to express here, but (Score 1) 448

Don't worry, there is a good chance that we will have to become a Socialist civilization in order to reach a Type 1 or 2 civilization. Once nearly everything is in infinite supply, everything will be free. Let me know how your capitalist society will function in a free economy. But you make a good point: we should not force Socialism, only be aware of it and not fear it. We'll probably need some hybrid to transition.

Comment Re:Too much to express here, but (Score 1) 448

Humans want to succeed and have a successful lineage, they want to build things, they want to tinker with things, they want to learn things

Those are some rose-tinted glasses you have on. History shows that all of humanity has that potential, and the best of us actually aspire to make the world a better place, but the majority of people are fk'n lazy, and the only reason they act like they want to work is that they want other people to think they are important.

Don't confuse people wanting to be important with people feeling important.

Comment Re:Look in the mirror, dehumanizer. (Score 1) 322

"No better than" assumes there is some universal standard to measure the worth of a person. There is no such thing as good or bad, just what makes the majority of people happier or sadder. Of course I a technologist, so I don't focus only on averages, but also percentiles. I don't just want average happiness to go up, but the 99th percentile as well.

Comment Re: He sounds like an idiot (Score 1) 332

Experience counts for naught. Most developers with 10 years of experience have just experienced the first year ten times over. Many big tech companies confirm this with their talent-search issues. Statistically, there is virtually zero difference in skill between someone with 6 months of experience and someone with 20 years. Most people quickly reach their limits.

In my many dealings with world-class technology consulting firms, they are horrible at consulting, but they do make for great human-interfaceable reference books. Most of the time I spend about 5 minutes reading a Wiki article about a given technology before jumping into a meeting with a specialist. Then I poke holes in their logic until their ego is bruised enough to get them to be quiet, and then I start asking my questions and finally get somewhere. Their logic and understanding are almost always horribly flawed, but they do know a lot. Their opinions are pretty much useless.

They may know more facts than I do and have dealt with more issues than I have, but I will have vastly more understanding of the domain than they will ever have. Cargo cult, that is all.

Comment Re:order (Score 1) 332

One of my co-workers prototypes all the time. He likes empirical evidence. I am the polar opposite. Not that I hate empirical evidence, but empirical evidence lies too much for me. I mostly just prototype in my head, but I seem to have very accurate mental models.

Empirical evidence will find local minima, but not global ones.

Comment Re:Has the lord and savior told you (Score 1) 332

TDD stands for Test-Driven Development, not Test-Driven Design. Architecture and design happen before development. Don't start writing code until you know WTF you're doing. Build some prototypes, but throw them away once you understand the problem. Same thing with agile: it is not an architecture or design methodology, it is a development methodology.

Comment Re: I'm always proud of my code (Score 1) 280

I've had the opposite experience. Projects that had been going on for years, where the code was such a mess that bugs took months to fix, I would re-write from scratch in a month or two and never have another reported bug. Messy code is unmanageable and does not scale. It works for only the simplest of projects. And many times these projects turn into full-time jobs, because they are important enough to keep working on, but messy enough that nothing short of a re-write will stop people from complaining.

Comment Re:Multithreading is a solved problem (Score 1) 497

I love using atomics, but the biggest issue I have with them is that the assumptions I make are based on x86 or x64 memory-ordering guarantees. Please don't use my code on ARM.

The reason I like using my own atomic thread-sync code is that I like to write my threaded code so that it does not require exact ordering where possible, as long as the result is the same. Sometimes this results in duplicate work being performed and one of the results effectively discarded, but the reduction in locking overhead is a huge win for scalability.
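Something like this minimal Go sketch is the shape of what I mean (the names and the cached-string example are purely illustrative, and Go's sync/atomic operations are sequentially consistent, so the ARM caveat above doesn't bite here): several goroutines may race to compute the value, the first compare-and-swap wins, and the losers just throw their result away instead of waiting on a lock.

```go
package main

import (
	"fmt"
	"sync"
	"sync/atomic"
)

// cachedResult holds a value that several goroutines may race to compute.
// Only the first CompareAndSwap wins; later results are simply discarded.
// No locks are taken on this path.
type cachedResult struct {
	val atomic.Pointer[string]
}

func (c *cachedResult) get(compute func() string) string {
	if v := c.val.Load(); v != nil {
		return *v // already computed by someone else
	}
	r := compute()                // possibly duplicated work
	c.val.CompareAndSwap(nil, &r) // first writer wins; losers discard r
	return *c.val.Load()
}

func main() {
	var c cachedResult
	var wg sync.WaitGroup
	for i := 0; i < 4; i++ {
		wg.Add(1)
		go func(i int) {
			defer wg.Done()
			v := c.get(func() string {
				return fmt.Sprintf("result computed by goroutine %d", i)
			})
			fmt.Println(v)
		}(i)
	}
	wg.Wait()
}
```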

Comment Re:Four hard problems in programming: (Score 1) 497

I've only had an issue with a race condition once, and that was when I had only a few weeks of programming experience. Write your code in a way that guarantees race conditions can only occur in certain locations, and the problem is easy. I have not had to use a debugger to fix a race condition in years now.
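A minimal sketch of what I mean by confining where races can happen, assuming Go and illustrative names: all mutation of the shared state is funneled through one owner goroutine, so the channel is the only place where ordering matters at all.

```go
package main

import "fmt"

// counterOp is a request sent to the single goroutine that owns the counter.
type counterOp struct {
	delta int
	reply chan int // receives the counter value after the update
}

// startCounter confines all mutation of the counter to one goroutine,
// so a data race on the counter itself cannot occur anywhere else.
func startCounter() chan<- counterOp {
	ops := make(chan counterOp)
	go func() {
		total := 0
		for op := range ops {
			total += op.delta
			op.reply <- total
		}
	}()
	return ops
}

func main() {
	ops := startCounter()
	done := make(chan struct{})
	for i := 0; i < 4; i++ {
		go func() {
			reply := make(chan int)
			ops <- counterOp{delta: 1, reply: reply}
			fmt.Println("counter is now", <-reply)
			done <- struct{}{}
		}()
	}
	for i := 0; i < 4; i++ {
		<-done
	}
}
```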

My most recent project was extremely async and parallel for highly scalable IO. I told my manager it would take me at least a month to write; I was given a week. I slapped that thing together, stuck it in prod, and hoped for the best. Now, six months later, someone had an odd, difficult-to-reproduce issue. I looked at the stack trace, got a bit perplexed for a few minutes, then realized the problem. Five minutes later I had it fixed. This pretty much describes every race condition I have ever had. Only a few times have I had to use a debugger, and that's because the issue actually existed in someone else's code, to which I did not have access.

My co-workers describe me as having a super-human intuition for debugging code. I seem to have a knack for being able to debug non-reproducible errors in systems where I have never seen the code, nor know the architecture. Based solely on the characteristics of the issue, I can infer the architecture and the nature of the problem. I've never understood other people's inability to debug these issues. I just think of many possible mental models of the system, then pick the mental model that would produce the same symptoms as those being described. Nearly every time, the mental model I choose very closely matches the design of the system. People just need to get better at creating viable mental models.

Comment Re:Buffers (Score 1) 497

Because there is no such thing as a buffer really, it's an abstraction on memory

And there's no such thing as color, it's just an abstraction of the relation among different optical inputs
And there's no such thing as thought, it's just a complex interaction of chemical reactions
And there's no such thing as random, the Universe is deterministic
And there's no such thing as life, just atoms moving around

Everything we know of in this world is just a collection of characteristics that describe an abstract idea.

Comment Re:Buffers (Score 1) 497

Async isn't meant to help CPU-intensive workloads. Generally, most computers have too much CPU power and not enough IO or scaling. If you're getting "synchronous freezing" from "CPU-intensive" work that needs to be done prior to your IO, you have an easy problem on your hands: get more cores. If modern CPUs are not enough for your workload, it's probably because you're horrible at coding.
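If the CPU-bound part really is the bottleneck, spreading it across the cores you already have is usually enough. A minimal sketch, assuming Go, where crunch() is just a stand-in for whatever heavy work precedes the IO:

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
)

// crunch stands in for whatever CPU-intensive work precedes the IO.
func crunch(n int) int {
	sum := 0
	for i := 0; i < 1_000_000; i++ {
		sum += (n * i) % 7
	}
	return sum
}

func main() {
	jobs := make(chan int)
	results := make(chan int)
	var wg sync.WaitGroup

	// One worker per core: the CPU side scales with hardware instead of
	// freezing whatever loop is also trying to service IO.
	for w := 0; w < runtime.NumCPU(); w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for n := range jobs {
				results <- crunch(n)
			}
		}()
	}

	go func() {
		for n := 0; n < 100; n++ {
			jobs <- n
		}
		close(jobs)
		wg.Wait()
		close(results)
	}()

	total := 0
	for r := range results {
		total += r
	}
	fmt.Println("total:", total)
}
```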

Comment Re:Closures? (Score 1) 497

I'm not sure about Go, but .Net has some interesting deadlock situations with async when not all of your code is 100% async, which is annoying because most open-source .Net libraries are not async. I had to help a co-worker with some GCC Go pseudo-deadlock issues many years back. I found it rewarding to have solved a deadlock issue in a language with which I had zero experience. It turned out to be an implementation detail of how GCC handled "async" at the time, via threads: when the thread pool ran out of threads to handle goroutines, the producer and consumer could end up on the same thread and block each other. Took me about 15 minutes. I say "pseudo" because if a routine blocked too long, the scheduler would change which routine was running, which "fixed" the issue after some hesitation. You'd get this strange jittering that got worse as more routines were running, quickly getting to tens of seconds in our test.
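The shape of the code was roughly the following (a hypothetical reconstruction, not the original): a plain producer/consumer over an unbuffered channel. On a current Go runtime this completes instantly, because the scheduler parks the blocked sender and runs the receiver; under the thread-pool scheme described above, the two sides could land on the same OS thread, and that is where the jitter came from.

```go
package main

import "fmt"

func main() {
	items := make(chan int) // unbuffered: every send must meet a receive

	// Producer: blocks on each send until the consumer is ready.
	go func() {
		for i := 0; i < 5; i++ {
			items <- i
		}
		close(items)
	}()

	// Consumer: if producer and consumer are pinned to the same OS thread
	// and the blocked sender is never rescheduled, nothing makes progress;
	// today's runtime parks the sender and switches cleanly.
	for v := range items {
		fmt.Println("got", v)
	}
}
```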

Once you have a mental model of how concurrency works, it doesn't matter which platform you're on. There are only a few good ways to implement it.

Async allows for high scaling when dealing with lots of "messages" moving through the system. Context switching is crazy expensive, about 10,000 cycles on a modern CPU, and that's not including a lot of other contention that's created in the kernel. To put it simply, if you want a single server handling 100Gb of traffic with millions of network states, you HAVE to use async. And yes, a single server can handle 100Gb of traffic over 100,000 short-lived connections per second.
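What that looks like in practice depends on the platform. In Go, for instance, you write goroutine-per-connection and the runtime multiplexes it over the OS's readiness notifications, so a blocked Read parks a cheap goroutine instead of burning an OS thread and a context switch. A minimal sketch (the port and the echo behavior are just illustrative):

```go
package main

import (
	"io"
	"log"
	"net"
)

func main() {
	ln, err := net.Listen("tcp", ":8080")
	if err != nil {
		log.Fatal(err)
	}
	for {
		conn, err := ln.Accept()
		if err != nil {
			log.Print(err)
			continue
		}
		// One cheap goroutine per connection; the runtime's netpoller
		// (epoll/kqueue) does the actual async work underneath.
		go func(c net.Conn) {
			defer c.Close()
			io.Copy(c, c) // echo until the client hangs up
		}(conn)
	}
}
```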

Comment Re:Closures? (Score 1) 497

Right there with you. I've been writing multi-threaded code for nearly a decade now. It's not that difficult. My first real-world application was threaded. Taught myself threading in 3 days, wrote my own synchronization code. Looking back, I cringe at what I did, but it worked perfectly and I have not had to touch the code in 8 years. Not bad for re-writing an existing program that had been under constant development for almost a year, in only three weeks, fresh out of college, with about two weeks of programming experience, and having never written multi-threaded code before.

I recently had to look over my code again to turn it into a library, which took me less than a day. Well-written threaded code is well factored, with single-responsibility taken to the extreme. I don't like more than one piece of code modifying shared state. This lends itself well to being converted into a library, so most of my work was already done for me, though some of my old coding habits were annoying.
