
Comment Re:Another attempt at manipulating the open market (Score 1) 229

Oh, and that is a somewhat difficult skill set to find.

I have it, and guess what? I contract out for pre-sales work all the time. Enterprise B2B people are just drooling over people who can do this. No wonder the talent isn't available to the big companies.

If they're smart, they'll do what I'm doing: leverage the ability to understand new and old tech, create visions, back them with strategic business alignment and value, and then help the salesperson pitch that to the execs for new sales.

(take the CIO out to lunch for bonus points)

Comment Something stinks here... (Score 1) 229

Isn't the CIO the generalist who is able to articulate how the business can succeed with technology?

Ok then, get after it. Your new people and old people all have perspective. Get off your arse, talk to them, make some choices and go and sell that to management or prioritize the budget.

Delegating the budget is just fine, but even that needs a basic review. I understand how it is in very large enterprises, but I also understand companies of that size can afford to hire several CIO types too. Not all techs can be business minded, young or old. That's a specific skill set, and as a tech generalist, they would and should be expected to get what they need from the hard core techs, who will gladly give it to them too.

For smaller companies, if they even have a CIO type position, the generalist there needs to do the work to understand what the strategy actually is and what it means.

Comment When there are fewer initial options, people... (Score 1) 270

...grok programming more quickly and easily.

This all comes down to what one has to know in order to attempt some programming. BASIC requires one to know very little to get something useful done. They try the PRINT statement, and that's cool. GOTO, INPUT, strings, numbers, and basic math follow.

From there, you can do pretty useful programs!
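The progression described above fits in a few lines. Here's a sketch of that kind of first program, with Python standing in for BASIC (a hypothetical example, not from the original post; rough classic-BASIC equivalents are in the comments):

```python
# A tiny "first program" of the kind described: print something,
# do a little math, loop. Python stands in for BASIC here; the
# comments give rough classic-BASIC equivalents.

print("HELLO WORLD")            # 10 PRINT "HELLO WORLD"

total = 0                       # 20 LET T = 0
for i in range(1, 11):          # 30 FOR I = 1 TO 10
    total = total + i           # 40 LET T = T + I
                                # 50 NEXT I
print("SUM 1 TO 10 IS", total)  # 60 PRINT "SUM 1 TO 10 IS"; T
```

Everything sits on the surface: no types, no compile step, immediate output. That's the low knowledge floor being described.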

Excel works a similar way. You see what cells do, then you find things like AutoSum, then you put a little bit of math in a cell, and suddenly you can make some really useful spreadsheets. I know people with about that level of knowledge modeling businesses to great success. It's not the most advanced use of Excel, but it works fine; they can change it, they get the benefit of some automation, and they can communicate advanced ideas to others with relative ease.

Way back in the day, before Excel, I used BASIC to compute a whole pile of useful manufacturing-related things. Saved me a ton of time, and I sold those and some CAD system programs to get a reasonable PC. All development was done on some 8088 clunker from a thrift store. (Yes, it ran the CAD system, having exactly the minimum requirements listed on the box.)

The CAD system had a BASIC-like language built in. It was cake to do this. I did know something about programming, but I was also able to teach others how to make useful programs on just little nubs of knowledge. Some of them advanced, getting into IT, systems, etc., while others just used the programs they made and were happy about it.


The best use case for new programmers is to maximize utility while minimizing knowledge dependencies. They don't need much to get the spark. Once they get it, as they progress, they will want out of whatever little environment they started in. The ones who really have aptitude will get out and do just fine. Many others will just use the thing and be happy, or move on and not care so much.

We really should give everybody a go. Find out who is who.

Think of this like public speaking. We make everybody do it, or most everybody. Most people experience an ordinary, "I can do this" outcome. Some of us find out it's not for us, and still others find out they are great somehow. We lose out if we don't run everyone through.

Comment TOLD YOU ALL SO (Score 1) 263

And of course, those of us who prefer that humans make basic marks on physical media were right about all of this; we talked about the expense and untrustworthy nature of voting machines.

Here in Oregon, we vote by mail, joined now by WA and CO, with some other pockets here and there in various states. It's awesome, it works, it can be trusted, it is difficult to defraud at a scale that would impact anything, and turnout is generally higher than with the poll methods in use most everywhere else today.

We can actually manually count and evaluate every last vote if needed.

Comment Depends (Score 2) 170

My early experiences were with the old Atari VCS (2600); VCS stood for Video Computer System. I was fascinated by the pixels and the idea of a TV being interactive.

I wanted control of the pixels.

Later, in school, I got to work on Apple ][ computers, and those just begged to be programmed. Gaming can initiate the desire, but so can a lot of other computer driven things these days.

It is not prep directly.

Indirectly, games can be prep. For a few friends and me, cracking copy protection got us into 6502 machine code and, later on, assembly language. We would use the monitor to see what was going on. Reading the ROM listing told us a lot more.

BASIC is slow, and that too drove learning more. To get the real magic out of the old machines, one has to know stuff. We made games, played them and learned. Utility type programming was good too. One such program generated book reports with just a few picks and keyboard input.

Just playing, unless the game incorporates programming concepts, is not meaningful. But games and other interactive things can spark the desire to build and control.

The latter leads to activities that do serve as prep.

Comment Re:Interlacing? WTF? (Score 1) 113

The C64 used a non-sequential scheme that mirrored its character display.

8 bytes sequential on most machines means a linear series of pixels on the same scan line.

On the C64, those bytes got stacked up to form a character, each byte on a sequential scan line, assuming one starts at a character boundary.
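That layout boils down to a small address calculation. Here's a sketch of the scheme (Python used for illustration): the 320x200 hires bitmap is 8000 bytes, grouped into 8x8 character cells, with eight sequential bytes stacking vertically within each cell.

```python
def c64_bitmap_offset(x, y, base=0):
    """Byte offset of the bitmap byte containing pixel (x, y) in the
    C64's 320x200 hires mode. Each row of 40 character cells takes
    40 * 8 = 320 bytes, each cell holds 8 sequential bytes, and the
    low three bits of y pick the scan line within the cell."""
    cell_row = y // 8        # which row of 8x8 cells
    cell_col = x // 8        # which cell within that row
    line_in_cell = y % 8     # scan line inside the cell
    return base + cell_row * 320 + cell_col * 8 + line_in_cell

# Contrast with a linear framebuffer, where the same pixel would
# simply live at y * 40 + x // 8.
```

So moving down one scan line within a cell advances the address by 1, while on a linear layout it would advance by the full 40-byte pitch, which is exactly the mirroring of the character display described above.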

Submission + - Parallax Completes Open Hardware Vision With Open Source CPU

PotatoHead writes: This is a big win for open hardware proponents! The Verilog code for the Parallax Propeller microcontroller was released today, and it's complete! Everything you need to run open code on an open CPU design.

Get the details here:

This matters because you can now build a device that is open hardware and open code all the way down to the CPU level! Either use a production CPU and have access to its source code to understand what it does and how, or load that CPU onto a suitable FPGA and modify it or combine it with your own design.

Comment Bullshit. The RIAA itself says this is OK. (Score 2) 317

I participated in a forum with Lessig, and my question was selected to be answered by somebody from RIAA legal:

What are we buying when we buy entertainment media? Is it a license to view/listen to the product, or is it just a copy of the title that we have limited rights to? That is, do we own the license to view/listen to the content in any format -- or when we buy a CD, are we just purchasing the format of the content?

Matt Oppenheim responds.

(C) [What are we buying when we buy entertainment media?]

When you buy a CD, you should feel free to consume the music. That means you should listen to that disc, and feel free to make a copy of that disc for your own use so that you can have a copy in your home and your office. You should feel free to copy it onto other formats, such as .mp3, so that you can listen to it on your computer. You should feel free to copy it to cassette.

The only time you run into problems is if you begin to distribute your copies to others.

The original event is no longer online. However, it appears to be archived at the forum I just linked. We get to transcode and back up our media, and we've always been able to do that. Of course, the DMCA makes circumvention an issue, but CDs really don't have that problem, as they are essentially an open, raw audio format to begin with. In practical terms, they are not much different from tapes.

So we make mix CDs, we back up our masters so the heat from the car doesn't ruin albums we might not be able to buy again, we transcode for our portable media player, or frankly for the media player we made ourselves! Mix CDs to express our love for somebody else? Yeah, doing that is OK too.

What this means is we've always been able to make copies for friends. The answer above from the RIAA actually doesn't state this, and for obvious reasons, but the reality of things is clear. What that answer does state clearly is that we are just fine making a copy for the car. In fact, this is nicer than the backup CD, in that it's not really portable like a backup CD is.

Here's a notable question for you:

Say you archive your CD collection. Then you give the originals away. Ethics would have you get rid of the backups. But the law? No requirement at all. Doing this is shitty, but not something one is going to jail for.

Hope these clowns choke on a dick.

Comment Reform to how we fund elections is primary (Score 1) 117

Your term limit issue is secondary, as are many other issues.

Whether or not we have term limits is a matter of reasoned public debate. Right now, we can't have that due to the money in politics problem.

It is unreasonable for you to connect your issue to the core, systemic problem of how elections are funded.

That is perhaps the biggest misconception and hangup people have. This isn't transactional politics. It's not like you get something in return, or trade-offs get made. We do that now, and the money biases it away from the overall best interests of the people.

Really, if we reform money in politics, a fair, reasoned discussion will happen. Or, at least a much better one will happen.

Term limits, and other things get decided then, not now.

This is a single issue effort. It is systemic, not partisan, and not intended to remedy anything other than the basic issue of money in politics.

Comment Re:Network transparency of X has always impressed (Score 1) 204

Yes. We really need to take a hard look at network transparent displays in the context of what we can really do today as well as the future.

When I did this, 10BASE-T networks were common, and just a little slow for something like CAD. 100BASE-T networks were growing in popularity, and then we sort of jumped to gigabit.

Also during that time, I started on dialup, moved to DSL, and then more came.

Know what? The fiber connection I have in my home is fast enough to run X with few worries today.

And it's going to improve more. My 4G cell phone can run X too. Amazing!

Honestly, I miss the vision our early innovators had. In a way, the field was more open and people could build without so many legacy ties. The need to incorporate those into the next step is holding people back. Legacy "screen scrapers" should get attention. They are useful, and they do have advantages for application developers.

Network-transparent, multi-user, concurrent, multi-processor networked computing is the bar to cross, and if we don't maximize it, we risk losing out on a lot of the potential.

Sad really.

All I know, is I won arguments back then, and I did it on UNIX when the dominant move was to Windows and the PC, and all that distributed software bullshit we face today. Won solid. No fucking contest.

The difference was really understanding how things worked and applying that instead of following the cookie cutter stuff we see being done so often today.

With X, one can distribute or centralize as needed!

Fonts on one machine, window manager on another, application on another, storage on yet another, graphics server on yet another, or even better, how about a few displays, each capable of serving a user?

Or, pile it all on one box somebody can carry with them!

Doesn't matter with X. It's all trade-offs, and this leaves people to structure things how it makes best sense to them. For some, having very strong local compute / storage / graphics / I/O is best. For others, centralizing that pays off the best.
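As a rough illustration of that distribution model: in X, the application doesn't care where its windows appear; pointing DISPLAY at another machine's X server is all it takes. This is a minimal sketch (host name `thinclient:0` and the launched app are hypothetical stand-ins):

```python
import os

# Sketch of the classic X remote-display pattern: the application
# runs on one host, its windows appear on whatever machine the
# DISPLAY variable names. Nothing here is specific to the app.
def x_remote_command(app, x_server="thinclient:0"):
    env = dict(os.environ, DISPLAY=x_server)   # redirect drawing to x_server
    cmd = ["env", f"DISPLAY={x_server}", app]  # equivalent shell invocation
    return cmd, env

cmd, env = x_remote_command("xterm")
print(" ".join(cmd))  # env DISPLAY=thinclient:0 xterm
```

In practice that command would run on the application host (via ssh, rsh, or a login script), and the "graphics server on yet another machine" arrangement above falls out of exactly this one variable.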

Only X does this. Nothing else does, or has.

The screen scrapers are impressive, but they really aren't multi-user in the sense that X is, and managing things with them requires a lot of kludges, system resources, etc.

I remember the day I read about X in BYTE. It changed how I viewed computing, and when I got my chance, I went for it whole hog and it paid off very well.

Also, IMHO, part of this vision really should be to provide developers with dead simple tools to get things done. It is true that building an efficient network-aware application takes some work. SGI, BTW, did educate people. If you developed on IRIX, you got the tools to make it all happen, and you got the education and consulting of a vendor who knew their shit cold.

Today, we don't have that surrounding X, and it's hurting development pretty badly.

Back in the 90's, I was doing video conferencing, running things all over the place on lots of machines, melding Linux, IRIX, Windows, etc... together in powerful ways, often using machines secured from a dumpster. No joke.

We've managed to cobble that together again, but it's a far cry from what could have been, and could still be with people thinking this stuff through like it was the first time.

IMHO, the other real problem is as I've stated. We have a whole generation of people doing this stuff now who basically have no clue! They were never introduced to multi-user computing properly, never got to experience X as intended, etc...

When I explain some of this to people, they make comments like, "sounds like Star Trek" and "amazing", "wish I were there..."

Yeah. I was. Many of us here were.

Comment Re:Network transparency of X has always impressed (Score 1) 204


Did this in the late 90's through early 00's.

That exact scenario. Know what? It kicked some very serious ass. Still to this day we don't really have a software combination quite as potent. Here's the setup:

An SGI Origin with multiple CPUs, lots of RAM, and one or more gigabit interfaces. I started the thing on 100BASE-T, which was more than acceptable for most users, but I ended up with a lot of users.

ONE COPY of the software, ONE shared data repository, and the software contained data management, revision control, etc...

That machine hosted 30+ users via the X window system. Users could run another SGI, a PC, Linux, whatever they felt like running.

A simple script logged them onto the CAD system, where they could build solid models, make drawings, perform analysis, and many other things.

No user ever touched the data store. It was owned by the account that ran the application (SUID), and no user ever touched the application data either. All remote display.

Admin on this thing was fucking cake. Never had it so good. Still don't. And systems today that either run "cloud" or copy data all over the place are a mess by comparison.

The network model of the X Window System had some very serious advantages. Today, we are missing out on a few options in most cases due to the lack of network-transparent display capability. That lack is costing us a lot of time and money too. Thing is, nobody actually knows, so it's all A-OK.

At the time this was done, I competed with traditional setups and kicked their ass solid on every single metric. Cost, administration, performance, etc... It wasn't even a contest.

Today, with the networking we have and overall compute power available, it's hard to imagine how freaking good a similar setup would run.

Shop floor, various departments... no worries.

For the odd user at home, X required too much, and we didn't have things like NX yet. That was a case for "screen scraping" type tech, so I just set up a VNC-like thing, let them access it over the home network, and life was good.

I could, and did, administer that thing from all over the place, often using one of those "Free JUNO" accounts, just to get a dialup and a few K/sec needed to run a command line or two.


Truth is, the direction we took from those times, the decline of UNIX for this kind of thing, etc., was so much more labor-intensive and expensive that I moved on to other things, occasionally consulting and mostly laughing when nobody sees the clusterfuck for what it is.

I agree with you about 4K and some other cases being more optimal without network transparency, but that's not the point. We also have other resource issues associated with those.

The protocol needs to have it all baked in, so that as we gain capability, smart people can apply it and actually get the benefit of it, not some diluted down thing we wish were as good as planned.

X did that. The protocol was there for when things grew, and some of us applied it all, and it rocked hard. A whole lot of us don't get it, and are still slogging around doing so many extra things we don't need to; it's a wonder there are any gains at all.

UNIX + X is multi-user computing. It's the bar, and most of the industry has forgotten what multi-user really means and how it can be used. Their loss.

Money is the root of all evil, and man needs roots.