
Comment Re:Explanation (Score 4, Interesting) 237

TL;DR: Not really.

I'm guessing that's more of an "asset management system". Ours was orientated around the video. As the cameras rolled, we digitised the footage by tapping into the tape deck's monitor output; we had RFID tags on each tape, and LTC/VITC timecode from the deck. We therefore had a unique reference for every frame, as it was laid down (i.e. there was zero ingest time, which was - and still is to a large extent - an issue with asset management systems).
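
To make that per-frame reference concrete, here's a minimal sketch in PHP (most of the system was PHP) of turning a timecode into an absolute frame number. The 25 fps rate and the key format are illustrative assumptions, not the actual scheme we used:

    <?php
    // Convert an HH:MM:SS:FF timecode string to an absolute frame number.
    // Assumes non-drop-frame timecode at a fixed rate (25 fps PAL here).
    function timecode_to_frame(string $tc, int $fps = 25): int {
        [$h, $m, $s, $f] = array_map('intval', explode(':', $tc));
        return (($h * 60 + $m) * 60 + $s) * $fps + $f;
    }

    // Paired with the tape's RFID tag this gives a unique per-frame key, e.g.:
    // $key = $tapeRfid . '/' . timecode_to_frame('01:02:03:04');  // ".../93079"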

The system then sent each frame to a centralised database server with a webserver on it, and I wrote a streaming server (OK, this part was in C :) and a streaming player for Linux, Mac, and Windows that understood our custom streaming format. There wasn't anything complicated about the format: it was basically motion-JPEG data served from an HTTP interface, so the player would request the URL "http://asset-server/tape-rfid/timecode-from/timecode-to" and get back an application/octet-stream containing each file (common headers stripped), where a file was an individual frame in JPEG form.
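
For flavour, here's roughly what a client for that interface could look like. The exact wire format isn't spelled out above, so this sketch assumes a hypothetical 4-byte big-endian length prefix per frame rather than the real stripped-header framing:

    <?php
    // Hypothetical client: fetch a timecode range from the asset server and
    // split the returned octet-stream into individual JPEG frames. Assumes
    // each frame arrives as a 4-byte big-endian length followed by its data.
    function fetch_frames(string $server, string $tapeRfid,
                          string $tcFrom, string $tcTo): array {
        $url = "http://$server/$tapeRfid/$tcFrom/$tcTo";
        $stream = fopen($url, 'rb');
        if ($stream === false) {
            throw new RuntimeException("cannot open $url");
        }
        $frames = [];
        // A robust version would also handle short reads on the header.
        while (strlen($header = (string)fread($stream, 4)) === 4) {
            ['len' => $len] = unpack('Nlen', $header);       // frame length
            $frames[] = stream_get_contents($stream, $len);  // one JPEG frame
        }
        fclose($stream);
        return $frames;
    }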

What this let people do was record out in the desert and have their digital dailies sent back to home base over a satellite uplink via rsync, and the team at home base could "see" the footage (we only supported quarter-res images at the time; the internet wasn't as fast as it is now), reliably locate frames on tapes, and discuss/annotate/create EDL (edit decision list, basically a set of timecode-to-timecode ranges) sequences and play around with it as if they had the tapes right there, even if it was at a low resolution.
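
An EDL in this sense is a trivially simple data structure - a hypothetical sketch (the field names are mine, not the system's):

    <?php
    // One EDL event: a tape plus an in/out timecode range (PHP 8.1+).
    final class EdlEvent {
        public function __construct(
            public readonly string $tapeRfid,
            public readonly string $tcIn,    // e.g. "01:00:10:00"
            public readonly string $tcOut,
        ) {}
    }

    // A cut sequence is just the events in playback order:
    $edl = [
        new EdlEvent('tape-0042', '01:00:10:00', '01:00:14:12'),
        new EdlEvent('tape-0071', '02:13:00:00', '02:13:05:24'),
    ];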

On a more prosaic all-in-house front, the act of using a Discreet Inferno or Flame system (which controlled the tape decks in a post-production suite) would automatically log footage into our system, so the non-artist types could use our "virtual VTR" system to review and create play-lists, which could then be sent to the machine room with the certainty that what they'd composed in their web browser would be what ended up on the tape later delivered to clients. This freed up a lot of tape-deck time, which the post-house could then put to more profitable use.

There was at least one time when I got an angry phone call from a client who claimed our system was screwing things up. They'd created their EDL for their client using our system and then sent the job to the tape room to be generated, and of course creating that new tape would automatically log the new footage into the system (because it was being written to a tape in a monitored tape deck). They looked at the output footage of the generated tape in their browser, and it wasn't right. After a bit of tracking things down, it turned out the tape room had inserted the wrong master tape, so we saved them the indignity/embarrassment of sending footage from a *competing* client out the door. That alone, in the eyes of the director, was worth the cost of the system.

We had similar procedures for rendered footage from 3D systems (Shake etc. at the time). Again, everything was collated into shots/scenes etc. on the database server. We had rules, applied to directories full of frames, that would parse sequences out of arbitrary filenames differentiated only by a frame number somewhere in the name. That's actually harder than it looks - there is *no* standard naming convention across post-houses :) I separated the code out into a library and wrote a small command-line utility called 'seqls', which was *very* popular for collapsing a directory of 10,000 files into a string like 'shot-id.capture.1-10000.tiff' ...
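
The core trick is easy enough to sketch, even if the production version needed a pile of per-house rules. Something like this (illustrative only, not the actual seqls code):

    <?php
    // Collapse a list of frame filenames into compact sequence strings,
    // e.g. 10,000 files -> "shot-id.capture.1-10000.tiff". Treats the
    // last run of digits in each name as the frame number.
    function seqls(array $filenames): array {
        $seqs = [];
        foreach ($filenames as $name) {
            if (preg_match('/^(.*?)(\d+)(\D*)$/', $name, $m)) {
                $key = $m[1] . "\0" . $m[3];   // the name minus its number
                $seqs[$key][] = (int)$m[2];
            } else {
                $seqs[$name] = null;           // no frame number at all
            }
        }
        $out = [];
        foreach ($seqs as $key => $nums) {
            if ($nums === null) {
                $out[] = $key;
                continue;
            }
            sort($nums);
            [$prefix, $suffix] = explode("\0", $key, 2);
            $out[] = $prefix . min($nums) . '-' . max($nums) . $suffix;
        }
        return $out;
    }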

All of this is (I'm sure, though I haven't kept up to date) commonplace today, but it was pretty revolutionary at the time. I'd say about 90% of the code was PHP; there were various system daemons in C, video players for the major platforms in C/C++, and a kernel driver for the Linux box in C that handled the incoming video, digitised the audio, and decoded the LTC timecode (the audio-borne timecode; the VITC timecode was on line 27 (?) of the video signal on every frame, and we decoded that in the kernel driver as well).
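
For the curious: once the driver has recovered the 80 bits of an LTC frame from the biphase-mark audio, extracting the timecode is just BCD unpacking. A sketch, using the standard SMPTE 12M bit positions:

    <?php
    // Pull HH:MM:SS:FF out of an 80-bit SMPTE LTC frame. $bits is an
    // array of 80 ints (0/1), LSB-first within each BCD field; recovering
    // those bits from the audio is the part the kernel driver did.
    function ltc_to_timecode(array $bits): string {
        $bcd = function (int $start, int $len) use ($bits): int {
            $v = 0;
            for ($i = 0; $i < $len; $i++) {
                $v |= $bits[$start + $i] << $i;
            }
            return $v;
        };
        $frames  = $bcd(0, 4)  + 10 * $bcd(8, 2);
        $seconds = $bcd(16, 4) + 10 * $bcd(24, 3);
        $minutes = $bcd(32, 4) + 10 * $bcd(40, 3);
        $hours   = $bcd(48, 4) + 10 * $bcd(56, 2);
        return sprintf('%02d:%02d:%02d:%02d', $hours, $minutes, $seconds, $frames);
    }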

On top of that, I designed an audio circuit to take in the stereo balanced audio and convert it to an unbalanced signal we could encode along with the video frames. More recently, and purely for fun, I re-designed the entire thing onto a single board with an FPGA doing most of the work. One big-ass server and lots of cheap digitisers - it's amazing what you can do these days :)

That's a whistle-stop tour - I could go on and on, but the wife is calling me to do the dishes while she bathes the kid :)

Comment Re:Explanation (Score 4, Interesting) 237

15 years ago, Apple hired me on an H-1B, and my starting salary was $140k; then they paid everything to convert my H-1B to a green card. None of this includes the joining and yearly stock bonuses (options at the time, RSUs these days) or yearly cash bonuses. They also paid relocation and the first few months of rent in a pre-arranged location.

I'm not special. There were several dozen of us in the (weekly) new-employee orientation meeting, most of whom were s/w engineers.

Oh, and I (or rather, my small company, which Apple bought) wrote ILM's digital asset management system for films like Star Wars (Ep. 1), James Bond films, digital commercials etc., mostly in PHP. That sold for $40k/pop... Indeed, just like any language, it's possible to write crap code in PHP, but used properly it's a powerful tool.

Comment Re:Refute vs. Rebut (Score 1) 116

and refute can also mean simply to deny

Not really. It's just another instance of the media and politicians picking up a nice new technical-sounding word and not bothering to check what it means before using it. Of course the sheer volume of misuse does in time change the actual meaning, but I don't think "refute" is quite that broken yet.

See also epicentre, code, chad, hacker, etc...

Comment Image processing (Score 1) 111

When I started my PhD in image processing, I was given an 80-column, 24-line text terminal to the department MicroVAX (approximately 1 MIPS, shared between about 40 people). I was lucky and got one of the good ones: it had an amber phosphor :)

Seriously, the only place to see the results of an algorithm was on a shared display downstairs in the lab - which was in high demand. I ended up doing a lot of terminal-style graphs (mine wasn't a Tektronix terminal, so I only had text-like characters) to prove an algorithm worked before actually seeing its output.

And now I look at the technological ability of my freaking phone, and I wonder at just how far things have come in 30 years or so...

Comment Re: Well that makes sense (Score 1) 185

I think the issue is that "clean" is in the eye of the beholder.

Sounds good, but utterly untrue. There are plenty of objective criteria by which the cleanness of code can be assessed. Yes, you get beginner programmers who take the attitude of "That's your way; this is my way - we're both entitled to our opinion", but unless they shed that viewpoint they'll never become competent.

The fact is that in JavaScript, achieving even something as simple as an encapsulated module of code that won't clash with anyone else's means jumping backwards through hoops.

The only reason to use JavaScript is because it's the only language available. It makes producing clean code much harder than it should be - it can be done, but it takes a lot of work.

Comment Re:Well that makes sense (Score 1) 185

We use it because it's pragmatic to use the lingua franca of programming.

Hardly a lingua franca - JavaScript is used because it's the only language web browsers understand.

I've used a lot of programming languages, and I've spent quite a bit of time trying to learn how to write clean JavaScript. It can be done, but the language really doesn't help. You have to fight it every step of the way - a better designed language would help you, not hinder you.

If a decent alternative to JS were suddenly to be supported by all the major browsers, the rush to get away from it would be immense.

Comment Re:AT&T U-verse Central Illinois (Score 1) 243

Why would that approach guarantee that your speed is consistently high, and why would a no-cap method be limited only to "ultra-cheap" ISPs?

Giving everyone a 1TB cap doesn't prevent congestion,

True enough, but you've reversed what I said to try to create a straw man.

Yes, applying a cap doesn't in itself prevent congestion, but what I said was the other way around. If ISPs sell a service at a price below the wholesale cost (because the market is driven - at least in this country - very much by "cheap, cheap, cheap"), then they need to find some way to make a profit. To begin with they applied caps (whilst pretending they weren't), but as that's now become politically unacceptable to the mass market, what they do instead is vastly oversell their capacity while claiming, "We'll never slow you down". Then, when your connection does get very slow, they say it's not them but other users, and that you shouldn't be so selfish.

I choose to pay a realistic price for my bandwidth, from an ISP who is perfectly clear that there ain't no such thing as a free lunch, and accept that they properly provision their capacity (which is why my link stays fast) but that I can't use more than I've paid for. Doubtless they do still oversell - although nowhere near to the degree the cheap, cheap, cheap merchants do - because there's no way I'm going to use 1 TB in a month (typical usage for our house, with a game-obsessed son, is about 250 GB), and they don't oversell enough to affect my connection speed.

You have to accept that there need to be limits somewhere.

Comment Re:AT&T U-verse Central Illinois (Score 1) 243

I rarely go over 60-70GB, but I still don't like the idea of caps. You should be paying for speed, with everyone limited to a percentage of their paid-for speed when there's congestion, while the limit would be increased during low-usage times.

That's how far too many of these ultra-cheap "unlimited" services work. The advertised price is so far below the actual cost of providing what they're claiming to provide that something has to give. The way it's done is to oversell the capacity heavily, and then no-one gets anything like the speed which they paid for, but at least there's no data cap.
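
To put illustrative numbers on that (made up, not any real ISP's figures):

    <?php
    // Back-of-envelope oversell arithmetic.
    $soldMbps = 100;   // speed each customer is sold
    $users    = 50;    // customers sharing one backhaul link
    $backhaul = 1000;  // actual capacity of that link, in Mbps

    $contention = ($soldMbps * $users) / $backhaul;  // 5:1 oversell
    $peakShare  = $backhaul / $users;                // 20 Mbps each at peak

    printf("%.0f:1 contention, %.0f Mbps per user at peak\n",
           $contention, $peakShare);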

I prefer the way my ISP does it: I never get near my 1 TB cap, but I can be confident that my speed will stay high all the time.

Comment Re: Finland (Score 1) 441

How so? Take someone that's being paid, let's say, $5000/month at the moment, and let's take a UBI of $1000/month to have a neat number to work with. With the UBI they'll be getting $6000/mo, but paying back $1000/mo for a net of $5000/mo. That's exactly what they were already getting, so where's the subsidy for the employer?

The subsidy is the $1000 less they have to pay the person to do that job, because that component of the person's worth in the market is being met by the UBI, not the employer. If the job's market rate is $5000/mo and the worker already receives $1000/mo of UBI, the employer can offer $4000/mo and the worker still ends up with $5000/mo.

The minimum wage is not the same thing. It is a required minimum amount the employer must pay, not a minimum amount paid by the public.

It's a good point. I quite like the general idea. None of this is going to be viable long term though, because we can automate all of these things too.

Yes. But there needs to be a transitional step so the people who can't handle the idea of "getting something for nothing" can get their heads around it (or die).

Comment Re: Finland (Score 1) 441

They can't just pay $X less and hope to still have people working for them though, unless the resulting wage is high enough that the employee will be paying most or all of their UBI back in taxes, in which case the $X reduction is mostly or completely just a regular pay cut.

That doesn't really address the point. Even if someone is being paid relatively a lot, the UBI still represents a subsidy to their employer, who will be paying them roughly the equivalent of the UBI less than they would be if it didn't exist.

A job guarantee relies on there being jobs available, which as we've established is kind of the problem. I guess you could invent some pointless work for someone to do, but forcing them to spend a significant chunk of their time doing meaningless busy work doesn't strike me as being better than not forcing them to do it.

There is arguably plenty of work that is not so much "pointless" as not particularly profitable. Someone to help little old ladies on and off buses, for example. Or more teachers. Or take back all the jobs around publicly funded services that have been privatised and improve them (e.g. cleaning staff).
