
Comment Re:Professionalism (Score 1) 837

I worked for a software house of 10 people. We wore what the hell we wanted unless we were facing clients; then we wore what the hell the client needed to see in order to believe we were professionals. Once the client "knew" us, we wore what the hell we wanted. We grew, and we merged with a company of equivalent size to make about 200 people. We kept our "dress code". The merged company exploded in size; the technology group represented less than 10% of the 10,000 staff globally. A big chunk of that 10% wore some company-branded hooey; we stayed with what the hell we wanted. Then someone tried to introduce the idea that we should conform to a uniform policy.

I find the requirement for "uniforms" for non-customer-facing resources to be offensive. I have enough trouble with the idea of "uniforms" for people who do face customers. What I understand is that some customers come at things with a preconception of what a professional looks like, and in order to make the connection you have to conform until you can prove that your suit is not what makes you worth paying to do your job.

I probably would have walked from my job over this issue, because it really sits at the heart of my relationship with my employer. My expertise, commitment and professionalism are measured in what I do and not the clothes I wear. If my boss thinks otherwise then he or she is a tool. If I cannot persuade them or their boss on this issue then the company is not worth working for. Period.

We all have to make compromises, and by the time this issue came up for me I was senior enough that I could have pulled weight and just ignored it, but I was holding out for all the guys in our group who didn't have that ability. I cannot overstate how much this kind of thing shits me. I don't know about your Helldesk folk, but the ones I work with fall into two categories. The good ones treat most problems like puzzles and do what has to be done to solve them; for them I would go in to bat to get 'em the right to wear what the hell they like. The others, you know the kind I mean, I wouldn't piss on if they were on fire, and I have bollocked them and their managers over their work. I would be happy if their uniform was a grey smock and a dunce's hat, just so we know what to expect...

If you cannot leave your job then suck it up and wait until you can because this kind of thing is a baaaad deal.

Comment Re:Lowering the bar (Score 1) 578

Let dumber people program and you end up with dumber programs. Way back in the year 2000 I found that most of the Y2K bugs were actually in more recently written programs in dumbed-down languages.

No, I think dumbing down the languages is the right thing to do. I made my living writing software for 20 years and never once wrote a piece of assembler. Some of the stuff we wrote was actually hard. Still never wrote a single piece of assembler. For this I am eternally thankful. Clearly almost every language we have is better (and dumbed down) compared to assembler. Don't misunderstand, I fully accept that assembler has its place, but that too is a fairly generic statement that I am happy to endorse: "I accept that X has its place".

When looking at something new, I now try to start with some of the more dumbed-down languages: Python, Tcl, Perl, etc. I even had to learn a bit of Ruby the other day because I wanted to enhance something that someone had written in Ruby. Their choice, whatever the reason. Now I have a tool that does exactly what I want and it took 2 days. Sure, not a complex task, but writing it in C or Java would have been a nightmare.

I am waiting for the "Fisher Price" language: a bunch of oversized blocks that you just "snap" together to make your applications. Sure, you only use 2% of their functionality, but the big blocks are robust and "apparently" simple to the "programmer", and all one needs to do is specify the "what" rather than the "how". And isn't that the real ideal? I am sure we've all thought it even if we were too ashamed to say it: "Aaargh, bloody machine, do what I meant, not what I coded". So the real developments in this area are to find "environments" that allow users to specify the "whats" and leave the "hows" to the "environment". The language can be as dumb as possible.

To paraphrase: "Everything should be made as dumb as possible, but no dumber".

We have quite a lot of dumbing down to be done.
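The "what rather than how" idea already shows up in the dumbed-down languages we have today. A minimal sketch in Python (the order data and function names are made up purely for illustration):

```python
# Imperative "how": spell out every step of the computation yourself.
def total_owed_how(orders):
    total = 0
    for order in orders:
        if order["status"] == "open":
            total += order["amount"]
    return total

# Declarative "what": state the result you want; the language's
# machinery (generator expression, sum) handles iteration and
# accumulation for you.
def total_owed_what(orders):
    return sum(o["amount"] for o in orders if o["status"] == "open")

orders = [
    {"status": "open", "amount": 10},
    {"status": "closed", "amount": 99},
    {"status": "open", "amount": 5},
]
assert total_owed_how(orders) == total_owed_what(orders) == 15
```

The Fisher Price end-state is just this trend taken to its limit: bigger, more robust blocks, and less and less of the "how" visible to the programmer.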

Comment London in December (Score 1) 1095

Ah, pre-Christmas in London. Oxford and Regent Street window displays, and in particular "Liberty" (http://www.liberty.co.uk/fcp/content/find_us/content). Buy some chestnuts (they may or may not be roasted on an open fire) and wander around looking at the poor folk stressing about crowds/shopping. Marvel at the crowds. Many, many Christmas drinks to be had at that time of year; try "The City" almost any evening: many pubs, many suits, many parties.

Don't bother with your laptop. Frankly I wouldn't even bother with a camera; just get a PAYG camera phone on your way in, and if you need more microSD cards as you go along just buy 'em. Besides, I would lean towards searching out the kind of fun for which you would prefer there to be no photographic evidence at all; there is a fair bit of that to be had in London.

Don't know how old you are, but: Shoreditch (Cargo Bar, anything at all on Shoreditch High Street) for the young, Kensington and Chelsea for the glam (or wannabe :-), Soho for the mixed, Covent Garden and Leicester Sq for the tourists. Other locales: Islington, Clerkenwell, Borough (try the markets on the weekend). And that's without even leaving the "Circle Line" (much).

See some stand-up comedy (http://www.99clubcomedy.com/home.html). Try a different country's cuisine every night. Pick at least one fine dining restaurant if you can; the best are superb. Definitely go clubbing; if that's not your thing, look for some live bands. Grab a TimeOut magazine and just pick stuff.

London can be really "isolating", but if you make the effort and just try to connect with people who are doing the kind of things you want to do, you'll find them (mainly the foreigners :-) really welcoming. I find that during the pre-Christmas period people are much more friendly, so it should be easy enough to do.

Comment Re:Skeptical thoughts (Score 1) 831

Didn't mean to suggest that it was named after him, but rather that it was developed in-house by Ericsson, the company. The name is "deliberately ambiguous" according to someone who should know. (http://www.cs.chalmers.se/Cs/Grundutb/Kurser/ppxt/HT2007/general/languages/armstrong-erlang_history.pdf at 3.2)

Comment Re:hmm (Score 1) 381

I looked hard at Netezza for a big project with "absurd" requirements (many 10^4 new records per second, ad hoc queryable by clients). It seemed to be the ideal solution. Nice to see it might have worked. How fast does your data grow?

Comment Re:Put the damn thing in neutral! (Score 1) 1146

I actually had this experience in my new Mazda2, and it was quite scary. I was going up a steep hill and the engine revved a little higher than normal. I did a bit of gear changing (it is an auto) and the revving continued as I pulled up to the queue of traffic at the lights at the top of the hill (with a massive truck behind me!). It was all getting out of control. What was weird was that the throttle was still responsive: depress it and the engine revved up, back off and it de-revved, but only back to well above idle.

I too was worried about the engine electronics, until I checked the floor mat and found it had slid up over the base of the accelerator. I pulled the mat back and all was OK. All in all it lasted probably about a minute, but it was very disturbing at the time. I can totally see such an event being the source of an accident. It really felt like an electronic issue at the time, until I realised what it actually was.

Comment Re:Proves Only that No Platform Is Best at Everyth (Score 1) 498

The .NET platform is a generic platform that is capable of delivering good or at least acceptable performance in most cases.

Indeed. And this has been my point about the "new technologies" for a few years (specifically in the stock exchange space, but also more generally). These new languages and platforms have lowered the barriers to creating average, adequate systems. If you use excellent people to do good design then you can really lower the implementation and operation costs of these systems and produce excellent applications. By excellent I mean cheap and fast and good, or perhaps rather a cheap/fast/good triangle with much shorter sides than the traditional model, which makes the overall compromise easier to bear.

BUT, and it is a huge BUT: these systems cannot survive at the edges of performance. You give up too much control to the "framework", regardless of which one it is: .NET, Java, Smalltalk, Rails, whatever. As a result you cannot approach the limits of performance of the hardware at your disposal. Stock exchanges are, by their very nature, a shared-resource, minimum-latency problem, which means that every CPU cycle you spend not doing _real_ work is someone not getting serviced as well as the next guy. The hope is that when you do the design and implementation right, you can take this tradeoff below the threshold of caring for the participants of your market.

At the top end of these systems you have to be using haute design and implementation to cut it.

A comparison that might shed light is cars: family sedan, Ferrari, F1. The family sedan of today is a highly efficient, cheap and really very good quality implementation of a car, but it just can't compete with a Ferrari on performance. And whilst a Ferrari will get close to an F1 in many aspects of performance, all the extra engineering in the F1 car is designed to wring out the last iota of performance, which means that no matter how good the Ferrari is, it can _never_ close the gap to the F1 car. (As a sidelight, note that the failures that result from a fault in each of these types of vehicle probably correlate to what goes wrong in the system space as well. Not pretty!)

As such, any system written within the constraints of a "Consultancy/Outsourced/Managed Code" environment simply cannot be at the leading edge of performance.

It is interesting to try to compare the characteristics of a trading system with the more traditional HPC applications for which supercomputers would ordinarily be used, as well as with other high-requirement problems. I recall looking at VISA's data processing throughput: on their busiest day in history (23rd of December 2006, IIRC) they processed 180 million transactions, and their IT spend was phenomenal. I can't find that data just now, but in the year to June 30 2009, VISA processed just under 40 billion transactions, which translates to about 1300 per second. A stock exchange is an order of magnitude more transactions than that (perhaps more, depending on how you count it), and the handful of seconds you wait at the checkout for your VISA payment to authorise is 4 orders of magnitude longer than a stock exchange requires (10 seconds compared to 1/1000 of a second). That means that a stock exchange is constrained in two dimensions by a total of 5 orders of magnitude more demanding requirements than a system as comprehensive as VISA's. I have no real experience with a climate model or nuclear simulation, but I get the feeling that it is a huge-dataset issue and your "release valve" is that, whilst execution time is ideally to be reduced, the simulation takes as long as it takes and you wait until it is done. Certainly more gigaflops than a stock exchange, but not as demanding in terms of the number or rigour of the constraints under which you must operate.
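The back-of-envelope arithmetic is easy to check. A quick Python sketch, using only the figures quoted in this comment (which are from memory, not authoritative):

```python
import math

# Figures as quoted above (from memory, not authoritative):
visa_tx_per_year = 40e9              # just under 40 billion tx, year to June 2009
seconds_per_year = 365 * 24 * 3600
visa_tx_per_sec = visa_tx_per_year / seconds_per_year
print(round(visa_tx_per_sec))        # -> 1268, i.e. "about 1300 per second"

# Latency gap: ~10 s checkout authorisation vs ~1 ms exchange response.
latency_ratio = 10 / 0.001
print(int(math.log10(latency_ratio)))  # -> 4 (orders of magnitude)

# One order of magnitude more throughput, plus four orders of magnitude
# tighter latency, is the "5 orders of magnitude more demanding" claim.
```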

To make that work, you kinda need to be down at the metal.

To make sure that you also make it work 100% of the time you need to be bloody careful.

All in all it's a funky, funky problem, and we haven't even got into the hard/interesting stuff about trading more complex products than the simple equities that we are talking about for the LSE and its direct competitors. Those products add another dimension of constraints, where the number of shared resources increases and the systems must process "many" of the transactions of these underlying equity markets and then provide the same performance metrics to their own customers.

Control over your algorithms is not just important, it's mandatory. The frameworks that work so well in many environments just don't cut it under that set of constraints.

Comment Re:Out of context theator (Score 1) 498

If you're over 300 kilometers away from the server, a one-way transaction will take more than 1 millisecond at the speed of light anyway. If millisecond gaps were that important, you'd hear about global disparities directly related to distances from the stock exchange servers.

Which is why, whenever a client asks that exact question about how distance affects the "customer experience" of my sub-millisecond trading system, I tell them: move closer. Not only is it a serious answer, but almost all the serious performance exchanges are offering co-location services to get the client systems onto the same LAN, if not the same backplane.
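The parent's figure is easy to verify. A quick Python sketch, using the standard vacuum speed of light; the two-thirds fibre factor is a rule of thumb I'm assuming here, not a measured value:

```python
C_VACUUM_KM_S = 299_792   # speed of light in vacuum, km/s
FIBER_FACTOR = 0.67       # light in optical fibre travels at roughly 2/3 c

def one_way_latency_ms(distance_km, in_fiber=True):
    """Minimum one-way propagation delay, ignoring switching/processing."""
    speed = C_VACUUM_KM_S * (FIBER_FACTOR if in_fiber else 1.0)
    return distance_km / speed * 1000

# 300 km away: ~1 ms even in vacuum, ~1.5 ms in fibre, before any
# switching or processing delay -- hence "move closer" and co-location.
print(round(one_way_latency_ms(300, in_fiber=False), 2))  # -> 1.0
print(round(one_way_latency_ms(300), 2))                  # -> 1.49
```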

If I recall correctly, there are even some patents about co-locating the client algorithm inside the exchange trading system itself (broadly speaking), although I am certain that, if my recollection is correct, there is no formal implementation of the idea. It's something I have toyed with myself for a number of years. Kinda funky. If you can manage the risk of a rogue client, it's a trader's nirvana.

Comment Re:Unix has dominated this sector for years... (Score 1) 498

The insanity was not "Windows". Trading systems use very few OS-based services. As I have said before, they basically need a TCP stack and some pretty basic IPC; the extent to which Windows is good enough to provide this is debatable, but it is not obviously insane. The tuning of the OS to make these limited services screamingly fast is the point at which something like GNU/Linux has benefits, in my experience.

Ah yes, Sun and finance. They certainly shipped a lot of pizza boxes when PA-RISC was the only non-IBM Unix equivalent, but in those days it was probably Tandem that was the more important player in the stock exchange space. Interestingly, I think it was as much if not more Intel, rather than GNU/Linux, that drove Sun out of the machine rooms. Perhaps the Linux/Intel nexus is too hard to unwind to make that call, tho'.

Comment Re:Over-analysis (Score 1) 1021

Except when the author pulls off a stroke of genius to avoid the technobabble issues, a la "Dune", where Herbert, writing in the late 60s, avoids the problem of where computers would be in 10,000 years by having them "banned", and deals with FTL travel by making it a monopoly-supplied service. Similar genius with nukes and "friggin' lasers". My point is that the "Lit" analysis of these techniques is thoroughly worth doing, since it can help to make timeless future worlds and allow the student to understand what works and what doesn't.

Similarly, I find the absurd scope of something like Greg Bear's "Moving Mars" or "Forge of God"/"Anvil of Stars" to be equally disarming, because the scope of the tech is so bizarre that it is about the surrounding story rather than whether the tech might actually work.

Comment Re:No case (Score 1) 646

On the contrary, by way of example: if, in good faith, you buy some timber and use it to build a house, and it is subsequently shown that the timber was stolen, it is very rare that the courts will make you pull down your house. The public policy implications of that result would be undesirable.

In this case the guy in question has clearly suffered loss as a result of Amazon's negligence (and that would be an interesting twist on his cause of action, to couch it in terms of negligence!). I hope he wins and wins big. Amazon should be punished for this kind of unilateral action. It's just soooo wrong to take back without checking first.

Comment Re:Mulitple Problems (Score 1) 438

Whilst the ones you mention have intense processing requirements, they are fundamentally different to stock exchanges. Indeed, in many ways they are vastly more demanding than exchanges, but in two critical ways, uptime/integrity and response time, an exchange is horribly, horribly demanding.

Being up 99.9% of the time is wonderful; getting to 99.99% takes another order of magnitude of spend. Getting to exchange-level 99.999% takes another order of magnitude of spend (sort of). Given a 12-hour business day, 99.99% means about 18 minutes of downtime a year; realistically, that means one downtime event every three years. Exchange level means zero downtime events, and every transaction coming through the system has the potential to be legally binding and worth millions. Oh, and the customer demands a guarantee that the response time will be less than 10 milliseconds (increasingly, less than a millisecond). And this is just for a simple equity market. It only gets more complicated from there.
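The "about 18 minutes" figure works out if you assume roughly 250 trading days of 12 hours each. A quick Python sketch (the trading-day count is my assumption, not from any exchange spec):

```python
# Downtime budget per year at various availability levels, assuming a
# 12-hour business day and ~250 trading days a year.
TRADING_HOURS_PER_YEAR = 12 * 250  # 3000 hours

def downtime_minutes_per_year(availability, hours_per_year=TRADING_HOURS_PER_YEAR):
    """Minutes of allowed downtime per year at a given availability."""
    return hours_per_year * (1 - availability) * 60

for label, availability in [("three nines", 0.999),
                            ("four nines", 0.9999),
                            ("five nines", 0.99999)]:
    print(f"{label} ({availability}): "
          f"{downtime_minutes_per_year(availability):.1f} min/year")

# four nines -> 18.0 min/year, matching "about 18 minutes";
# five nines leaves only ~1.8 min/year, i.e. effectively zero events.
```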

I am not suggesting that Linux is not the answer (indeed I advocate it in the exchange problem domain myself), just that the lessons Google, Yahoo and Amazon have to teach this problem domain are limited, and that Windows can be made to work in the environment.
