
Comment Perl's problem: popularity, not functionality (Score 3, Interesting) 133

I've been around long enough to see Perl go from the glue of the internet to an object of scorn. It's no longer the preferred tool of sysadmins, nor the easiest way to write web applications outside of raw C. I've had a good deal of time to consider why that's the case, and I keep circling back to it being an issue of popularity.

We like to think that we're engineers, scientists, deep thinkers, whatever - and that we as software devs therefore make sound, evidence-based judgments, at least more often than other disciplines. The fact of the matter is that we're just as led by emotion as anyone else. We have 'Holy Wars' over OSes and languages and frameworks, and most of them boil down to justification of personal preference more than anything else. Not features, not availability, just personal preference.

In that light, I've seen a lot of fad languages in the last decade or so. I usually refer to them as toy languages, though that phrase has a number of other definitions; I simply mean they're a new toy to play with. Scripting languages are especially prone to this because they tend to have a lower barrier to entry. In recent memory, it went something like Ruby + RoR => Python => Scala => Javascript via node.js => and now the big thing I'm hearing about is R. The claim is that each one is so much better, but the reality is that each is just so much newer, and the differences in implementing identical functionality are less important than the flash and sizzle. Even when the sizzle is backed by something useful, people stop paying attention once it stops. In fact, most of these languages have been around for a long time - several of them almost as old as Perl itself. They've just briefly become popular, without making any sort of surprising forward leap in capability or features.

Of course, one big part of the popularity is maintaining buzz, and with what was effectively a 15 year hiatus from any real forward development, much less promotion, Perl dropped out of the limelight.

This is pretty standard though. People seem to forget so quickly; at one time, ColdFusion, Java Applets, Flash and PHP were the darlings of their day. Perl too.

Now, if someone were to take Perl 6, produce a framework for it that tried to force a remedial coding style (Python), require webapps to follow a specific directory layout and naming convention (RoR, many JS webapps) as well as page templating (PHP, JSPs, Razor/Webforms, etc.), add some human-friendly data query language features (Java Streams, C# LINQ), provide tools for automatic dependency search and import (Maven, Ruby Bundler), and then really play up the functional aspects of the language, perhaps Perl would rise again too.
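For the "human-friendly data query" piece, here's a rough sketch of what the Streams/LINQ style looks like, shown with JavaScript's built-in array methods rather than Perl 6 (the sample data is made up for illustration):

```javascript
// Sketch of Streams/LINQ-style querying via JavaScript array methods.
// The `orders` data is hypothetical, purely for illustration.
const orders = [
  { customer: "alice", total: 120, shipped: true },
  { customer: "bob",   total: 45,  shipped: false },
  { customer: "alice", total: 80,  shipped: true },
];

// "Total shipped revenue per customer" reads almost like a query:
const revenue = orders
  .filter(o => o.shipped)               // WHERE shipped
  .reduce((acc, o) => {                 // GROUP BY customer, SUM(total)
    acc[o.customer] = (acc[o.customer] || 0) + o.total;
    return acc;
  }, {});

console.log(revenue); // { alice: 200 }
```

That chained, declarative style is the kind of thing that makes a language feel "new," even when the underlying functionality has been available all along.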

If those are really the features people are looking for, that is. I deliver that line with only marginal sarcasm; I note that the number one complaint against Perl is ugly code, which we know is the domain of the author, not the language - and that other languages 'fix' this by taking away developer agency.

Even without those new features, and though I don't use it as often, I still like the ol' "swiss army chainsaw" just a little bit more than these other choices. I guess you could say it's just a matter of personal preference.

Comment Re:There's a few problems here. (Score 1) 149

The problem is that with normal smoking, you're using split wood. It has a non-uniform size, and has been seasoned to different levels. Some pieces may be greener than others, even in a batch of mostly seasoned wood. So some will burn hot and fast, and others slow and smoky (and some of those will be too smoky and not fully combust, leading to bad-tasting byproducts). A big piece of green wood won't burn the same as a small piece of dry wood, but by carefully adding the right wood at the right time - for the right external temp and humidity - and then further managing the airflow by various mechanisms, such as extending the chimney or using fancy airflow redirectors in the bbq itself, a cook can maintain consistent heat with the right amount of aromatic smoke to enhance the flavor of the meat.

There are a lot of variables that come into play here, and there really is a whole science to it.

That being said, you could use uniform fuel for uniform heat. That usually means pellets. Of course, that in turn means you're probably not going to get flavorful smoke, and flavorful smoke is the whole point of smoking in bbq.

You may as well just use an oven at that point though. It's easier to manage a consistent heat, and you don't have to worry about smoke. The quality of such a cook would very likely not suffer from injecting some 'liquid smoke' at that point, and it's so much easier.

Comment Net benefit of standing desks unconfirmed. (Score 3, Interesting) 340

The science of it is basically this: when sitting, your metabolism slows, you burn fewer calories, and all the fun that goes with that follows - a higher likelihood of being overweight, thus higher blood pressure, cardiac issues, and so on. We have studies that prove this, too.

So, don't sit, right? Well, standing isn't very good for you either, not for long periods of time. We're lacking any really hard science on what the optimal time period is, although we know it varies from person to person. We do know that you're more at risk of immediate health problems from long periods of standing than from sitting (which produces longer-term, less immediate issues). For example, even with a soft gel mat, one stander ended up with medical problems after only a few weeks. They're not just an anomaly, either; back pain, carotid atherosclerosis (a circulation issue), varicose veins, pinched nerves, and more are associated with long periods of standing.

The fact is that we don't really know how much standing is enough to ward off the dangers of sitting, and worse, we don't know how much standing is too much and will result in health problems. There's probably an optimal healthy point, but we don't have any studies that show where that optimal healthy point is on average, much less how it needs to be adjusted for an individual.

It's also important to note that the positive claims associated with standing desks that are not about physical well-being - increased mental capacity, creativity, memory, attentiveness, productivity and so on - are largely due to recirculating personal anecdotes, which we know carry strong bias and use no objective measures for comparison. What few studies there have been show no evidence of benefit, nor of detriment. On an obvious note, though, they do show that treadmill and cycling desks DO reduce attention and productivity by a significant amount, and those haven't been shown to produce any impressive health gains either - users average a weight loss of only about 3 lbs a year, for example, and that's about the only study you'll find on the subject!

What this all means is that, scientifically speaking, advocating for the health benefits of a standing desk is about the same as advocating for the health properties of barefoot running, clay cleansing (or really any cleanses, including charcoal, pickle juice, and others), and the whole genre of fad diets.

There's no scientific proof that they're a net benefit, which means you shouldn't assume they provide one. Until then, they're just standard junk science - taking a fact or finding and running with it past the evidence and on into speculation and pure fantasy. In fact, these are more akin to the fad diets, in that you're not only not gaining a benefit, you're that much more likely to cause yourself harm. Standing desks are the new fen-phen.

If you're worried about staying healthy, skip the fads and just add an exercise plan to your day. Take a 40 minute walk at lunch. Maybe work out a few times a week. Eat healthy, but more importantly in most western countries, eat a proper portion size. That's all it really takes.

Comment There's a few problems here. (Score 3, Informative) 149

Just to name a few:

1. The water pan is there to keep the brisket moist, which not only helps keep the meat from drying out but also aids in smoke penetration (which is where the flavor comes from). It's not there to catch drippings. In the case of an egg smoker, it's also there to reduce the impact of differences between burns.
2. Offset smokers seem to be preferred by most "pitmasters"; direct heat really means you're grilling, not smoking, and that means you're mostly cooking over coals, rather than producing consistent smoke with an open-flame fire for the duration of the burn, and that means you're not getting enough flavor.
3. The "fuel" - given rather short shrift here - is one of the more important parts of bbq, and very hard to automate. Green wood, seasoned, large chunks or small, each has an impact on the immediate heat, the curve that the heat follows as it burns, and of course, the flavor via the smoke.
4. 220 lbs of brisket is decent, but good brisket places do 2000 lbs a day. If you're looking for something of quality instead of, well, acceptable, you're going to need to spend more time experimenting to figure out how to make a good brisket.
5. In order to have a chance to regulate the temperature well - and not keep cycling through blasts of heat and cooling - they'll need multiple temp probes, and an awareness of the outside temp and humidity as well, since ceramic insulation or no, the external environment will play a huge factor.
6. If the flat - the lean part of the brisket - is falling apart when you pick it up, the brisket has been overcooked. It means the point is going to have the consistency of pudding - or it's been destroyed entirely and is completely dry. It's harder to avoid this in a direct-heat smoker rather than an offset.

It should probably look like this. I remember seeing that shot in Franklin's book, Franklin Barbecue: A Meat Smoking Manifesto ... which yes, sounds pretentious, but since he's been lauded as the best BBQer in Texas several times over, and that book is #1 in BBQ & Grilling books on Amazon, maybe he's allowed to be a bit pretentious. Go get that book if you're at all interested. Apparently it was fact-checked by Harold McGee.

There are more things I could pick apart, too. I know, I'm sounding like a BBQ snob, but the fact is that I'm not very good at cooking it, and I haven't had a lot of experience. However, like any geek, I did my research. I read around. I looked things up on the internet. I talked to cooks. I volunteered at some cookoffs. I think I have just barely enough experience to recognize when someone else is doing it poorly. Anyone who's done this at all isn't going to be very worried about this invention, since, well, the parts you can automate are the parts least likely to affect whether your brisket is going to taste good. You may as well have just stuck it in an oven with a few blocks of aromatic wood in a water pan underneath, at 275, for an hour and a half per lb.

Comment 2 words: lockout, tagout (Score 4, Informative) 342

C'mon folks. This is basic stuff when working with any hazardous machinery. This is entirely human error, and the 'robot' aspect of it is unimportant; the word 'machinery' would have been less provocative. It's about the same as saying "Factory worker dies after jumping into industrial tire shredder with insecure controller hardware". The controller has nothing to do with it.

Granted, Europe doesn't have the same OSHA requirements as the US, but still, it's pretty obvious.

If you're not familiar with this concept, here's a summary & scenario.

Summary: You use a device to physically stop the operation of the machine that requires a lock, and then you keep the key to that lock with you so only you can re-enable the machine.

Scenario: You need to rewire half a building. You shut down the power and lock the panel so no one can turn it on. You start work, and now your hands are full of wire. Meanwhile, a co-worker's air compressor loses power because it's plugged into that downed grid. He comes over and wants to turn the power back on, but since you have the only key, he can't. As a result, you stay alive. Alternatively, you don't lock the panel, and your co-worker electrocutes you.

Comment jQuery makes a nice introduction to javascript (Score 5, Insightful) 126

If you're doing front end web development, at some point, you're going to have to become good at Javascript.

You've got all these different frameworks where everything gets abstracted 3 redirections deep into a pseudo-MVC layer populated with promise events and a callback system that works well for 80% of the job and makes 20% incredibly hard, and STILL in the end ties into the DOM, and is just as breakage-prone, all the while explicitly demanding you adhere to their code organization because that way they can encode configuration into class names and function invocations, and all of them claiming the others are bad not because of intrinsic issues but because they don't make the same ideological architectural choices like one-page apps or putting business logic (including navigation!) in pages - even though it's more predicated on popularity than functionality, and ALL of that built on top of the mess that is Javascript, and you are well and truly sunk, my friend. Not even a run-on sentence has enough space to cover all the horrible stuff that lies out there for the aspiring web developer.

So I recommend learning Javascript. Once you have a good strong feel for the fundamentals, once you've treated Crockford's "JavaScript: The Good Parts" the same way you've treated K&R's "The C Programming Language," and you're solid on the basics, THEN jump off into these new worlds of overly pretentious web designers delving into their first languages and claiming you can't write a decent site without coffeescript, haml, and sass. So you can put them in their place, or at least understand how much they're complicating what is, after all, not that difficult a thing. Even experienced developers write really, seriously, completely awful code in Javascript, thinking they're doing a decent job because they never invested the time and energy to understand why they're not.

The only problem is that Javascript is bad. I like learning by example, but that is so not the way to go with Javascript. Examples abound - but they're more than likely to be bad examples. There are reference books, like "Javascript: The Definitive Guide," but they're not great for learning. So what I do is tell people to learn jQuery first.

With jQuery and a few minutes, they can pick up the basics. Someone who already knows CSS or writes code for a living should feel productive in less than a full work day. It's easy, it's fast*, it handles browser differences**, there are well-documented examples of how to use each feature on the jQuery doc pages, and best of all for me - it's non-intrusive, so you can fit in the vanilla Javascript you're slowly picking up without needing to learn any framework-specific configuration or magic function naming, or guessing why a method isn't being called because it was never added to the object prototype by your pseudo-controller registration.

For me, I hate frameworks that are all-or-nothing solutions - big, heavyweight*** frameworks that tell you "This webpage is now a ____ application". First, they're never perfect solutions, and second, they always make some percentage of the work easier at the cost of making the rest painful. I think when learning a new language, especially if you've already got one or more under your belt, forcing someone to do it your way, or making familiar patterns or constructs hard, is a huge detriment. Besides, how can you expect someone to learn if they're not allowed to get it wrong? You end up with folks who don't understand the importance of things like memory management because they never had to do it, and how useful are they during crunch mode?

So, yeah. Learn jQuery. Learn DOM traversal and manipulation. Learn event registration. Learn AJAX. Try out some animations. Write a plugin or two. Many of these things are appearing natively in modern browsers now, but once you understand the concepts - the basics that are endemic to web development - and you can be productive, then you'll have the grace time needed to properly learn actual Javascript. Then you get out your Crockford book, and things like prototypal inheritance, closures & scope, and functions as first-class data will start making sense.
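For anyone at that stage, here's a minimal sketch of those three Crockford concepts in plain Javascript - no framework, no DOM, just the language (all names are invented for illustration):

```javascript
// 1. Closures & scope: makeCounter's local `count` survives between
// calls, private to the returned function.
function makeCounter() {
  let count = 0;
  return function () { return ++count; };
}
const next = makeCounter();
next(); next(); // the closed-over count is now 2

// 2. Prototypal inheritance: objects delegate directly to other objects.
const animal = { speak() { return this.name + " makes a sound"; } };
const dog = Object.create(animal); // dog's prototype is animal
dog.name = "Rex";                  // speak() resolves via the prototype chain

// 3. Functions as first-class data: pass and return them like any value.
const twice = fn => x => fn(fn(x));
const addTwo = twice(x => x + 1);  // addTwo(5) applies +1 two times
```

None of this is framework magic; it's the core of the language, and it's what makes the framework magic comprehensible later.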

Then you can tackle those foul turds of frameworks that are sold as the newest and best and maybe you'll get them to work for you, instead of the other way around. At least until everyone switches gears 3 months later when the next best steaming pile is excreted.

* - enough. It's fast enough.
** - most browser differences, and it's better than rolling your own in any case.
*** - heavyweight in this case meaning they make all my architectural decisions for me, and I have little or no ability to work around them.

Comment Re:Tangentially related: Race-based admissions (Score 1) 473

I'm not sure where to start. I'd like to do a point-by-point, but there's no need. The issue on my part is very simple:

People should be judged based on their own merits, not the color of their skin or their gender, not based on the socioeconomic class they're from, not by handicap or caste or because someone has a quota or a chart that shows an irregular curve they'd like to smooth out.

What you're calling 'social fairness' is actually quite unfair to society or individuals.

The issue with affirmative action-type policies is that they say, in no uncertain terms, that capability is not as important as quotas. That folks who have been deemed capable should not only be denied opportunity, but that the opportunity should be given to someone who has not been deemed capable. That can't be considered fair in any society - and before you complain, consider that these are exactly the policies that apartheid and theocratic governments and the pre-civil-war US used: people judged on their skin color, religion or other minority status, rather than their ability.

This is not just about treating people as individuals rather than as members of a group being the ethically and morally correct stance. Treating people who are not capable of a job as qualified, and allowing them to occupy important positions, is a danger to all of society. How that can be lost on you, I don't know. This is the very root of the problem with nepotism and cronyism!

Perhaps this straw man scenario might help: "Your doctor isn't technically qualified to perform your gall bladder surgery, but we needed someone from a low income family on staff to round out the numbers. So ... yeah... good luck in there!"

I don't think we're there yet with doctors, but it is already happening with police, firefighters, and other emergency personnel. How can you justify this - saying that it's acceptable, or worse, preferable - simply because some arbitrary binning of the population doesn't turn out equal numbers? I can't wait till someone notices there aren't enough people with low intelligence being recognized as geniuses and tries to fix that. ... yes, I know it's absurd, but the whole concept seems crazy to me.

Comment Tangentially related: Race-based admissions (Score 3, Interesting) 473

I was listening to NPR the other day, and this story popped up: Examining Race-Based Admissions Bans On Medical Schools.

The short version is: certain states have ruled that colleges are not allowed to consider race as part of their admissions criteria, and medical schools are noticing that black and latino graduation numbers have decreased since then.

The intent was to focus on merit-based evaluations. Seems noble, right? We want the best doctors we can get. However, the effect appears to be to reduce the number of minority students admitted. This, of course, has people outraged, and scrambling to find ways to work around the system - like sending recruiting teams to primarily-black or latino high schools, and hoping that will increase the applicant numbers.

What shocked me is that everyone is dancing around the race issue (and only certain races; not, for example, Indian or other asians). Everyone agrees the minority graduation numbers have dropped because individuals from a given group don't actually meet the admissions criteria. They're not qualified to be students or doctors. That apparently hundreds or thousands of people's failing grades were ignored because of their race. That prior to the no-race rule, doctors, in this case, were not necessarily the most well qualified individuals for the job. In fact, some significant percentage of them should not have been allowed in.

This trend isn't new either. When I was a lifeguard back in the 90's, the requirements changed from being able to swim a specific distance in a certain time, to removing many of these fitness requirements altogether. The reason? It was apparently unfairly eliminating people with poor physical ability or handicaps. The new focus was to do all the lifeguarding from the side of the pool: hooks, ropes, and life preservers.

Heck, just last month there was a minor kerfuffle about fire departments force-promoting, and expediting the promotions of, minorities over whites.

I can't help but see this girls-only computer science focus as another of these sorts of ill-considered plans, where capability takes a back seat to minority inclusion and political correctness. Sure, it's not as vital as our doctors, firemen, and lifeguards, but it's the same line of thought. In our rush to be politically correct and all-inclusive, we mistake equal numbers for fairness, and it serves no one well except those promoting our differences.

Am I the only one who thinks this is crazy? Like Harrison Bergeron crazy? I can't be the only one, right?

Comment Re:Why not nursing (Score 1) 490

Studies seem to indicate that this is largely due to a constantly shifting cultural landscape which says, for example, that women make better psychologists and biologists now, whereas 30-40 years ago that was not the case. It's not that men in general are hardwired to avoid nursing and want to be garbage men, or that women are predisposed to be secretaries and beauty stylists; rather, the culture people are brought up in acts as a filter that determines which jobs should be held by which gender based on cultural norms - based on no objective differences in ability or merit at all.

So it is the case that our culture is steering gender-based career selection, in an irrational, non-directed way. The bigger issue is: does it really matter all that much? Is it actively or passively hurting anyone? Would forcing equality, especially when it requires working counter to cultural values, really be any better?

My personal belief is that it won't, but there are a lot of loud, angry people out there who seem to think otherwise.

Comment Re:Equality (Score 2) 490

I'm assuming the article is referencing this data: http://www.randalolson.com/wp-.... That is a pretty striking curve, though there are two interesting points to make:
      - Women have made consistent gains year to year in nearly every engineering and "hard science" field, some more than others (biology is a standout)
      - The absolute number of women CS graduates is actually increasing as well - it's just that male graduate numbers are increasing at about twice the rate.

However, contrary to your claim, there are many, many people trying to ask why. There's loads of speculation, but the explanations that seem most reasonable to me point to a large-scale cultural shift that considers women better youth educators (pre-highschool teachers), psychologists, journalists, biologists, and pharmacists, along with a large drive into the agricultural job market. Studies have shown - as referenced elsewhere - that these cultural values are imposed early, and one of the major sources is the female child's mother, though popular media also drives some of this. It goes the other way too, with the cultural acceptance of male interest in video games potentially acting as a familiarization gateway to CS careers.

But really, we're only investigating ~that~ question because we're trying to solve a bigger one: How do we get more women with CS degrees and/or working as software engineers? If the studies are right, it seems the best way is to train new parents to stop limiting their children.

More important in my mind, though: is that a valuable goal?

There is some greater potential for creativity in problem solving - meaning things like faster software and better interfaces - when you include a diverse body of people, but it's hard to measure that potential, or to determine whether it's been realized. There's nothing to indicate that, given two individuals in this field, one can be expected to perform any better than the other based on gender, so if you myopically focus on gender with disregard for actual merit, you run the risk of hiring the worse performer.

Generally, I'd say this is more of a non-issue. As we raise an increasingly tech-savvy generation, and the negative cultural stigma of IT workers falls by the wayside, I think we'll naturally see more women in the field, as the work is fairly light and the pay is reasonable. At the same time, I don't think deliberately pushing towards that goal really serves the public or companies or women in any real way, and it may even be hurting them, as per the article's original question about gender stereotypes.

Comment Re:The Web needs a lot of things (Score 1) 126

This is nothing we're not already doing; the difference is that right now it's incredibly difficult to do well, and the result is an intractable mess. Markup - and CSS is just an externalized form of markup - lacks any real power. It's simplistic, which makes it approachable, but not very flexible or adaptive. In order to achieve even simple goals, we use fairly complex and increasingly obscure and inflexible markup. When that fails, we turn to javascript to provide liquid layouts that work, and cart this baggage of per-page, non-reusable javascript around with us, all of it ready to break as soon as we make any changes - or use a viewport we didn't consider.

What I'm suggesting is that we provide, in web browsers, the same controls we have for generating a user interface in local software. That means being able to programmatically adjust layouts, styles, etc., without having to wrap javascript around them to provide a responsive interface or implement common layout requirements. It means doing all the things we already do, but in a way that improves readability, maintainability and future modification.

Basically, it means internalizing something like Flash, without the proprietary nature and other detriments.

That's pretty much what this WebAssembly bytecode would do, and what I've been advocating for.

What you're talking about seems to have ... absolutely no basis on either the article or my comment. I'm not even sure what it is you're suggesting. Letting other people run any program on your computer through your web browser? That does sound stupid.

Comment Re:The Web needs a lot of things (Score 4, Informative) 126

I have also been saying this for years.

It's nice that HTML and CSS were accessible to the masses; without their simplicity, we could never have reached the level of adoption we now have. See the argument over the MEDIA vs. IMG tag from way back for a good example of that. Now, however, that same simplicity is holding us back.

What we need is a language with flow control, variables, functions, templating, the whole 9 yards. Not just on top of HTML & CSS, but as part of it. So we can dynamically change the size of page elements based on other page elements (think layout managers) without making incredibly complex and highly specific css for a variety of media screen sizes. Common features like centering a dialog box or a three-column display with header and footer sections shouldn't be a hack you need to ferret out, that may behave badly with long or short pages. It'd be nice to explicitly link action management with page elements and avoid much of the code required to identify actors and route events. Heck, some sort of built in asynchronous request mechanism might be nice, rather than having to write custom javascript each time.
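To make the layout-manager idea concrete: this is the kind of arithmetic that today has to live in per-page script, sketched as a pure function. Everything here - the breakpoint, the pixel sizes, the field names - is hypothetical, chosen only to illustrate the concept:

```javascript
// Minimal layout-manager sketch: compute a header/footer/three-column
// layout from a viewport size. All sizes and the 600px breakpoint are
// invented for illustration.
function layout(viewportWidth, viewportHeight) {
  const header = { height: 60 };
  const footer = { height: 40 };
  const bodyHeight = viewportHeight - header.height - footer.height;

  // Fixed sidebars, fluid center; collapse the sidebars on narrow viewports.
  const sidebar = viewportWidth < 600 ? 0 : 180;
  return {
    header, footer,
    left:   { width: sidebar,                     height: bodyHeight },
    center: { width: viewportWidth - 2 * sidebar, height: bodyHeight },
    right:  { width: sidebar,                     height: bodyHeight },
  };
}

const wide = layout(1024, 768);  // center column: 664 wide, 668 tall
const narrow = layout(480, 768); // sidebars collapse; center takes all 480
```

The complaint above is that this trivial computation currently has to be smeared across media queries and resize handlers instead of being expressed once, declaratively, in the layout language itself.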

What we don't need is another life support extension to a markup language.

Comment Re:Lack of mainframe skills maybe? (Score 1) 96

This depends on a number of issues.

There are a few parts to a banking system, and the 'core' is usually on a mainframe. The name 'core' in this context means "the software system used to store financial account data" - a database, for the most part. It stores the account data, and occasionally a few other pieces of information, but mostly the account data. Account data never really changes, so neither do these "core" systems. They're also 20-30 years old, and so rarely break anymore. They're time-tested.

However, modern banks do not use that system, or usually even a mainframe, to perform the processing. First of all, most banks won't do real-time transactions. They make it look like real time to end users, but on the back end, there are delays. Instead, they have proxy authorization systems that try to manage the minute-to-minute data and authorize transactions with the info they have. When you use an ATM and you're not at a bank location - or you are, but it's a distant branch - you're probably using proxy authorization. If you're sneaky, and you've got someone on the other side of the country, you could probably withdraw the 'same' money twice before the proxies reconcile. Eventually it'll catch up to you, but it may work in the short term.
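That "same money twice" window boils down to each proxy authorizing against its own stale snapshot of the core balance. Here's a toy model of it - not real banking code, and every name and number is invented:

```javascript
// Toy model of proxy authorization: each proxy checks only its own stale
// snapshot of the core balance, so the overdraft surfaces only when the
// core reconciles. Purely illustrative.
const core = { balance: 100 };

function makeProxy() {
  return { snapshot: core.balance, authorized: [] };
}

function withdraw(proxy, amount) {
  if (proxy.snapshot >= amount) { // checks the local snapshot, not the core
    proxy.snapshot -= amount;
    proxy.authorized.push(amount);
    return true;
  }
  return false;
}

const east = makeProxy();
const west = makeProxy();
withdraw(east, 100); // approved: east's snapshot shows 100 available
withdraw(west, 100); // also approved: west never saw east's withdrawal

// Reconciliation applies both authorizations against the real balance.
const total = east.authorized.concat(west.authorized)
  .reduce((a, b) => a + b, 0);
core.balance -= total; // now -100: the overdraft has caught up
```

Real systems narrow this window with cross-proxy chatter and per-day limits, but the fundamental trade of availability against consistency is the same.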

So we've got proxy authenticators that talk to each other and the core and keep the day's running total.

Then come the clearance systems. This is the end-of-day processing where all account transactions are reconciled. These started out on mainframes, but most modern banks have moved off them, leaving only the core behind. That might be because the clearance processors are updated fairly frequently. They need to be aware of things like when (and how) to charge fees, customer plans that allow certain usage rates, taxes, court-mandated child support deductions, all sorts of things. Any time there's a new plan or program or account type or feature, the clearance nodes need to be updated.

So when do they do their thing? The clearance process goes through all the proxy authorization nodes and does all the accounting that's needed, and only when confirmed by a bank employee does it finally update the core with the new values. This usually happens once a day, but busy banks may run it 4 or more times (though I rarely hear of anyone even using 2, except for special circumstances where it's manually triggered).
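The shape of that clearance pass can be sketched like this - fold each proxy node's day of transactions into the core balances, apply the rule set, and only commit after sign-off. The fee rule, account numbers, and record layout here are all invented for illustration:

```javascript
// Toy end-of-day clearance: merge each proxy node's transactions into the
// core account balances, applying a made-up overdraft fee, and only after
// a confirmation flag is set. Illustrative only.
const coreAccounts = { "001": 500, "002": 50 };

const proxyNodes = [
  { txns: [{ acct: "001", amount: -40 }, { acct: "002", amount: -60 }] },
  { txns: [{ acct: "001", amount: 25 }] },
];

function runClearance(accounts, nodes, confirmed) {
  if (!confirmed) return accounts; // a bank employee signs off first
  const updated = { ...accounts }; // core is untouched until commit
  for (const node of nodes) {
    for (const t of node.txns) {
      updated[t.acct] += t.amount;
      if (updated[t.acct] < 0) updated[t.acct] -= 35; // invented overdraft fee
    }
  }
  return updated;
}

const cleared = runClearance(coreAccounts, proxyNodes, true);
// "001": 500 - 40 + 25 = 485;  "002": 50 - 60 = -10, minus the 35 fee = -45
```

The important property is that the original balances stay untouched until the confirmed pass commits, which is exactly why a broken clearance run can delay, but not destroy, the day's transactions.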

So I'd guess that the processing is probably not on a mainframe. It's probably in a cluster of servers running the clearance processing, and they probably do reasonably frequent updates. Some percentage of the time, these updates are what cause the "loss" of data in the first place.

Comment Happens all the time. (Score 3, Informative) 96

So, I used to write banking software for a living, and that included being on the damage control team when things went south. Which they did. A lot. And which were brought to the technical staff's attention, without fail, every Friday at 4 PM - but I digress.

This isn't really a big deal.

This sort of thing happens on almost a daily basis. I'd say that 1 in every 500 banks "loses" a day's transactions each week. From hardware failure to networking problems, to someone entering bad data to simple bugs in the software (usually data-dependent and hard to source), and even an occasional overwrite from a backup. Something breaks.

Once we had an FI lie about their data center setup - it wasn't redundant, offsite, certified, or even a datacenter. They just had some machines running in their basement. Imagine our surprise when they called up wanting us to (remotely) fix their servers which were under 6 feet of water during a flood. ... again, I digress.

The thing about these transactions is they're not really lost. Nothing really goes missing.

See, all financial software is super keen on accounting. I don't mean that in a strictly 'add up the numbers' way, but in an auditing way. There are logs upon logs. Using your ATM card to withdraw $20 probably generates around, oh, I dunno, 30-40 log messages, depending on how it's routed. There's a log of the transmission and response in each node along the way, using a standardized protocol. It's a severe pain in the butt, but even if a system goes down forever, we can regenerate those logs from the other systems. It's a great deal of effort, especially to preserve the sequence order, and it's tedious, but it can be done.
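Regenerating a dead node's history from its neighbors comes down to merging everyone else's logs, deduplicating, and restoring sequence order. A sketch, with an invented record shape (seq, txnId, msg):

```javascript
// Sketch: rebuild one transaction's history from the logs of the other
// nodes along the route. The record shape here is invented; real logs
// follow a standardized interchange format.
const atmLog = [
  { seq: 1, txnId: "T100", msg: "request $20" },
  { seq: 4, txnId: "T100", msg: "dispense $20" },
];
const switchLog = [
  { seq: 2, txnId: "T100", msg: "route to issuer" },
  { seq: 3, txnId: "T100", msg: "issuer approved" },
  { seq: 2, txnId: "T100", msg: "route to issuer" }, // duplicate entry
];

function regenerate(...logs) {
  const seen = new Set();
  return logs.flat()
    .filter(r => {                  // drop duplicate (seq, txnId) pairs
      const key = r.seq + ":" + r.txnId;
      if (seen.has(key)) return false;
      seen.add(key);
      return true;
    })
    .sort((a, b) => a.seq - b.seq); // restore sequence order
}

const rebuilt = regenerate(atmLog, switchLog);
// rebuilt.map(r => r.seq) gives [1, 2, 3, 4]
```

The tedious part in practice is the scale - doing this across millions of records while preserving global sequence order - not the merge logic itself.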

This could easily account for the initial delay in fixing things.

Not only that, all these sorts of systems are very keen on sequence-order processing, so if it's just the case that the end-of-day processing (clearance) system went down, had a bug, etc., and none of the transactions were finalized, then they'll just stack up. Once they get that system up again, it'll start processing them again. It might take longer due to a large backlog, but it'll eventually complete.

98% of the time, the only people that notice are companies waiting on a payroll to go out, and most of them are happy enough to accept the financial institution's admission of fault. For a day or two at least. Why this one made the paper, I can't tell you. Probably a customer was savvy enough with social media and decided to burn the bank. I guess they should just be happy the public in general doesn't realize how many problems and how much actual work goes into making banking seem reliable and secure.
