Submission + - Man deletes his entire company with one line of bad code (independent.co.uk)

JustAnotherOldGuy writes: Marco Marsala appears to have deleted his entire company with one mistaken piece of code. With one accidental command telling his servers to delete everything, Marsala has seemingly removed all trace of his hosting company and of the websites he looks after for his customers. Marsala wrote on a CentOS help forum, "I run a small hosting provider with more or less 1535 customers and I use Ansible to automate some operations to be run on all servers. Last night I accidentally ran, on all servers, a Bash script with a rm -rf {foo}/{bar} with those variables undefined due to a bug in the code above this line. All servers got deleted and the offsite backups too because the remote storage was mounted just before by the same script (that is a backup maintenance script)." The terse "rm -rf" is so famously destructive that it has become a running joke in some computing circles, but not to this guy. Can this finally serve as a textbook example of why offsite backups need to be kept physically separate from the systems they're archiving?
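For the curious, here is a minimal sketch of the failure mode and the standard Bash safeguards against it; the variable names are hypothetical, not taken from Marsala's actual script:

    #!/usr/bin/env bash
    # The bug: if both variables are unset, the path below collapses to just "/",
    # and depending on the rm build and flags that can take out everything
    # reachable, mounted backup volumes included.
    #   rm -rf "$backup_root/$host_dir"    # dangerous as written

    set -euo pipefail   # -u turns any reference to an unset variable into a fatal error

    # ${var:?message} aborts with an error instead of silently expanding to ""
    rm -rf "${backup_root:?backup_root is not set}/${host_dir:?host_dir is not set}"

Neither guard helps, of course, if the backup volume is still mounted read-write by the very script that misfires, which is the other half of this story.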

Comment It's about legal certainty (Score 2) 64

So many knee-jerk comments here. Get a grip folks.

This is about how we treat the data of a citizen from one large jurisdiction when it moves to or is stored in another large jurisdiction, and about removing legal uncertainty for the companies doing so. For example, this very site stores the account info of EU residents in the US (handle, email and encrypted password). Nothing overly private, but it still falls under the privacy laws of hundreds of countries, each of which could raise an objection and issue a warrant or subpoena. Without overarching legal frameworks governing and taming this legal diversity and uncertainty, it is basically impossible to run a large website. Plain and simple. If you're an engineer, you absolutely want to be insulated and protected from all this possible BS, regardless of how much of a non-issue your own data collection might be to your engineering mind.

Submission + - Jeff Bezos: AWS will break $10 billion this year (windowsitpro.com)

v3rgEz writes: Jeff Bezos is bullish on the cloud, pegging AWS's sales for this year at $10 billion in a recent letter to shareholders. But he pointed to a surprising source of that success: the company's willingness to fail. That said, with AWS now spanning 70 different services, Amazon can afford to let some of them fail as long as a few, like EC2 and S3, keep winning.

Comment Glad to see latency and packet loss (Score 4, Interesting) 99

Even if ISPs are relatively transparent about what they sell you, it is always about maximum download and upload speed, and never about latency and quality of service. In fact, sales and first-tier support folks don't even know these terms, much less what their company's typical values are. In practice, a stable, low-latency broadband connection capped at 15 Mbit/s gives you a better overall experience than a jerky, high-latency connection that on paper tops out at 50 Mbit/s.

I am very glad the FCC is including these numbers by default when judging a provider's disclosure practices.
As an aside, test your connection at https://www.voipreview.org/spe... and see your latency, jitter and packet loss alongside the other metrics.
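If you just want rough numbers from the command line, a plain ping run already exposes latency, jitter and loss. A minimal sketch; the target host and sample count are arbitrary choices of mine, not anything the FCC prescribes:

    # Rough latency/jitter/loss probe against any nearby, reliable host.
    HOST=8.8.8.8
    COUNT=50

    ping -c "$COUNT" "$HOST" | awk '
      /packet loss/    { print }   # e.g. "50 packets transmitted, 50 received, 0% packet loss"
      /rtt|round-trip/ { print }   # min/avg/max/mdev (or stddev); the last field is a rough jitter figure
    '

It is no substitute for a full test like the one linked above, but it makes the difference between a 15 Mbit/s line with a steady 10 ms and a 50 Mbit/s line that spikes to 300 ms immediately visible.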

Comment Re:By what definition were they not compromised? (Score 2) 159

If you've configured your site to allow arbitrary content from unknown third parties, your site is compromised by design. If the mere act of rendering the content your site serves is sufficient to get malware, then, yes, your page is compromised. It doesn't matter that the source of the malware was somebody else's ad service. If that service feeds data directly into your site that you then present to your visitors without any sort of vetting or filtering, then you've allowed that malware to compromise your site.

You do realize that a site only embeds the ad network's code, not the final downloaded content? So yes, a site takes on some responsibility when it decides to run ads from an ad network. Beyond that, however, every user potentially gets different ads: there are real-time bidding platforms and user-profiling code in the middle, completely outside the direct control of the website.

Comment Re:And they wonder why I use an adblocker.... (Score 1) 159

The ads appeared when I visited those websites, therefore it appears the websites are responsible for spreading the malware.

If it were that easy, this wouldn't be a problem. You've got at least three players here: the website running ads and trying to fight off the bad stuff, the ad networks, which only sometimes care enough, and the advertisers trying to game the system into running bad ads. It's a continuous arms race, and as a website owner you end up in reactive mode rather than proactive.

Comment Re:Ad Blocking (Score 1) 159

Here's an idea: How about someone writes an ad blocker that DOWNLOADS the ads, just like normal, but simply does not RENDER them on the screen, or execute any code? Seems like the best of both worlds: users that don't want to see the ads don't see them, and websites still get paid, since there's no way to tell if they actually got shown?

Won't work anymore. Big advertisers want proof that their ad was actually seen, via DoubleVerify or similar, and only pay for ads that were in front of users for a certain amount of time. JavaScript and CSS make this easy to measure and hard to work around.

Comment Re:Not Sure What the HTTPS Hooplah is all about (Score 1) 216

HTTPS only encrypts the contents of what you are retrieving

HTTPS also blinds "proxies" and antivirus software which may have their own opinions of what should and should not travel over plain old port 80. ISPs have done stunts like ad injection, antivirus software routinely blocks websockets, and on and on. HTTPS is a godsend around this bullshit.
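A quick way to see whether something in the path is meddling with port 80 is to fetch the same resource over both schemes and compare. A rough sketch with a placeholder hostname; it assumes the server returns identical bytes either way, so redirects or dynamic content will produce false positives:

    # Fetch the same page over plain HTTP and over TLS, then compare.
    URL_PATH="example.com/index.html"

    curl -s "http://$URL_PATH"  -o plain.html
    curl -s "https://$URL_PATH" -o tls.html

    # Any difference is content an intermediary injected, stripped or rewrote
    # on the unencrypted leg (or just the server varying its response).
    diff plain.html tls.html && echo "no in-path modification detected"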

Data Storage

Submission + - Scientists unveil most dense memory circuit ever made

adamlazz writes: "The most dense computer memory circuit ever fabricated — capable of storing around 2,000 words in a unit the size of a white blood cell — was unveiled by scientists in California. The team of experts at the California Institute of Technology (Caltech) and the University of California, Los Angeles (UCLA) who developed the 160-kilobit memory cell say it has a bit density of 100 gigabits per square centimeter, a new record. The cell is capable of storing a file the size of the United States' Declaration of Independence with room left over."
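A quick sanity check of those numbers with awk; the white-blood-cell comparison assumes a cell roughly 12-15 micrometres across, which is my figure, not the article's:

    # 160 kilobits at 100 gigabits per square centimetre: how big is the array?
    awk 'BEGIN {
      bits     = 160 * 1000        # 160-kilobit array
      density  = 100e9             # 100 Gbit per cm^2
      area_cm2 = bits / density
      area_um2 = area_cm2 * 1e8    # 1 cm^2 = 1e8 square micrometres
      printf "%.0f um^2, about %.1f um on a side\n", area_um2, sqrt(area_um2)
    }'

That works out to about 160 square micrometres, roughly 12.6 micrometres on a side, so the comparison holds up.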
