Comment: Re:Have you actually tried using Rust? (Score 1) 211

by tricorn (#49426789) Attached to: Rust 1.0 Enters Beta

I was programming in Pascal on a Lisa, dual-booting between the Lisa command-line OS (Lisa Workshop) for development and MacOS for testing, occasionally booting into the Office environment. I bought it shortly before it came out as the MacXL, so I had non-square pixels. I wasn't rich, and it wasn't any more expensive than a PC would have been with the same capacity.

The entire thing (Office 7/7, Workshop, MacWorks) plus system partitions for each was 10MB. System RAM was 1MB. I can compress and copy that whole system in a few seconds across a network now.

I'm sorry you were stuck with BASIC, but that wasn't exactly cutting edge in 1985, and there was lots of development in better environments.

A couple years later I started using Lightspeed/THINK C. No NEAR/FAR pointers thankfully. I avoided Intel stupidity for many years.

C really hasn't changed very much. The biggest change has been function prototypes. POSIX and ANSI certainly helped, especially with esoteric details of things like real-time and multi-threading/multi-processing, but that didn't enable much; it just made things more portable. There are still plenty of incompatibilities despite all of that standardization (witness autoconf).

C++ as an object model was there. It was a poor model, and it still is. There are a lot more features now, but a lot of the "extra complexity" that modern hardware enables is spent dealing with the extra complexity C++ adds. I never used it, but maybe the world would be in a better place if THINK Object Pascal had caught on more.

CVS started out as shell scripts working with RCS. There were also plenty of other revision control systems that had been around for a long time (e.g. NOS MODIFY). It's not that the concepts were unknown; it's that the hardware simply didn't have the capacity and speed, and networking it all together was much slower and less available.

Comment: Re:Have you actually tried using Rust? (Score 1) 211

by tricorn (#49411141) Attached to: Rust 1.0 Enters Beta

In the meantime, plenty of people were writing things in Pascal for the Mac. You had a resource compiler with resource files. You could write things in C, on a Unix system. You could build things with "make". Most of the software tools used to compile Linux and most of the current standard software was already in existence. There were source code control systems. There was X Windows. There was TeX. There was PostScript. There were a LOT of things that make up the majority of the software tools still in use today, and most are very little changed since then.

Sure, git is better than CVS. A large part of that difference is due to the constraints of the hardware of the era; you simply couldn't have done git in 1985 with the available hardware.

The basis for Object Oriented Languages was well established, as was the basis for multi-threading (see Path Pascal, C++, Smalltalk).

What's been done since then is to take advantage of the massive increase in speed and storage available. Sure, there have been some incremental improvements to languages and utilities and development environments, but the impact that's had compared to the hardware improvements is fairly small.

The main advances in programming have been in encryption and compression. Everything else would have fast-forwarded within a few years if today's hardware had suddenly been made available back then.

Comment: Re:Have you actually tried using Rust? (Score 1) 211

by tricorn (#49407689) Attached to: Rust 1.0 Enters Beta

Your average desktop computer, compared to a system from 30 years ago, is over 7,000 times faster, has 3,000 to 6,000 times as much RAM and 1.5 million times as much persistent storage, and can communicate over 4,000 times faster; that's not even getting into graphics capabilities.

There's more code in the boot ROM than there was in the boot ROM plus OS plus several applications.

It's not that programming tools are so much better, or that programming techniques have advanced; it's that you can write programs with far fewer constraints.

Comment: Re:Oh this is easy .... (Score 1) 394

by tricorn (#49396265) Attached to: Ask Slashdot: Living Without Social Media In 2015?

Hmph, I've been using e-mail, online chat, forums, and multi-player games for over 40 years, but I don't have a cell phone. I was the one telling relatives how cool things like e-mail and on-line communities were, yet I barely have a Facebook account (created just so my family could share photos).

I was resisting LinkedIn, but finally gave in when an uncle sent me an invitation, and then I added my brother. No one else. My brother is an HR/headhunter type, so I guess I can forgive him for using it.

I have a Twitter account. I've never posted to it, I barely check the one group I follow.

When the GlobalNet-connected AR/wearable tech finally gets here I may jump in, but so far everything I've seen has been so boring and stupid that Slashdot and the occasional Ars Technica post is about as Social Media as I get.

Comment: Re:The new antipattern (Score 1) 486

by tricorn (#49340805) Attached to: No, It's Not Always Quicker To Do Things In Memory

I wouldn't say the results are invalid, but the relevance is restricted to people who don't understand algorithms or statements such as "disk is slower than memory".

I once had to fix a program that read all the file names in a directory into a linked list, sorted it (using retrieve, remove, and insert operations that took an index and worked by starting at the beginning of the list and counting elements until reaching the right one), then used the resulting sorted list to process the first 10 files.

Rather than fix the abominably slow sort, I used the fact that all the file names were decimal numbers, and all the numbers were sequential, to scan the directory for the smallest number, then just increment that to find the next one. Needless to say, it was both much faster and used very little memory.

Algorithms matter, and the shame of ever faster processors and "more productive" languages is that too many programmers don't understand them.

Comment: Re:HOWTO (Score 1) 1081

by tricorn (#49272581) Attached to: How To Execute People In the 21st Century

Certainly CO2 shouldn't be used without first rendering the person unconscious. I read that some studies found indications of distress from straight N2 suffocation, which is why using N2O first might be more humane.

Since part of the "humane" aspect of it is how it appears to observers, that should be taken into account as well. I don't know if CO2 would cause a faster death than N2 when used in conjunction with N2O, or if there's a difference in visible signs while it's happening.

Comment: Re:Issue will be resolved... (Score 1) 347

by tricorn (#49248417) Attached to: FCC Posts Its 400-Page Net Neutrality Order

The section I quoted defines "Broadband Internet access service". What you're talking about is irrelevant for the purposes of this rule.

What the 25 Mbps / 3 Mbps defines is not "broadband" but "advanced telecommunications capability". See the actual rule (actually "Broadband Progress Report and Notice of Inquiry"):

http://www.fcc.gov/document/fc...

Comment: Re:But the MEANING is hundreds of pages (Score 1) 347

by tricorn (#49248085) Attached to: FCC Posts Its 400-Page Net Neutrality Order

The actual regulation is 8.5 pages, about 22K characters. The rest is commentary. You'll find the commentary in the Federal Register. You won't find it in the actual regulations (Code of Federal Regulations, CFR).

There's the index, 576 paragraphs of commentary of various sorts, 12 paragraphs of procedural stuff, APPENDIX A which contains the actual rule, and APPENDIX B which contains a required analysis of the rules. APPENDIX B alone is 110 pages long.

The 8.5 pages is the actual program. The rest is the README and HOWTO combined with the man/info page, the makefile, the comments that would be in the code and a code review. The code itself is presented as a diff onto the existing codebase. Since it's a scripting language, there is no binary.

Comment: Re:The Rules (Score 1) 347

by tricorn (#49247991) Attached to: FCC Posts Its 400-Page Net Neutrality Order

It isn't 400 pages of regulation, it's about 8.5 pages of (new/modified) regulation, including all the definitions, procedures for filing complaints, etc.

The other 391 pages are commentary, explaining the rationale, the legal authority, discussing the public comments and rebuttals, talking about the implementation and implications, and so on.

Saying this is 400 pages of regulation is totally false. The 400 pages are in fact going to be published, and can be used by courts when deciding cases influenced by the new regulations, but they are not themselves regulations.

Comment: Re:The actual text of the new rules is only 305 wo (Score 1) 347

by tricorn (#49247355) Attached to: FCC Posts Its 400-Page Net Neutrality Order

Most of the 400 pages are commentary on the rules - justification, clarification, intent, responding to comments, legal authority, possible legal challenges, implications, etc.

I don't know about the "305 words" bit. The actual rule (the part that says "amend this part to read ... renumber section x to y ... insert a new section x that reads ...") is 8.5 pages long (pages 283 through 290, the last of which is about half a page). If I copy it directly from the PDF version and run it through fmt (default width 65), it comes to 347 lines and 22,542 characters.

However, the heart of it is contained in 3 short sections, about 1200 characters depending on encoding and whether you include the editing directives:

8.5 No blocking.
A person engaged in the provision of broadband Internet access service, insofar as such person is so engaged, shall not block lawful content, applications, services, or non-harmful devices, subject to reasonable network management.

8.7 No throttling.
A person engaged in the provision of broadband Internet access service, insofar as such person is so engaged, shall not impair or degrade lawful Internet traffic on the basis of Internet content, application, or service, or use of a non-harmful device, subject to reasonable network management.

8.9 No paid prioritization.
(a) A person engaged in the provision of broadband Internet access service, insofar as such person is so engaged, shall not engage in paid prioritization.
(b) “Paid prioritization” refers to the management of a broadband provider’s network to directly or indirectly favor some traffic over other traffic, including through use of techniques such as traffic shaping, prioritization, resource reservation, or other forms of preferential traffic management, either (a) in exchange for consideration (monetary or otherwise) from a third party, or (b) to benefit an affiliated entity.

Comment: Re:Issue will be resolved... (Score 1) 347

by tricorn (#49246733) Attached to: FCC Posts Its 400-Page Net Neutrality Order

The definition in the rule makes no such reference to speed:

8.2 Definitions.

(a) Broadband Internet access service. A mass-market retail service by wire or radio that provides the capability to transmit data to and receive data from all or substantially all Internet endpoints, including any capabilities that are incidental to and enable the operation of the communications service, but excluding dial-up Internet access service. This term also encompasses any service that the Commission finds to be providing a functional equivalent of the service described in the previous sentence, or that is used to evade the protections set forth in this Part.

Comment: Re:"F" rating? (Score 1) 315

by tricorn (#49226341) Attached to: Clinton's Private Email System Gets a Security "F" Rating

I'd be curious to know what problems would have been found AT THE TIME (not now, a few years later) with the e-mail server itself (not with web front-ends, except as actual vectors to compromise the system rather than just an individual connection; is there any indication Clinton ever used a web front-end?), and to compare that with the state.gov e-mail server over the same period.

Comparing this to someone using a gmail account is irrelevant. The biggest threat to security is probably going to be the people at a commercial business.

The distinction between "personal" or "private" or "government" e-mail systems is sort of dumb when she's using a specific system AS a "government" e-mail system. Perhaps she even had it authorized through whatever route that might take, maybe having State IT people take a look at it.

What were the data retention policies for the state.gov e-mail server at the time? Did they retain every single piece of mail, so you could ask now to see how many Viagra spams she received while in office? If she deleted a message, was it archived, or is it gone now? Would outgoing messages be retained? What if an e-mail client was configured to send outgoing e-mail directly to the recipient's server? (I realize that's becoming harder to do now, as more and more servers are set up to require relaying through an official authenticated server via DNS records, but what was the situation then?)

The people to put on the stand here are the IT people responsible for the state.gov e-mail servers and the IT people that Clinton used to set up her server.

Comment: Re:The patents (Score 1) 186

by tricorn (#49132765) Attached to: Jury Tells Apple To Pay $532.9 Million In Patent Suit

Many years back is 9 (when that particular patent was filed) or 16 (based on the priority date, though I'm unclear what that priority date rests on). Buying things over the Internet wasn't some stroke of genius, and couching it in standard patent-speak doesn't make it any more innovative.

Makes me want to file a patent on "A Method and System of Using A Computing Device": put in all sorts of vague claims with "data means" and "storage means" and "communication means" and "user interface means", include something really specific like "a processor using graphene", then wait until someone creates something nifty once graphene has become common in chip fabrication, and sue everyone for violating my innovative patent, since I was the only person in 2015 who could have foreseen graphene being used in computers.

Of course, as every new potential technology is reported on, I file a continuation on my patent and add the new technology in. Perhaps a cool new public key system is devised; I can toss using that in as part of the data communications means of using my Computing Device. This will cost me some money, of course, so I'll deserve a big payout at the end for having taken so much risk in developing my innovative technology.
