Ubuntu

Journal: The End of Ubuntu 4

Journal by ObsessiveMathsFreak

I have just upgraded my machine from Ubuntu 11.04 to 11.10 and everything is broken.

Everything.

It began with Unity. The horror. If there's a way of finding the main menu, I wasn't able to discover it. Menu bars have entirely disappeared from applications, replaced with the Mac "menu on top" paradigm, a.k.a. one of the main reasons I haven't used a Mac since 1994. You can't even log out of the bloody interface, let alone tweak it. Even the fonts are terrible.

It took me about 15 minutes to figure out how to get rid of Unity and replace it with Gnome 3, and I can't say there's really much of a difference in terms of usability. All of my default Gnome 2 desktop settings have been blown out of the water. Completely. My panels and taskbars and launchers are either deleted or are all over the place. Even if I wanted to change them back, everything is basically uncustomisable as far as I can tell. You can't move objects around in the taskbars. Yes, that's right. You can't move objects around in the taskbars.

Is this what a desktop is supposed to feel like?

I feel like my computer has been turned into a walled garden, like that on an iDink, which I am permitted to caress and fawn over, but am forbidden from making my own in any way. Will I have to download some kind of "App" from the "Ubuntu App Store" to gain back basic functionality? Do I have to dive into arcane settings and PPAs just to get back the system which I had and liked only a few hours ago? Do I have to give up and choose Gnome 3, or move to XFCE, or move everything to the shell, or basically waste the next two weeks getting back what I had?

If that's the price of Ubuntu (and it is) then I am leaving.

Ubuntu and Gnome died the moment they allowed the UI designers to take over. The art students, the inveiglers, the smooth talkers, the wild-eyed dreamers, the "visionaries", the people who didn't care what they were doing as long as it made them feel talented and superior. These are the people who have designed unusable, confusing systems and interfaces that delete years of carefully customised menus and discourage serious use of computers.

And as for the "boring" people, the programmers, the testers, the package maintainers, the people who listen to the community, those who put real thought and concern for users into their themes and interfaces, the people who don't go to conferences, who communicate with users directly via forum and newsgroup, who sit at their desks working to make distros better, often for no reward at all; what of them? Are they in charge in this brave new world? No. They are cast down and out, by a brigade of bullshitters too busy bopping to their iPods and blogging to do useful work.

If you let the wrong people into an organisation or a community, they can destroy it. Ubuntu and Gnome show that this can happen to distros and open source projects just as easily and quickly as it has happened in so many industries, countries, and economies throughout the world. The destroyers will fail upwards, the blazing heat of their incompetence scorching all new pastures dry. The rest of us will be left behind to pick up the pieces and start again. In 5 years' time, Ubuntu may be back on the path to being usable again, but I can't wait that long.

I'm thinking of starting off with Mint.

PC Games (Games)

Journal: NAT is the Fucking Devil 3

Journal by ObsessiveMathsFreak

I need a place to have a full on rant about this. My Slashdot Journal is as good as any.

Is it so much to ask that, in 2009, the video game industry as a whole would have figured out some way around the problem of home routers and getting devices behind them to communicate with devices behind other home routers? Yes, I know, it's not a trivial issue. WAN/LAN IPs, DNS, end-to-end connectivity, ports, TCP, UDP, protocols and connections, planes, trains and automobiles. Yes, it's not an easy thing to accomplish.

But you've had ten fucking years!!! Or as near as makes no difference.

How many times have I had to reset, reconfigure and reinstall routers? How many times have I had to click through those infuriating HTML configuration pages, one form at a time, in an effort to add, port by port, protocol by protocol, game by game, each and every little irritating requirement just to get the fucking game I bought to play online like Mechwarrior 2 did flawlessly back in 1997!?!?!?!?

I've cracked. I admit it. The final straw was this latest gem from Team Fortress 2, a game I don't even play (I basically manage the router for 5 people). I had to set up port forwarding and QoS (whatever the fuck that is) just to get the gods damned game to play properly.

  • UDP 27000 to 27015 inclusive (Game client traffic)
  • UDP 27015 to 27030 inclusive (Typically Matchmaking and HLTV)
  • TCP 27020 to 27050 inclusive (Steam downloads)
  • TCP 27015 (SRCDS Rcon port)

61 ports. Sixty-one ports. And that's just for the forwarding, never mind the QoS malarkey. Yeah, fuck you too, Valve. And want to know the best part? It's a server-based game!! Why in fuck's name do I need to do any of this?! Oh, give me lag any day of the week.
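For the record, writing Valve's list out as data and deduplicating the overlapping ranges gives a number even uglier than sixty-one. A throwaway sketch (my own tally of the inclusive ranges, not anything official from Valve):

```python
# Valve's published TF2 list, written out as (protocol, first, last),
# with ranges inclusive, exactly as quoted above.
RULES = [
    ("UDP", 27000, 27015),  # game client traffic
    ("UDP", 27015, 27030),  # matchmaking and HLTV
    ("TCP", 27020, 27050),  # Steam downloads
    ("TCP", 27015, 27015),  # SRCDS rcon
]

# Deduplicate, since the UDP ranges overlap at 27015.
pairs = {(proto, p) for proto, lo, hi in RULES for p in range(lo, hi + 1)}
print(len(pairs))  # 63 distinct protocol/port pairs to forward
```

However you count the fenceposts, that is an absurd amount of router surgery for one game.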

But to be fair, it's not just Valve. Far, far from it. It's not even just PC developers, each mandating their own custom-crafted set of ports and protocols to enable online play behind a router. No, consoles too have gotten in on the game. Take these gems required for the PlayStation Network.

  • TCP Ports: 80, 443, 5223
  • UDP Ports: 3478, 3479, 3658

TCP port 80. Otherwise known as the HTTP port. Great. And what's this? TCP 443. You mean the HTTPS port. Great choice guys. Yeah, thanks for that. I'll forward those right away.

Come on Microsoft. You've been computing specialists for over 30 years. What's needed to run Xbox live behind a router?

  • TCP Ports: 80, 53, 3074
  • UDP Ports: 88, 53, 3074

Great. Classy. I love that overlap with PSN on the port 80 thing. Can't have them hogging HTTP entirely, especially since you control the DNS ports now. Awesome. Complete clusterfuck. Why doesn't one of you mandate port 22 altogether, so my entire network will be totally inaccessible from outside for anyone not using a games console?
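The overlap gripe is easy to make concrete — a throwaway sketch comparing the two lists quoted above:

```python
# Published port lists for PSN and Xbox Live, as quoted above.
PSN = {"TCP": {80, 443, 5223}, "UDP": {3478, 3479, 3658}}
XBOX = {"TCP": {80, 53, 3074}, "UDP": {88, 53, 3074}}

# Which (protocol, port) claims collide between the two consoles?
overlap = {(proto, p) for proto in ("TCP", "UDP")
           for p in PSN[proto] & XBOX[proto]}
print(overlap)  # {('TCP', 80)} -- both squatting on plain HTTP
```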

Oh well, I guess at least with consoles you only have to forward one set of ports for all games right... right?

In order to play GTA IV via the PS3 network you will need to open the following ports on your router:

  • UDP ports: 6672, 28900
  • TCP ports: 8000-8001, 27900, 28900

AAAAGGGGHHHHHHH!!!!! LEAVE ME ALONE!!!! I'm not a network administrator! I don't have any certs from Cisco!! No! I can't use IPTABLES!! How would I get Linux onto the router in the first place?! What do you want?! Blood?!?! I just want to play games!!!

And don't talk to me about UPnP! Just don't. As far as I can tell, the Useless, Painful 'n Pointless protocol's only meaningful function is to establish connections between devices which confirm UPnP is available, but then don't work anyway. I've never once managed to get a single game to work using it. It has never worked and it will never work. Most companies don't even mention it. They skip straight to port forwarding, gleefully rolling off their own in house list of obnoxious ports.
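To be scrupulously fair about what UPnP is supposed to do: the discovery step really is simple, which makes the protocol's failure to deliver all the more galling. Here's a sketch (Python, standard library only; the multicast address, port, and search target come from the UPnP IGD spec) of the SSDP M-SEARCH probe a console or game fires at the router. Actually getting a useful answer back is, as noted, another matter:

```python
import socket

# SSDP discovery: multicast an M-SEARCH to 239.255.255.250:1900 asking
# for an Internet Gateway Device -- step one of UPnP port mapping.
MSEARCH = (
    "M-SEARCH * HTTP/1.1\r\n"
    "HOST: 239.255.255.250:1900\r\n"
    'MAN: "ssdp:discover"\r\n'
    "MX: 2\r\n"
    "ST: urn:schemas-upnp-org:device:InternetGatewayDevice:1\r\n"
    "\r\n"
)

def probe(timeout=2.0):
    """Send the probe; return the router's raw reply, or None if it
    (true to form) never answers."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        sock.sendto(MSEARCH.encode("ascii"), ("239.255.255.250", 1900))
        data, _addr = sock.recvfrom(65507)
        return data.decode("ascii", errors="replace")
    except socket.timeout:
        return None
    finally:
        sock.close()
```

If the router replies at all, the response's LOCATION header points at a device description; a client then walks that XML to find the WANIPConnection service and its AddPortMapping action — the part where, in my experience, it all falls over.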

You know what this is like? It's like every video game publisher and company is trying to stake its claim to ranges of ports and protocols. By insisting on their own original, capricious and dogmatic set of connection requirements, it's as though Sony, Microsoft, EA, Valve and all the rest are trying to enforce by fiat what would normally require an RFC to be made official. Namely, the assignment of a port. Companies are literally carving out their own space in what is supposed to be a no-ownership zone. And trust those armchair experts at Wikipedia to stick these turf claims in a Registered Ports List. "Oh, but the unregistered ones are in blue, OMF". FUCK YOU! There are only 65,535 ports, which is too few to risk being lost to this bullshit.

So that's why I think this NAT business hasn't been resolved. Moving the video game industry to a solid standard whereby games automatically established connections (and hang the technical difficulties) would mean that companies would have to give up their little slice of that relatively small pie of 65,535 port numbers. These are corporations we're talking about, and giving up something that big, that central to the functioning of the entire internet, even if it's just a squatter's claim, is not a step any of them are willing to take.

So, in my opinion, we're going to be stuck with this NAT port forwarding bullshit for quite some time yet. I fully expect more and more games to lay claim to ever larger pastures of unsettled port space, and continue to do so until the whole spectrum is so fully overloaded that people's routers or patience simply snap under the strain. Mine certainly has.

Mercifully, my ISP seems to allow PPPoE over a router, which thankfully the PS3 and Xbox360 both support. True, it exposes them to the elements in a way having them behind a router would not, but I really don't care any more. NAT is the fucking devil, and I've had enough of having my crank yanked as a pawn in this port squatting farce, so it's a WAN IP for me.

At least until all the IPv4 addresses run out and I have to set up all this shit again with IPv6 addresses.

User Journal

Journal: Prepaid Cellphone, In Japan

Journal by corsec67

I got a cell phone in Japan. A prepaid model, without any kind of contract, since I do not know how long I will be here. (I will talk about prices in yen, since that is what I paid, but a yen is about $0.01, 1 cent. NOT 0.01 CENTS, Verizon)

Even though I was going prepaid, and would thus be buying the cell phone outright (¥5,000), I had to provide an ID, proof of my registration with city hall, have an alternate phone number, and jump through several hoops there.

The phone (Softbank 730SC) is definitely only worth ¥5,000, with a 1.5 Mpix camera and no internet access, games, or anything like that. It does have several tools, like a unit converter that is essential for translating American to ... any other language, including International English. It doesn't have an English dictionary even though it can be set to display menus in English. The Japanese dictionary is very nice: once the first word is entered, the rest of the sentence can sometimes be auto-completed.

The way that the Softbank prepaid phone works is that I pay for a ¥3000 card for 2 months of service. Minutes are ¥90 each, so that is about 30 minutes for 2 months, which would be useless if Softbank didn't provide SMS/Email.

Incoming calls are free, so if someone else has a contract phone they can cheaply call me, and I don't pay anything to get a phone call.

SMS/Email is the main point of a Softbank prepaid phone. ¥300 a month out of that ¥3000 card for unlimited SMS, and emails up to about 30KiB. Sending, receiving, it is all unlimited. Pictures from the phone, to the phone, big stuff, all unlimited for ¥300 a month out of what I have to pay to keep the line live.

So, in Japan, on Softbank, for ¥1500/month, I have a phone with free SMS/Email, incoming calls, and horribly expensive outgoing calls. All without a contract.
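For the curious, the arithmetic (yen figures as above, and assuming the ¥300/month SMS fee comes out of the same ¥3,000 card):

```python
# Softbank prepaid arithmetic, in yen.
card = 3000        # prepaid card, good for 2 months of service
months = 2
sms_flat = 300     # unlimited SMS/email, per month
per_minute = 90    # outgoing calls

monthly_cost = card // months            # 1500 yen/month to keep the line live
talk_budget = card - sms_flat * months   # what's left over for outgoing calls
minutes = talk_budget // per_minute      # roughly 26 outgoing minutes per card
```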

User Journal

Journal: Nerdy Mother's Day

Journal by corsec67

For Mother's Day, instead of mailing a card from Japan to my parents in Colorado, I just SSHed into their computer, printed a picture of me on their color laser printer, and then called my mom and told her to look in the printer.

That was basically my first thought on how to get a letter to appear in their house, which is in retrospect incredibly nerdy. I love how Linux makes it very easy to use remote computers like you are sitting at them.

User Journal

Journal: Toward a New Unix Shell 2

Journal by Tetsujin

One thing that's interested me a lot lately is the idea of improving upon, or reinventing the Unix command shell.

Now, there's a certain class of Unix users who have stuck with the command shell while GUIs have come and gone, been overhauled release after release until new versions barely resemble old ones. One of the problems I face in this kind of plan is that many of these users are fairly attached to the tradition. They use the current command shells because it's the environment they like, and many of them don't want to use a "reinvented" shell. When in my design I start thinking about giving the shell its own type system to be used over pipelines, or devising a set of rules that tools would have to follow to "play nice" in this environment - I know that a lot of my target audience just flat out isn't going to like it. This concerns me. But still, the idea of this environment appeals to me. Somehow I want to make it work, and make it stick.

Now, the first question one has to ask about something like this is, why? Why should the shell change? There are a few reasons I have in mind:

  • If the shell could know things like the type of a piece of data, or the kinds of command-line options a tool supports, it could offer context-relevant help.
  • In present shells, when connecting the output of one tool to the input of another, the user must insert parser/serializer steps. I feel it's appropriate for the shell to be able to offer a greater degree of help than that.
  • I think there's real value in giving the shell a way to deal with "objects" and higher-order programming in general. For instance, an object could be an XML parser or an open network connection or a window into a running application: for the lifetime of that object the shell should be able to issue commands to that object - and when the shell's last reference to that object is destroyed, some action (like closing a connection or killing a process) may be appropriate. There are some command-line tools that implement this sort of behavior themselves, but I think it would be very nice if it were a real feature of the shell.
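To make the object-lifetime idea in that last bullet concrete, here's a toy sketch in Python (purely illustrative — a real shell would do this natively, and the `Handle` class and its names are invented for the example) of a handle that fires a cleanup action when the shell drops its last reference:

```python
class Handle:
    """A shell-managed object: some resource plus a cleanup action that
    must run when the shell drops its last reference to it."""
    def __init__(self, resource, cleanup):
        self.resource = resource
        self._cleanup = cleanup
        self._refs = 1

    def retain(self):
        """Another shell variable now refers to this object."""
        self._refs += 1
        return self

    def release(self):
        """A reference went away; clean up when the count hits zero."""
        self._refs -= 1
        if self._refs == 0:
            self._cleanup(self.resource)  # e.g. close a connection, kill a process

closed = []
h = Handle("conn-42", closed.append)  # stand-in for an open network connection
h.retain()       # a second variable refers to it
h.release()      # first variable goes out of scope: still alive
h.release()      # last reference gone: cleanup fires
print(closed)    # ['conn-42']
```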

Also, I believe the current model of the shell has fallen behind how people actually use their computers. For instance:

  • Modern GUIs offer a lot of useful functionality - but the extent to which this functionality is integrated into the command shell is rather limited for various reasons. For instance, why isn't the volume manager or the wi-fi manager that I used inside the GUI also available outside the GUI? The basic answer is that command-line tools aren't well suited to that kind of usage profile in which they are started as a service and then, while running, receive and respond to outside commands. The framework for such a thing simply isn't in place.
  • Scripting languages like Perl and Python offer large and useful libraries that perform all kinds of different features. Why can't shell scripts access these? The basic answer at present is that the programming language provided by the shell lacks the constructs necessary to usefully interface with these utilities. The lack of "object" support (and, specifically, lack of a good mechanism to start something, keep it running, and interact with it, and shut it down when finished), the lack of any sort of namespace support, and the fact that any data going into or coming out of such a library has to be arranged in some ad-hoc format for which the shell provides no specific support - all of this severely hampers the ability to expose these libraries and the practical benefits of doing so.
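The "start it as a service, then send it commands" pattern from the first bullet can be faked today with ordinary pipes, which also shows how much bookkeeping it takes without shell support. A sketch (Python's subprocess standing in for the shell; the "service" here is just an invented upper-casing echo loop):

```python
import subprocess
import sys

# A stand-in "service" tool: reads lines on stdin, answers in upper case.
TOOL = (
    "import sys\n"
    "for line in sys.stdin:\n"
    "    print(line.strip().upper(), flush=True)\n"
)

# Start the tool once and keep it running in the background...
proc = subprocess.Popen(
    [sys.executable, "-c", TOOL],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
)

def ask(line):
    """Issue one command to the running tool and collect its reply."""
    proc.stdin.write(line + "\n")
    proc.stdin.flush()
    return proc.stdout.readline().strip()

r1 = ask("hello")   # 'HELLO'
r2 = ask("again")   # 'AGAIN'

# ...and shut it down when the last reference goes away.
proc.stdin.close()
proc.wait()
```

Everything here — naming the channel, framing the replies, tearing the process down — is exactly the plumbing a shell with real "object" support could own on the user's behalf.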

Microsoft has already come up with their own solution: "Powershell" - a command shell somewhat similar to cmd or a Unix shell, but with support for "commandlets" - commands on the search path which are actually .NET classes which are dynamically loaded and run as part of the shell's process. These "commandlets" exchange .NET objects as their input and output. The shell can then store these .NET objects in environment variables for later recall (and keep track of when the object is no longer referenced, and delete it) - this goes a long way toward usefully exposing Windows API functionality within this shell.

My goal is a bit different. Linux has no standard representation for "objects" (and I'm not in a hurry to embrace Mono, let alone encourage others to do the same) so Microsoft's approach isn't suitable for my goals. Furthermore, without the ability to restrict the behavior of a piece of code within a single process, it becomes more important for the stability of the shell to continue to have tools be separate processes. Therefore, whether these outside processes communicate via shared memory or pipes, either way they need to respect a few common conventions about how data is formatted, and (in the case of "objects" - data in which it's important to know when it's time to destroy it) how to manage object lifetime.

One of the typical complaints is that a plan like this requires all shell tools to agree upon and use the same set of rules for how they format their data. This would be a real problem: people would be slow to move to this format, which in turn would make the shell less useful (since it would lack the tools to run in its "enhanced" environment). No one would want to write a tool that runs only in a new, unproven shell, and no one would want to either shoe-horn their problem into an uncomfortable data format, or waste CPU time by translating their optimal data format into the one the shell wants.

So clearly putting everything into a single data format wouldn't work. And in general, the fewer things I "mandate", the better. So my idea is this: tools can go on communicating with whatever encoding makes sense for them, but there should be some shared means of identifying what that encoding is. In order to do this without requiring, you know, every program in the world to be changed for compliance, there has to be a way to provide this information out-of-band - and, presumably, statically. That way, even if the binary itself has no provisions for working nicely in my shell, the end-user can work around that, without changing source code or recompiling, and without needing the authority to install such a workaround system-wide.
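A minimal sketch of that out-of-band idea, with the tool names, labels, and file layout entirely made up for illustration: a per-user table declaring each tool's stream encoding, which the shell consults with a safe fallback so uncooperative binaries keep working unchanged:

```python
# A per-user registry declaring, out of band, what each tool writes on
# stdout. Tool names and encoding labels are invented for the example;
# a real shell might read this from something like ~/.myshell/encodings
# rather than hard-coding it.
REGISTRY = {
    "ls":      "text/plain; lines",
    "find":    "text/plain; nul-separated-paths",
    "my-tool": "application/json; one-object-per-line",
}

def stream_encoding(tool):
    """Unknown tools degrade gracefully to plain bytes, so binaries with
    no provisions for the new shell still work."""
    return REGISTRY.get(tool, "application/octet-stream")

print(stream_encoding("find"))    # text/plain; nul-separated-paths
print(stream_encoding("legacy"))  # application/octet-stream
```

Because the table lives in the user's home directory, no recompilation and no system-wide install rights are needed to teach the shell about a tool.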

Of course, there's still all kinds of problems with this plan. Among other things, this new shell somehow has to provide a nice new environment, while simultaneously working nicely with things not made for it. For instance, if I provide my own version of "find" - do I have to give it a different name so it won't conflict with the GNU find everybody's used to running? There seems to be a never-ending supply of small problems that need to be solved before this design can really go anywhere. But if all goes well, then maybe someday I'll get it written and you'll give it a try. :)

User Journal

Journal: VoIP for free calling across the world 1

Journal by corsec67

This is about setting up VoIP for calls over the internet, not using Asterisk with a local PSTN connection, although I think all of the phones I talk about here, soft and hard, would work with Asterisk.

My sister sent me an email asking if we could use Skype to talk for free, since she was using up too many of her minutes on her cell phone talking to our mom. I didn't want to recommend Skype since there are a bunch of issues with their service, it being proprietary, making computers into servers without authorization, and not making AMD64 versions of their software.

The solution that I came up with is to use SIP with Ekiga.net. The SIP clients we use are Twinkle for Linux and X-Lite for other OSs. Sign up for an account on Ekiga.net, and then enter those details into your SIP client. You may need to set the STUN server to stun.ekiga.net if you are behind a NAT or router. In X-Lite, for other people to be able to see if you are online or not, you have to add them or their domain (ekiga.net in this case) to the privacy rules list and "Allow status updates", since the default in X-Lite is to deny status queries.
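Incidentally, the STUN step is a tiny binary protocol. For the curious, here's a sketch of the Binding Request a client sends to a server like stun.ekiga.net (header layout per RFC 5389; actually sending the packet and parsing the reply are left out):

```python
import os
import struct

def stun_binding_request():
    """Build a STUN Binding Request (RFC 5389): 2-byte message type,
    2-byte length, 4-byte magic cookie, 12-byte random transaction ID."""
    msg_type = 0x0001     # Binding Request
    length = 0            # no attributes in the minimal request
    magic = 0x2112A442    # fixed magic cookie
    return struct.pack("!HHI", msg_type, length, magic) + os.urandom(12)

pkt = stun_binding_request()
print(len(pkt))  # 20-byte header, ready to send to stun.ekiga.net:3478
```

The server's reply tells the client its public address and port as seen from outside the NAT, which is what lets SIP advertise a reachable address.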

On a softphone, you really should use a headset instead of a built-in mic and the computer speakers; I use the PS2 Logitech headset, which has good support under Linux.

Codecs are where the black magic of VoIP comes in. Since everyone has a different internet connection with different limits, the codec that works for one person may not work at all for another. Add in the limited support some SIP clients have for codecs, and it can be a real mess. My sister's Time Warner basic cable upload is so limited we can't use the default and well-supported G.711 (which is uncompressed and uses 64 kbit/s plus IP overhead), so we have to use GSM, which uses 13 kbit/s plus IP overhead. GSM is an older codec, so something like G.726 or Speex might give better quality. Speex in its wideband and ultra-wideband versions has twice and four times as many samples per second as a standard telephone (G.711, 8000 samples/second), so it would sound better than a regular telephone if network bandwidth isn't an issue.
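The "plus IP overhead" caveat is worth making concrete, because it's what kills thin uplinks. A rough sketch, assuming the common 20 ms RTP packetization and 40 bytes of IP+UDP+RTP headers per packet, and ignoring link-layer overhead:

```python
# On-the-wire VoIP bandwidth, assuming 20 ms packets and 40 bytes of
# IP + UDP + RTP headers per packet (link-layer overhead ignored).
HEADER_BYTES = 40
PACKET_MS = 20
packets_per_sec = 1000 // PACKET_MS                  # 50 packets/s
overhead_bps = HEADER_BYTES * 8 * packets_per_sec    # 16,000 bit/s

def on_wire(codec_bps):
    """Codec bit rate plus the fixed header overhead, one direction."""
    return codec_bps + overhead_bps

print(on_wire(64000))  # G.711: 80000 bit/s each way
print(on_wire(13000))  # GSM:   29000 bit/s each way
```

This is why a 128 kbps uplink is marginal for G.711 but comfortable for GSM.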

I configured my softphone (Twinkle) to use the following codecs, in order of preference from most to least: GSM, G.726 32 kbps, G.711 A-law.

If you want a hardware phone, you can use something like a Snom 300, which supports dialing to a regular phone number or to a SIP address, which looks like an email address (username@example.com). There are adapters that take an ethernet connection and convert it to a regular phone jack, like the Linksys PAP2T, but then you don't have any kind of display on the phone itself. Codec support is definitely an issue on hardphones and adapters, but these might be better for giving to someone else so they can call you.

Ekiga.net provides some useful phone numbers for checking your setup, but they all use G.711 A-law, so if your network upload is less than 128 kbps, they will not work for you. 500@ekiga.net is an echo test, to check that your transmit and receive work and to check latency. 520@ekiga.net hangs up quickly, and then you get a call from 500@ekiga.net, to see if you can receive phone calls. Ekiga.net has more info about the conference rooms, of which there are 10,000 that can be PIN-protected.

Calling a PSTN phone number is provided by DiamondCard.us, where you pay a per-minute fee for calls and a fixed rate to get a phone number that other people can call.

User Journal

Journal: J.K. Rowling wins $6750, and a pound of flesh 17

Journal by NewYorkCountryLawyer
J.K. Rowling didn't make enough money on Harry Potter, so she had to make sure that the 'Harry Potter Lexicon' was shut down. After a trial in Manhattan in Warner Bros. v. RDR Books, she won, getting the judge to agree with her (and her friends at Warner Bros. Entertainment) that the 'Lexicon' did not qualify for fair use protection. In a 68-page decision (PDF) the judge concluded that the Lexicon did a little too much 'verbatim copying', competed with Ms. Rowling's planned encyclopedia, and might compete with her exploitation of songs and poems from the Harry Potter books, although she never made any such claim in presenting her evidence. The judge awarded her $6750, and granted her an injunction that would prevent the 'Lexicon' from seeing the light of day.
User Journal

Journal: U. Mich. student calls for prosecution of Safenet

Journal by NewYorkCountryLawyer
An anonymous University of Michigan student targeted by the RIAA as a 'John Doe', is asking for the RIAA's investigator, Safenet (formerly MediaSentry), to be prosecuted criminally for a pattern of felonies in Michigan. Known to Michigan's Department of Labor and Economic Growth -- the agency regulating private investigators in that state -- only as 'Case Number 162983070', the student has pointed out that the law has been clear in Michigan for years that computer forensics activities of the type practiced by Safenet require an investigator's license. This follows the submissions by other 'John Does' establishing that Safenet's changing and inconsistent excuses fail to justify its conduct, and that Michigan's legislature and governor have backed the agency's position that an investigator's license was required.
User Journal

Journal: ABA Judges Get an Earful about RIAA Litigations 5

Journal by NewYorkCountryLawyer
Well, I was afforded the opportunity to write for a slightly different audience -- the judges who belong to the Judicial Division of the American Bar Association. I was invited by The Judges' Journal, their quarterly publication, to do a piece on the RIAA litigations for the ABA's Summer 2008 'Equal Access to Justice' issue. What I came up with was 'Large Recording Companies vs. The Defenseless: Some Common Sense Solutions to the Challenges of the RIAA Litigations', in which I describe the unfairness of these cases and make 15 suggestions as to how the courts could make it a more level playing field. I'm hoping the judges mod my article '+5 Insightful', but I'd settle for '+3 Informative'. For the actual article go here (PDF). (If anyone out there can send me a decent HTML version of it, I'll run that one up the flagpole as well.)
PlayStation (Games)

Journal: The Trouble with PC Ports 1

Journal by ObsessiveMathsFreak

I wrote a journal entry two years back. I had recently bought Oblivion and had spent 10 hours trying to get it to simply run, and the post basically outlined how PC games require far too much effort from the user to simply run, let alone become playable. This post can be regarded as a followup.

I ended up liking Oblivion, so much so that I bought the Game of the Year edition for the PS3. The graphics were a lot better, and there were no control issues or installation worries. Then I ran into the, effectively show-stopping, PS3 Vampire Cure Bug, after probably 50+ hours of play. Bethesda apparently have no intention of ever patching or fixing this bug. I can safely say that if I had known this bug was present, I would never have bought the game.

As I see it, PC game makers like Bethesda simply are not going to make it in the current generation of games. Show-stopping bugs with no official effort to patch them might be acceptable in PC gaming, but console gaming has historically had a much higher standard when it comes to major bugs and glitches. Even in the days of the PS2, if a game crashed, it was quite a shock, and a major black mark on your opinion of the game. Show-stopping bugs with no workaround are, to my memory, completely unheard of.

Say what you will, but up until effectively two years ago, the first version of your console game was going to be the last. Companies had no recourse whatsoever, apart from a total recall, if they needed to change so much as one bit in the game binary. Under those conditions, a very high level of quality was sought, and in fact was achieved in the vast majority of cases. Console gamers have spent the last 20+ years playing games that largely did not crash, did not glitch (obtrusively), and did not have show-stopping bugs. PC gamers have spent the last 20+ years trying, and failing, to get games not to do any of these things.

My point is that console gamers have come to expect a certain level of quality and professionalism, and console game makers have responded accordingly. PC gamers have come to expect patches, hotfixes and workarounds, and PC game makers have become complacent when it comes to errors, and contemptuous towards their users. This does not bode well for "establishment" PC game makers trying to break into the console market. I believe they are, one by one, doomed to fail in this regard.

Unreal Tournament 3 crashes all over the place on PS3. Oblivion: GOTY has characters which, when spoken to, display "I HAVE NO GREETING" default errors. Call of Duty 4's level and art design is aesthetically appalling. The best titles PC gaming has to offer typically end up as second- or third-rate titles when it comes to console gaming. A lot of this has to do with control schemes. RTS titles and games like The Sims are fundamentally unsuited to a console controller. But it also has to do with the overall quality of PC titles, which, when compared to console titles, simply don't make the grade.

It works both ways. Titles among the best that console gaming has to offer typically do not fare well when ported to PCs. Final Fantasy VII, Metal Gear Solid 2, Halo. However, this is likely due to control and framerate issues, and with PC gamepads becoming more common (the Xbox 360 pad is plug-and-play in Windows) and graphics cards improving, these issues are alleviated somewhat.

However, PC game makers have a much higher hurdle to overcome if they want to break into the console market. They need to overcome a culture of complacency. A culture that allows games to be released that will not work without a patch. The culture that allows a game to be shipped with known bugs still present. The culture that thinks graphics improvement means simply increasing texture rates and bloom and has no time for aesthetic design. The culture that essentially holds technical metrics in awe and game players in contempt. It is a culture driven in large part by the backing of PC hardware manufacturers and not the feedback of gamers.

I was looking forward to Fallout 3. But I will no longer be buying it when it arrives. I have been burned quite badly by Bethesda already, and I have no reason to believe that they will change their ways. It's a similar situation with many PC gamer companies. They are steeped in a culture that simply will not work in the console world. I expect many to simply stop releasing console ports in the years ahead, as it becomes clear that console gamers will not tolerate half finished or unsupported products.

There's something to be said for PC gaming. But professionalism among PC game makers is not it.

User Journal

Journal: eBay beats Tiffany's in trademark case 2

Journal by NewYorkCountryLawyer
Tiffany's has lost its bid to hold eBay liable for trademark infringement of Tiffany's brands taking place on eBay. After a lengthy bench trial (i.e. a trial where the judge, rather than the jury, decides the factual questions), Judge Richard J. Sullivan has issued a 66-page decision (PDF) carefully analyzing the facts and legal principles, ultimately concluding that 'it is the trademark owner's burden to police its mark, and companies like eBay cannot be held liable for trademark infringement based solely on their generalized knowledge that trademark infringement might be occurring on their websites'.
User Journal

Journal: Corsec67's crazy tax rewrite 1

Journal by corsec67

This is my (quite crazy compared to what we have) tax rewrite for the US. It is inspired in part by Ron Paul.

My solution is to pass a constitutional amendment that prevents the federal government from collecting any fees, taxes, donations, or seized goods from any citizen. Also in that amendment would be a provision that income may not be taxed by any state, county, or other local government.

A sales tax is specifically allowed, as the primary method for states and governments to get money.

Revenue for the federal government would be provided by the state governments alone, with some specifications for how much each state pays, whether that is by population, a percentage of that state's revenue, or such. Aside from certain declarations of emergency by a state, federal monies would also not be allowed to benefit a state or other local government.

And that is the plan. It is very revolutionary, and quite different from what we have now.

Gone is the IRS, along with the huge accounting businesses that have sprung up to help people interpret that crazy tax code. Also, the federal government would no longer be able to take money from people in a state and then hold that money hostage to pressure that state into passing certain kinds of laws (the 55 mph speed limit, 21 drinking age, and 0.08% BAC, to name a few).

The government has to get money somehow, and a sales tax by the state and other local governments is all that is left. This would mean that it doesn't matter where your income comes from, nor does the government at any level need to know. Nor do citizens that don't run a business have to even collect and remit to the government any taxes at all, since the current sales tax infrastructure would take care of that.

User Journal

Journal: Dow Jones MarketWatch likens RIAA to the Mafia 11

Journal by NewYorkCountryLawyer
According to commentator Therese Polletti at Dow Jones MarketWatch, "the RIAA's tactics are nearly as bad as the actions of mobsters, real or fictional. The analogy comes up easily and frequently in any discussion of the RIAA's maneuvers." Among other things she cites the extortionate nature of their 'settlement negotiations' pointed out by Prof. Bob Talbot of the University of San Francisco School of Law IP Law Clinic, whose student attorneys are helping private practitioners fight the RIAA, the illegality of the RIAA's use of unlicensed investigators, the flawed evidence it uses, and the fact that the RIAA thinks nothing of jeopardizing a student's college education in order to make their point, as support for the MAFIAA/Mafia analogy.
