Sorry, shows how ignorant I am of how things have progressed in the bitcoin world after having stopped participating a couple of years ago. I was under the impression that turning bitcoins into anything else was difficult, but obviously I am not well informed. I tried selling a couple of fractional bitcoins on eBay and just got scammed (losing about $200 worth to thieves with stolen accounts). That's all I've done to try to liquidate them. I'll investigate the options you mentioned, thanks.
A couple of years ago, when I first read the bitcoin whitepaper and was very impressed and excited (how can you read that and *not* be impressed and excited?), I proposed a slight change to the bitcoin protocol to add support for messages querying for "elided" blocks, i.e. a means of asking a peer for only the parts of blocks relevant to a given transaction (the history of all addresses relevant to and leading up to that transaction). The elided blocks would contain just the details necessary for a client to validate a transaction, with the Merkle tree being used to elide most of the data.
With a feature like this, a client would not need to download the entire block chain; they'd just need a) enough of the block chain to be able to validate elided blocks, which wouldn't be much, and b) a peer willing to answer elided block queries for them. Since answering an elided block query takes real work (you have to have the entire block chain indexed in such a way as to make answering such a query efficient, which means storing a lot of data, with proper indexes, proper software, and a connection to the bitcoin transaction firehose, with the costs that entails), I included a mechanism that would let the peer 'charge' you for this service, using the same pseudo-anonymity as regular bitcoin.
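The verification step the client ends up doing is just recomputing a Merkle root from a short proof and comparing it against a block header it already trusts. Here's a minimal sketch of that idea (OCaml; the stdlib's Digest/MD5 stands in for bitcoin's double SHA-256 just to keep it self-contained, and all the names are mine, not from any real client):

    (* A rough sketch of Merkle-branch verification: recompute the root from a
       transaction hash plus the list of sibling hashes on the path to the root.
       Digest (MD5, OCaml stdlib) is only a stand-in for bitcoin's double SHA-256. *)

    type side = SiblingOnLeft | SiblingOnRight

    (* Hash two child hashes into their parent node. *)
    let hash_pair left right = Digest.string (left ^ right)

    (* Fold the proof from the leaf up to the root. *)
    let merkle_root_of_proof leaf_hash proof =
      List.fold_left
        (fun acc (sibling, side) ->
           match side with
           | SiblingOnLeft  -> hash_pair sibling acc
           | SiblingOnRight -> hash_pair acc sibling)
        leaf_hash proof

    (* A client holding only block headers can check an "elided" block by
       recomputing the root and comparing it to the header it already has. *)
    let verify_transaction ~tx ~proof ~expected_root =
      merkle_root_of_proof (Digest.string tx) proof = expected_root

The point being that the proof is logarithmic in the number of transactions in the block, so the client's work and bandwidth stay tiny even as the block chain itself gets huge.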
The idea being that a client should not have to trust a third party to handle their transactions for them, which is the only feasible way to do bitcoin transactions now unless you want to download 15 GB - not really feasible in most circumstances - and connect to the firehose - also not feasible in most circumstances, and it would be much less so were bitcoin to actually be used with any real transaction volume on a global scale. For a small fee (probably cents per transaction, I would guess) you could use a system that didn't require you to trust any peers (unlike the current "we'll hold your wallet for you and do your transactions on your behalf" services that seem to have proliferated to make bitcoin actually feasible for clients).
I started to write it, but then gave up on it because I lost interest. All I got out of it was a couple of bitcoins I bought for fun, that made me about 2 grand (on paper of course, I never sold them and I don't really think it's possible to actually liquidate bitcoins into real money without serious work and headache).
While I agree that the only reason to put acetaminophen into opiates is to ensure that the drug cannot be taken beyond a certain dosage without damaging the patient's liver, I do wonder if the reason really is just a vindictive desire to harm addicts, as others are stating.
More logical to me is the conclusion that the authorities just want doctors to have to be careful with their prescriptions. If there were no acetaminophen, doctors could be pretty liberal in how they prescribe dosages, with little consequence. But add some acetaminophen, and now doctors have to be very aware that there is a certain maximum dosage built into the drug, and they cannot prescribe at a higher dosage without risking being fined or jailed or sued or whatever it is that happens to doctors who misprescribe dangerous drugs.
I suspect that the powers that be have decided that the maximum reasonably beneficial dosage of an opiate is X mg per day, and so they require that enough acetaminophen be added so that X mg per day is also the maximum safe dosage. In doing so they limit the ability of any doctor to prescribe more than what they believe is the maximum beneficial dose. Likely they chose X mg per day because studies showed it was the dosage that would be beneficial in the majority of cases, and they don't see the need for anyone to go above X mg per day and unnecessarily take a larger risk of addiction.
That sounds more reasonable to me than just wanting to hurt addicts.
I don't use illegal drugs and have no interest in doing so but
Lucky dog. I took a business trip to the U.K. and developed an abscess on the airplane. By the time I landed I was in excruciating, nearly panic-inducing pain. And I had a week-long business trip to attend to. I went to a public dentist and they wouldn't do anything for the pain - they gave me some antibiotic pills that they said should take care of the abscess in two or three days. And in the meantime? Just deal with the pain.
I maxed out on ibuprofen and acetaminophen, alternating about 50% above the maximum dose of each every two hours. I would get a slight relief, bringing the pain to almost bearable for about half an hour, and then it would go back up to full pain level. I would sit and rock back and forth in front of the computer in unbearable pain and summon enough energy to concentrate on my job for a few minutes at a time.
I didn't sleep for nearly two days (was badly jetlagged anyway) and not a morsel of food entered my mouth for about 50 hours.
This all started on Wednesday. On Friday night I started to feel a little better, was able to even fall asleep and then on Saturday I woke up and
When I got back to CA my doctor did a root canal. This was on a tooth that had already had a root canal 7 years earlier but his conclusion was "I guess I missed some nerve endings the first time around".
All's well that ends well, I suppose, but
I used to think all programming languages were more or less the same, an opinion based on having programmed in a variety of languages and having noticed how easy it was to understand the gist of a new language pretty much immediately upon seeing it, and to come to understand any nuances involved without too much further study.
Then I ran up against OCaml. And I was humbled. I didn't really realize that there could ever be a computer language as hard to approach as learning a new spoken language, but OCaml showed me the error of my preconceptions.
My favorite moment was when, trying to read and understand some gnarly OCaml code (is there *any* OCaml that isn't gnarly to some degree?), I asked on an IRC channel how I would go about figuring out what the "types" were of variables I was seeing as inputs to procedures and used as local variables within procedures. I was surprised by the answer: you can't. It was recommended that I install an OCaml IDE environment and then have the IDE tell me the types. Why? Because it is more or less impossible to know, from inspection, what the type of anything is. I guess the concept of 'type' is a little too gauche for the OCaml crowd.
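To give a (made-up) flavor of what I mean: nothing in a snippet like the following states a single type, and until you run the inference in your head, or paste it into the toplevel, you don't know what it operates on:

    (* A made-up fragment: no type annotations anywhere, which is normal
       OCaml style. You only learn the types by running inference yourself
       or by asking the toplevel / an IDE. *)
    let rec zip_apply fs xs =
      match fs, xs with
      | f :: fs', x :: xs' -> f x :: zip_apply fs' xs'
      | _ -> []

    (* The toplevel reports:
       val zip_apply : ('a -> 'b) list -> 'a list -> 'b list *)

Perfectly sensible once you know the trick, but there is nothing on the page telling you.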
I never thought I'd run into a language where, in order to read and understand the code, you literally have to *implement a virtual machine in your head and run the code*, but then I ran across OCaml.
I wonder what the brain of someone reading OCaml code would look like under an MRI
Oh and to your point
When I first came out of college and was young and naive, I thought a great software developer was someone who was really smart, really able to solve complex algorithmic problems. But years in the industry have proven to me that while such talents are important, and people like that are needed, those talents are vastly overshadowed by the more important skill of being able to coordinate with other developers, and to manage detail. The hard part of software development is not on the scale of problems that a single developer faces in daily coding tasks; the hard part is taking 100 developers and figuring out how to produce software that is even 50x as large or complex as that which would normally be written by a single programmer.
A single developer will never be able to compete with an entire software development company in building large or complex software. Competitiveness between large groups of developers is where it's at, and the skills and experience needed for that are an entirely different thing than individual programming genius.
Too true. Although, honestly, in most cases the criticism is justified.
Of course when you go back and look at your own code a few years later and think the same thing
I am a programmer and have a degree in Math/Computer Science, but I always scored better on language-related aptitude tests than on math
What about games where the player is stationary? Like sitting in a base shooting down aliens coming at you from all directions. I admit that it's a much more limited subset of gaming than most people are looking for, but I am curious to know whether such a game would alleviate the nausea problem.
I am one of those people who can't read while riding in a car without getting nauseous, so I am very pessimistic about my ability to enjoy a VR headset.
There is a pompous ass here, but I'm pretty sure it's not the O.P. And I don't think it's me either
Understood, and of course people have a right to make money however they see fit.
Won't stop me from trying to plant the seeds of thought, though. I'd be happier if there were fewer people operating from greed and more people trying to enrich themselves and their surroundings in more creative ways, so I don't mind trying to make the point and seeing if it resonates.
Why not go do something useful with your time, try to make money by creating actual value in the world, rather than surrounding yourself with get-rich-quick schemers, scammers, and thieves in the bitcoin world, hoping to score big?
I personally have tried a Mac. After 18 years of nearly Linux-exclusive computing, in 2012 I was wooed enough by the new Retina MacBooks, and tired enough of dealing with Linux suspend/resume/hibernate nightmares on laptops, that I decided to give a Mac a try.
Long story short: I like it, and nearly love it, but am disappointed in many aspects. The hardware is definitely a high point - I can honestly say that I have never owned another laptop with even close to the same quality of hardware design or manufacture as my 15 inch rMBP. And I say that despite having had to return it to the store once to fix the image retention (at the same time getting a mainboard upgrade to fix the video flickering problem that was common on this laptop), and again to fix the wireless that they somehow broke during the first repair, and despite finding that about 6 months later I killed 4 or 5 pixels on my own when a small grain of sand got onto the display and I closed it (the Retina displays have literally *no* protection of the LCD surface and it is easy to pit/scratch them - my fault for taking the laptop on vacation I guess).
Mac OS X has impressed me with how well it integrates with its hardware, how nicely and seamlessly the UI functions, and how good the video drivers are. Also, Apple's Objective-C implementation and libraries are an interesting mix of weirdness and awesomeness, with the very best documentation I have ever read for any programming environment, hands down (Microsoft's Windows documentation is a complete and utter joke when compared to Linux man pages, let alone compared to the incredible documentation that Apple has for its APIs).
However - the Mac is still a letdown in some areas. Printing is surprisingly difficult and bad. I am amazed that a company that created the desktop publishing market and sold the first laser printer can have such an awful print dialog. It's inconsistent between apps and doesn't let me WYSIWYG the printouts whatsoever (I print a lot of coloring pages for my kids, and it's amazingly hard on the Mac, no matter what program I use, to get an image centered and fit to a page for printing). Also, some of the UI misbehaves sometimes - I've taken to completely disabling the wireless status icon in the menu bar because it tends to freeze up the entire menu bar whenever it's searching for networks or otherwise unhappy. Which happens just about every time the laptop comes back from sleep.
Also, I cannot stand the fact that Apple does not give the user the choice between click-to-focus and focus-follows-mouse. Oh my god, the number of times that I have been unable to interact with a program while looking at a web browser or some such because they overlap, and I have to fidget and fuss with window positioning and size to be able to do my work. On any other system without this ridiculous flaw, I can type into my emacs buffer while observing some web page with documentation partially on top of some part of the emacs windows. But on Mac OS X I often just have to give up and copy-paste into a TextEdit window, save that to a file, and then open that within emacs, because I just cannot manage to get the stupid windows to overlap in a way that lets me get what I need to do, done.
I think if Apple were not such fascists about some UI policy, the Mac OS X experience would be a lot better.
But overall, I'm like 90% happy with Mac OS X. Definitely beats the living hell out of Windows.
It's not the dropping into a recovery shell that is the problem; that is something that I would expect to happen (and always happened under init, as well).
It's the fact that the systemd recovery shell is nonfunctional. It doesn't accept keyboard input. I had this happen under Arch Linux, then switched to Fedora, and had the same thing happen. And thus I concluded that it's endemic to systemd, since it's hard to believe that both Arch Linux and Fedora independently managed to screw up systemd's recovery shell in their patches to or configurations of systemd.
When I lived in New Zealand I found their approach most sensible. Cash payments are always rounded to the nearest 10 cents.
Of course, they don't have dollar bills, only dollar coins (and two-dollar coins), which is ridiculously annoying - you end up with a pocketful of heavy, clunky coins when bills would have been easier to deal with.
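The rounding rule itself is about as simple as it gets; a sketch of my understanding of it (I honestly don't remember which way shops round an exact 5):

    (* Round a cash total in cents to the nearest 10 cents, as I understand
       the New Zealand convention. This version rounds an exact 5 up, which
       may or may not match what shops actually do. *)
    let round_cash_cents cents =
      10 * int_of_float (floor (float_of_int cents /. 10. +. 0.5))

    (* e.g. round_cash_cents 1234 = 1230, round_cash_cents 1235 = 1240 *)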