Comment Re:No.. (Score 1) 120
If he'd sign a message using the wallet that is known to be owned by him...
Ever since 2.96 it's gotten crappier with every version. To this day I still usually download 2.96, because the newer ones are such bloated, adware-ridden crap.
2.96 was simple yet powerful: enough features, but not bloated. It worked brilliantly as an MP3 player.
Then came the stupid attempt at playing video (thus losing your playlist), the GUI was turned into bloatware, etc.
I made a mistake on my last system setup: I installed the latest version. Now, every now and then while I'm watching Netflix or videos in VLC, it jumps on top of everything. Yay, that's EXACTLY what I wanted.
So, in other words, you are saying Excel requires less than 40ms latency? That watching a remote movie requires less than 40ms latency? Wow, did Netflix just run fiber to every home to achieve that?
40ms, or thereabouts, being the usual latency on ADSL2 to a geographically nearby server.
If you allow 70ms you can reach from northern Europe to central Europe, etc.
If the latency is below 41.67ms (one frame at 24fps), it's visually imperceptible.
Does Counter-Strike require less than 40ms latency? Yes and no. Many people played it fine on cable with its 60-70ms latency, and most people have 40ms latency nowadays. But I miss the ADSL gen1 days with 15ms latency; it did make a difference.
But if you have sufficiently good remote desktop code, your total latency sits just above the network latency: with 40ms network latency you might see 45ms. And if you are playing against a local remote-desktop server, the network latency is below 0.5ms depending on the number of switches in between, most likely under 0.15ms, at which point it makes no difference anymore.
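To put rough numbers on it, a minimal sketch in Python. The 41.67ms threshold is just one frame at 24fps; the 5ms remote-desktop overhead and 0.15ms LAN latency are illustrative assumptions, not measurements:

```python
# Latency-budget sketch. The frame-time figure matches the ~41.67ms
# threshold above; the overhead numbers are illustrative assumptions.
frame_time_ms = 1000 / 24              # ~41.67ms: one frame at 24fps
adsl_network_ms = 40                   # typical ADSL2 latency to a nearby server
remote_desktop_overhead_ms = 5         # assumed overhead of good remote desktop code

wan_total_ms = adsl_network_ms + remote_desktop_overhead_ms   # ~45ms over ADSL2
lan_total_ms = 0.15 + remote_desktop_overhead_ms              # ~5.15ms against a local server

print(round(frame_time_ms, 2), wan_total_ms, round(lan_total_ms, 2))
# A local server stays far below the one-frame threshold:
print(lan_total_ms < frame_time_ms)
```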
So why wouldn't it work?
It's a mental block that makes you think it can't work!
One problem: supply barely fluctuates.
It is adjusted every two weeks. When there is vast network growth, supply is temporarily higher, yes; when the network shrinks, supply is temporarily lower, yes; but the fluctuation during normal times is smaller than you'd expect, maybe 20%.
So one hour might yield 150 bitcoins, while another yields only 100.
Also, the supply rate will keep going down until the maximum of 21 million is reached.
Actually, supply is limited to roughly 125 new bitcoins per hour (well, slightly more), but it is adjusted bi-weekly.
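As a sketch of where the hourly figure comes from (assuming the 25 BTC per block reward era and the protocol's 10-minute target block interval, neither of which is stated above; at target this works out to ~150 BTC/hour):

```python
# Rough sketch of Bitcoin's supply rate. Assumed: 25 BTC block reward era
# and the protocol's 10-minute target block interval.
BLOCK_REWARD_BTC = 25
TARGET_BLOCK_INTERVAL_MIN = 10

blocks_per_hour = 60 / TARGET_BLOCK_INTERVAL_MIN        # 6 blocks/hour on average
coins_per_hour = BLOCK_REWARD_BTC * blocks_per_hour     # ~150 BTC/hour at target

# If hash rate grows 20% between difficulty adjustments, blocks arrive
# ~20% faster and hourly supply is temporarily ~20% higher:
fast_coins_per_hour = coins_per_hour * 1.20

print(coins_per_hour, fast_coins_per_hour)
```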
If you need the bitcoins today, you will spend that $400.
Mining? Difficulty rises constantly, etc. To get that bitcoin you might need to spend two years mining for it; account for electricity, hardware and time in the cost. It easily winds up costing more than $400.
Further, mining is an investment, not an acquisition.
Mining takes time. Just because a miner costs $90 doesn't mean you'll see $100 back even within a year. Plus electricity, plus increasing difficulty, etc.
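A back-of-the-envelope sketch of why the electricity alone dwarfs the hardware. Every number here (power draw, electricity price, one-year horizon) is a made-up illustration:

```python
# Illustrative mining cost sketch; all inputs are assumptions, not market data.
hardware_usd = 90.0                    # the $90 miner from the example above
power_watts = 600.0                    # assumed power draw
electricity_usd_per_kwh = 0.12         # assumed electricity price
hours = 24 * 365                       # one year of continuous mining

electricity_usd = power_watts / 1000 * hours * electricity_usd_per_kwh
total_cost_usd = hardware_usd + electricity_usd
print(round(electricity_usd, 2), round(total_cost_usd, 2))
```

Even before accounting for rising difficulty, the yearly running cost alone blows well past the $400 spot price in this example.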
Your maths are wrong, and as for the latter example: people will start doing what they *like* instead of what they *must* to have a good life.
Someone wanting to be a doctor, but lacking the cash for the education, will run a restaurant for now while doing night studies on the cheap, prepping until the day his assets have appreciated enough.
In your example, banks accept savings in cash, while giving loans in cash.
After a 50% deflation event, they still have 90k in liabilities and 100k in assets.
The farmer, on the other hand, because his asset is tangible, will lose virtually 50% of the nominal value; but at the same time the buying power of that asset stays the same, i.e. a new farm would cost 50% less, the same real value in assets.
In the real world, however, a bank's balance sheet would likely show close to a 50% decrease in assets (the loans given out), as people would have 50% less capacity to pay off their debts. And because the effect compounds, the bank might lose close to 100% of its assets, since only a few could still pay up, and go belly up. In expectation of further deflation, new lending would also dwindle; few loans would be given out, as people would rather hold on to their money.
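The arithmetic of that bank example, as a sketch. The 50% default rate is an assumed illustration of the compounding effect, not a real figure:

```python
# Sketch of the bank example: 100k in loan assets, 90k in deposit
# liabilities, then a 50% deflation event.
assets = 100_000           # loans given out (nominal)
liabilities = 90_000       # deposits owed (nominal)

nominal_equity = assets - liabilities
print(nominal_equity)      # the nominal balance sheet is initially untouched

# Assumed illustration: deflation halves incomes, so half the loan book defaults.
default_rate = 0.50
surviving_assets = assets * (1 - default_rate)
equity_after_defaults = surviving_assets - liabilities
print(equity_after_defaults)   # negative: the bank is insolvent
```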
Therefore banks are made irrelevant; the only function left to them would be gatekeeping as payment service providers. But who would want them handling such things, with their arbitrary blocking, the chance of losing your money, and absolutely zero protection of assets held with such a provider (e.g. PayPal), unless you absolutely had to in order to conduct business?
Just because there is a 50% deflationary event does not mean a bank can simply say "we'll take half of your money", unless there is some legal loophole, in which case riots ensue next.
What? I don't see any problems in what you describe.
You hold BTC; it gets more valuable over time. If you preorder something, or simply pay upfront for something to be delivered in two years, then that is a problem for the buyer.
lol. That's just a limit of the current clients and the accepted norm. Bitcoin itself is as divisible as computers can represent; the current protocol limit exists only to bound transaction size in the blockchain, nothing else. The division maximum was already increased once.
Bitcoin has no limit on the division. Practical issues could happen, but you can have 0.000…001 bitcoins (hundreds of zeros omitted) if you wish to, or even smaller denominations :)
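For illustration, a Python sketch. The satoshi (10^-8 BTC) base unit is the current protocol unit; the 10^-300 denomination is purely hypothetical, showing that the math (arbitrary-precision decimals) is not the bottleneck:

```python
# Bitcoin amounts as base units. The protocol today counts satoshis
# (10^-8 BTC); the limit is a protocol rule, not a mathematical one.
from decimal import Decimal, getcontext

SATOSHIS_PER_BTC = 10**8
one_satoshi_btc = Decimal(1) / Decimal(SATOSHIS_PER_BTC)
print(one_satoshi_btc)                  # 1E-8

# Arbitrary-precision decimals represent far finer denominations exactly:
getcontext().prec = 400
tiny = Decimal(1).scaleb(-300)          # 10^-300 "bitcoins", hypothetically
print(tiny == Decimal(10) ** -300)      # True
```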
About damn time someone considered the impact of human heat sources, such as cars, electricity generation, etc.
I've been wondering about this for a long time. Why is no one considering it?!
Basically, that heat has nowhere to go. Some of it does radiate to space, but the heat that planet Earth radiates away is rather limited, space being a vacuum.
I wouldn't be surprised if the whole "global warming" were entirely caused by human heat, with pollution playing a minimal part. Seriously though, we shouldn't even blame pollution as long as we allow the supertankers, of which there are tens, each causing more pollution than 55 million cars.
Global warming got renamed to climate change, as heat generated in Brazil might cause the UK to get cooler, etc., due to changing weather patterns.
To anyone talking about "Europe's temperatures getting lower": realize this is not isolated to continents; the locality of these effects is the whole planet. Heat generated in New York will affect Australia, and heat in Australia will affect Moscow.
If one particular place is getting cooler, another place is getting warmer by the same amount.
In other words, what matters is the total energy content (heat) of the planet. A specific locality (a city) only matters for the where and how.
Wow. One should not need to convince people that doing quality work is important.
Just leave that company, or get everyone who opposes doing quality work fired. There really is no middle solution in a situation like this.
lol, that's very easy. It barely takes basic mathematics and iteration. When you use modulus it really doesn't require math skills to figure out.
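A modulus screening test of this kind is, for example, the classic FizzBuzz; a sketch of a passing answer in Python (assuming that is the test being discussed):

```python
# FizzBuzz: print numbers 1..n, but "Fizz" for multiples of 3,
# "Buzz" for multiples of 5, and "FizzBuzz" for multiples of both.
# Solvable with nothing but the modulus operator and a loop.
def fizzbuzz(n):
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15))
```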
And these guys call themselves coders?
I must start giving this test to candidates myself!
Any other good five-minutes-or-less coding tests you've used?
And code gets written faster when it's clean. Way faster.
Code architecture and cleanliness are the key to building features faster and eliminating defects.
Dirty code lends itself to a whole set of problems; messy code even more so.
A well-architected code base with a clean writing style makes things happen faster, makes maintenance easier, and lets everybody understand any particular piece more easily. On large systems, though, it takes a strong mind not only to comprehend the pieces but to understand the whole system and its structure.
The rules are very simple:
* Every function should have at least a short comment on what it does, preferably a docblock
* Every function and variable name should be unshortened, camelCased (my preference) or hyphenated, with no abbreviations or shortened words. For example, instead of uSesAuthGenId, write out userSessionAuthenticationGenerateId. You can also see that name is badly structured; it should be more like users->authentication->generateId
* Insert a comment approximately every 10 lines on average
* Do not type a huge chain of nested function calls on a single line, i.e. $example = array_walk( array_map( $$exampleMethod1, $data1, $conf1 ), $$exampleMethod2, $mode ); is a bad example of code, doubly so if the callbacks use references and change the original data
* A single method should preferably not exceed 100 lines; if it does, it's doing too much
* You should never, ever need indentation deeper than 4 levels. Indentation comes from if clauses, for/while/foreach loops, etc. For example: foreach ($data AS $thisKey => $thisValue) { if ($thisValue['type'] == 'type') { switch ($thisValue['mode']) { case 'thing': if ($something != $something2) { foreach ($thisValue['someArray']) {
* Abstract, but do not abstract too much. Abstract things which take more than, say, 10 lines; but if you are abstracting single-line things through 8 different methods/functions, you are doing something HORRIBLY WRONG (even Zend FW falls for this)
* When you abstract, do not duplicate code; reuse code intelligently instead of copying & pasting
* Separate and isolate the different kinds of code, for example: view, business logic, flow control. MVC is not just a "fancy word" forcing you into "a nasty framework I need to work around". In fact MVC is a very old concept. Learn it, digest it, understand it. Anyone saying Smarty is not MVC, or that Smarty is MC+V, or similar nonsense, does *NOT* understand what MVC is or how to write structured code.
* Above all: reuse, reuse, reuse, reuse. Don't do practically the same thing in 50 different ways; do it once, do it excellently, and reuse it everywhere. This goes for architecture and structure, for layer separation, and for line-level code. For example, each model method shouldn't have its own handling of a MySQL result set; they should ask another model for the results.
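To illustrate the nested-calls rule from the list above with runnable code (Python here, though the inline examples are PHP; all the names are made up for the sketch):

```python
# Bad: nested calls on one line hide the intermediate steps,
# e.g.  result = postprocess(transform(load(source), config), mode)
# Better: name each intermediate result so the flow reads top to bottom.

def load(source):
    # Parse each character of the source string as an integer.
    return [int(x) for x in source]

def transform(rows, config):
    # Scale every row by the configured factor.
    return [row * config["scale"] for row in rows]

def postprocess(rows, mode):
    # Sort ascending, or descending when mode is "desc".
    return sorted(rows, reverse=(mode == "desc"))

raw_rows = load("123")
scaled_rows = transform(raw_rows, {"scale": 10})
result = postprocess(scaled_rows, "desc")
print(result)  # [30, 20, 10]
```

Each step now has a name, can be inspected in a debugger, and cannot silently mutate the original data the way by-reference callbacks in a one-liner can.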
And the most important lessons for today: creating code involves 7 times more reading than writing, and it's 10 times better to spend 50 minutes planning/designing a particular feature and then 5 minutes writing it than to spend 45 minutes writing it and 0 minutes planning it (the first time a change is needed, you will know why).
One can build a given system in ~25k lines of code instead of 200k if it's well structured and thought out.
One can likewise spend either half a year of man-hours building it, or five.
It's all about *executed* high-worth lines of code, not the quantity of low-worth lines written.
My basic benchmark for productive quality? Assuming the code style is properly adhered to: how *few* lines of code there are, and the ratio of comments to code.
God doesn't play dice. -- Albert Einstein