I find these titbits about number theory absolutely fascinating... I took a few courses at undergraduate level that touched on this material - without giving me a solid grounding. What I'd like to know is this: is there a good textbook that would bring me up to speed? I like Wikipedia articles - but I find them disjointed... what I'd like from a textbook is something that leads me through the subject from undergraduate level onwards. Can anyone make any recommendations?
Does LLVM have features for coverage analysis to compare with GCOV?
And how are we measuring the size? What sizes count as typical 'big data'?
Are we talking about detailed information, or inefficient data formats?
Are we talking about high-resolution long-term time series, or are we talking about data that is big because it has a complex structure?
Is the data big because it has been engineered so, or is it begging for a more refined system to simplify?
Perhaps... I'm considering buying a replacement for my 2000-vintage 28" 4x3 ratio CRT TV. I'm not in a hurry as I rarely "watch TV".
I like the Samsungs - especially the ultra-thin 46" ones... with fast refresh and high definition. Their biggest downside is that they aren't competitively priced relative to other manufacturers - IMHO.
I am interested in a seamless way to use the TV to display what would be on my Laptop otherwise... I like the idea of watching internet video on a big screen... and I like the idea of lounging with a keyboard and having a full-PC environment on my wall... but I don't know if these will be mere gimmicks for me.
I don't care about 3D - but I do care about slimline high-resolution displays with great connectivity. Thereafter, for me, it's price, price, price.
Congratulations.... I'm pleased to see that things worked out for you.
When I think about it, I notice a number of weird problems with the idea of dating sites. Free dating sites, inevitably, will be the preferred haunt of the insincere who lack commitment to the idea of forging a new lasting relationship... you'd expect the participants - if genuine at all - to be looking for cheap thrills... encouraged by the sense that, having avoided handing over credit card details, they're in some way shielded by anonymity. Conversely, paid dating sites turn my stomach. I'd have no objection to paying a fair commercial price for introductions to people of interest to me - romantic or otherwise. The snag is that dating sites aren't selling a competitive introduction service - the most charitable description of their business model is that they're trying to 'sell love' - though maybe they should just be regarded as old-fashioned pimps. The obvious lack of integrity in the sales pitch for such services leaves me feeling very negative towards them.
If there was a site that introduced me to groups of locals interested in obscure topics that might interest me - I'd pay for that... assuming the party I paid understood that they were engaged in a merely administrative capacity. I guess that a useful service like that doesn't present the same opportunities to gouge the vulnerable - so I don't expect to be a customer of such a service any time soon.
One of the compelling mathematical insights of Fourier's mathematics...
Perhaps not an ideal way to ask a question, but you sound authoritative.
Can you recommend a book for someone who's broadly familiar with Fourier transforms who wants to get to grips with all of "Fourier's mathematics" rather than just some limited aspects of it as exposed by a particular practical application?
Most programmers (including those egotistical twits who call themselves "developers" or, god forbid, "software engineers") DO need everything laid out in black and white. They also start with the assumption that any problem is the fault of the "lusers" misunderstanding the software or unrealistic expectations.
It seems you overlooked that my post referred exclusively to "competent programmers".
You think that 'programmers' are born with these innate skills.
You're just another nerd who thinks everyone outside your profession is incompetent. Look in the mirror.
Thanks for the chuckle. I trust you notice the irony in this as a response to my suggestion that competent programmers have skills beyond ad-hominem?
Programmers are not 'born' - people (worthy of respect as such) are born. Through application and study, they may become skilled/competent programmers. You seem to be under the illusion that the label 'programmer' is a genetic deviancy - presumably one you don't think afflicts yourself?
Programming competency (obviously) is not a sufficient universal qualification - but those who are able often have a wide range of related transferable skills applicable to a far wider range of activities. It is the responsibility of competent management to make best use of these abilities, and to facilitate effective communication to establish the best possible outcomes.
While I've met a few 'programmers' whose skill set is limited - requiring everything to be laid out in black and white... far more often I find competent programmers are also deeply insightful analysts; innovative problem solvers; dedicated and hard-working, with an eye for accuracy and an ear for honesty. While you can resort to ad-hominem when people disagree with you, such attacks don't work on machines... with fallacious argument off the table, those who program are forced to exercise other skills.
I definitely respect sales and marketing - when it's done well. There's a real skill in creating a buzz about a product or service you can deliver - and in closing deals to generate revenue. However... this does not mean that anyone who associates themselves with sales or marketing is automatically above constructive criticism. A major problem for both sales and marketing is that there's a motivation to short-termism... Marketing can blame someone else if they create a buzz about a product that can never be delivered (and it's easier to get people excited about things that are impossible than the mundane...) Sales suffers from the ABC - "Always Be Closing" problem, too, where there is considerable motivation to promise anything, no matter how dishonest, to 'get the deal done' - especially when some convenient 'office politics' can lay the blame for any subsequent disaster at someone else's door.
The underlying problem with all this is management. If sales and marketing run amok - without clear direction as to the aims of the business - they'll run the company into the ground soon enough. Similar catastrophes hang in the balance with technical staff and R&D... Executives need to both respect their staff, and take responsibility for the big picture... They need to avoid the temptation to micromanage (which leads to inevitable failure); they need to learn to draw on the experience of others - and to delegate without washing their hands of a matter. Without suitable direction, you'll end up with a ramshackle bunch of people all blaming each other as the company fails... this is not the fault of the employees - per se... or, even, of day-to-day management... but of the executive. In large corporations where failure as an executive is rewarded similarly to success, we should expect this sort of organisation-wide failure to be endemic.
I don't want specific media for ebooks. I want an ebook device that accurately displays the printed page.
Where's my A4 300+DPI E-ink tablet that's been promised 'just around the corner' for years now?
Doubling your computational effort to extend your weather forecast to a 24th day might well be justified, as might doubling it again to get an extra hour. Doubling again to get the next few minutes, or again for an extra few seconds is far harder to justify - especially as other addressable factors might have greater influence on the uncertainty of the predictions.
We clearly have a different subjective take on the typical practical value of calculations at the cutting edge of 'brute-force' computation. Without specifics we are unlikely to progress the debate. There are, undoubtedly, some problems that can only be tackled by more grunt (tightly coupled computation) but - in my opinion - progressing these problems head-on, typically, does not offer benefits commensurate with the cost.
The diminishing returns implied by the Lyapunov exponent definitely lend credibility to my claim that much of supercomputing is objectively pointless, but I was anxious not to focus upon only one of the ways in which calculations might be irrelevant.
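Those diminishing returns can be made concrete with a toy model. This is only a sketch - the Lyapunov exponent and error values below are illustrative, not drawn from any real forecasting system - but it shows why each doubling of effort buys a constant, ever-harder-to-justify extension of the forecast horizon:

```python
import math

# Toy error-growth model for a chaotic system: delta(t) = delta0 * exp(lam * t).
# LAM and the initial errors below are illustrative values only, not
# measurements of any real atmosphere or forecasting system.

LAM = 0.5        # hypothetical Lyapunov exponent, per day
TOLERANCE = 1.0  # error level at which the forecast becomes useless

def horizon(delta0, lam=LAM, tol=TOLERANCE):
    """Days until an initial error delta0 grows to the tolerance."""
    return math.log(tol / delta0) / lam

# Suppose each doubling of computational effort halves the initial error.
# The horizon then grows by a constant ln(2)/LAM days per doubling -
# an additive gain for a multiplicative cost.
for doublings in range(5):
    delta0 = 1e-3 / (2 ** doublings)
    print(f"{doublings} doublings -> horizon {horizon(delta0):.2f} days")
```

With these made-up numbers, every doubling of effort extends the horizon by the same ln(2)/0.5 ≈ 1.39 days - no matter how many doublings have already been paid for.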
I'd agree that "bigger is better" - but only if we exclude cost from our assessment. With significant financial overheads for marginal improvement in accuracy, I have to wonder - at the extremes of industry practice - might the same funding have been more effectively deployed otherwise? Might a better strategy be to simply accept the limits of inexpensive computing, and focus on finding more effective approaches to practical problems?
I like supercomputers in the same way I like architectural monuments - there's an element of beauty in stretching technology to ever more extreme goals, but I'm far from convinced that there's an objective, practical, point to any of the calculations they make.
I'm very sceptical about climate change prediction - because, without any calculation, it's blindingly obvious that climate will change (all evidence suggests vast changes throughout history) and - because mankind is significant among life on earth - obviously we should assume a fair chunk to be 'man made'. I seldom see the questions that matter addressed... for example, in what ways can we expect climate change to be beneficial to mankind? When we ask the wrong questions, no matter how large-scale or accurate our computation, it will be worthless. Don't get me wrong, I see immense value in forecasting... but I don't see available computational power as a limiting factor... in my opinion there are two critical issues for forecasting: (1) collecting relevant data accurately; (2) establishing the right kind of summaries and models. While some models are computationally expensive - in my opinion - the reason for attempting to brute-force these models has far less to do with objective research and far more to do with political will to have a concrete answer irrespective of its relevance... The complexity of extensive computation is exploited to lend an air of credibility, in most cases, IMHO.
"Don't worry about the future. Or worry, but know that worrying is as effective as trying to solve an algebra equation by chewing bubble gum. The real troubles in your life are apt to be things that never crossed your worried mind, the kind that blindside you at 4 p.m. on some idle Tuesday."
The reason is simple: avoidable disasters occur not because we haven't done enough calculations - but because the calculations we do are done for the wrong reasons and produce irrelevant results. If we want to move forwards, we need more observation and more intelligent consideration. Iterating existing formulas beyond the extent possible with off-the-shelf technology, IMHO, is unlikely to yield anything significant.
I use various desktop PCs, and I want to share my passwords and bookmarks between them... but I am not comfortable with this personal data in the cloud - even on Google's servers. This is exactly the same reason I use Thunderbird and Lightning with my own mail and calendar servers rather than Google Mail/Calendar... even though I'm disappointed with Thunderbird and Lightning's progress in recent years. I don't want my (potentially sensitive) data lurking in the cloud.
With Firefox, I solved this using XMarks and a personal DAV server on my own hardware accessed over HTTPS.
With Chrome, while XMarks has been ported, it doesn't support personal DAV servers... which is a sticking point for me.
Chrome would probably win me over if it could synchronise bookmarks and passwords against my own server... in spite of my wider concerns about its integration with Google services.
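For what it's worth, the protocol side of syncing against a personal DAV server is simple - the storage is just authenticated HTTP PUTs over HTTPS. Here's a rough Python sketch of the request such a tool would issue; the server URL, credentials, and payload are all placeholders, and this isn't the code any actual add-on uses:

```python
import base64
import urllib.request

# Sketch: pushing a bookmarks export to a personal WebDAV server over
# HTTPS with an HTTP PUT.  URL and credentials are placeholders.

def build_put(url, username, password, payload):
    """Build (but do not send) an authenticated WebDAV PUT request."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    req = urllib.request.Request(url, data=payload, method="PUT")
    req.add_header("Authorization", f"Basic {token}")
    req.add_header("Content-Type", "application/json")
    return req

req = build_put("https://dav.example.net/sync/bookmarks.json",
                "me", "secret", b"{}")
# urllib.request.urlopen(req) would perform the actual upload.
```

The point being: there's nothing in the protocol that stops a browser offering "sync to my own server" - the obstacle is purely a product decision.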
I'm aware that the solution has been leaking out onto the net...
Starting later than most, in spare time, I've trudged through stages One and Two... I've been playing with the stage-3 executable and have disassembled it... though there remains further tedious trudging for me to demonstrate by sensible sequential steps how to go about solving stage-3.
I'm finding it difficult to convince myself that it's worth the effort... I'm sure I can fathom any remaining steps - based upon the fact that there has been little about stages one and two that was actually 'challenging'. It seems silly to plod onwards without 'cheating'.
I was interested principally to find out what sort of skills GCHQ actually want... I never assumed I'd be (one of the) first to solve it. The experience has left me wondering what sort of job this sort of tomfoolery would suit one for. Sure, debugging and OS-level skills can be valuable - but the challenge is most time-consuming because one is required to guess the objective - identifying the intentions of the challenge setter rather than addressing real-world issues.
I guess it's a "no brainer" if you're paying with someone else's money.