Wow, I get the joke wasn't funny, but it's on topic, not off topic. An "overrated" mod would be more appropriate than an "off topic" one.
The study was, itself, racist. It only studied white participants.
I think Inigo Montoya would like a word with you.
Racism is prejudice against a certain race. Studying a race is not racism.
The article was sufficient to demonstrate racial bias even with virtual participants.
It says nothing about black-versus-white as opposed to white-versus-black bias. That's you reading racism into a study where the people involved probably (a) don't care much about specific races and (b) didn't have the time or money to run a larger study covering more options.
Randomly complaining about racism where none exists. Does this mean I get to call you an SJW? Seriously, I don't know. SJW seems to get used more or less for everything at the moment.
Isn't there a Women in STEM or global warming thread for you to infest?
If systemd has any bearing on women in STEM or global warming, then truly its scope has become more vast than any dared to dream or dread.
If you have weeks long running jobs on your desktop you're doing it wrong.
I disagree, speaking as someone who has in fact had a weeks-long job running on my desktop before. If you have a fast PC (desktop processors are often faster per core, at the expense of having fewer cores) and the job is single threaded (sadly, not all workloads are parallelisable), then it makes sense to run it locally.
If power outages happen once per year (I think that's pessimistic) and the job lasts a month, then that's an average of 5 compute-days wasted per year due to power outages. And that's assuming the job doesn't checkpoint itself to allow continuation (mine did).
That was an extreme case, admittedly. Even so, a few-day job or an overnighter is completely normal, and you'd still find forced reboots annoying.
Then complain when their desktop is running like shit
I've never had a problem with that on Linux. Even if you load up the CPU with one full job per core, the system stays very responsive. Hell, even with one job per hyper-thread on Intel, which overloads it by a factor of nearly 2, it's pretty responsive, especially if you renice the jobs.
You have to go waayyy over the capacity of the machine to make it run like crap.
You can even put GPU compute in servers and have a lot less concern for your systems going down.
True, but there are fewer options for those, they're often much more expensive, and depending on the workload GPUs can perform worse in servers than in desktops.
You shouldn't need to pick a module after the Kickstarter offering.
Why not? Sure, it's more expensive in serious bulk to use the module over the components. OTOH, the module has a 6-layer board, expensive 0102 (metric) component placement and expensive QFN placement, not to mention a bit of 2.4GHz RF design which I know perfectly well is beyond me.
Excluding the module, my board is a cheap 2-layer design with nothing smaller than 0603 and SOIC, which is substantially cheaper at assembly houses than leadless packages. The fact that the module has the full 6 layers and internal decoupling caps means I can be a lot less careful with the ground plane on my low-frequency (under 1 kHz) but very low noise analog side than would otherwise be the case.
As a result, we're planning on going into full production (i.e. units for sale, though initially not vast quantities) using an RF module.
Nonetheless it appears that you agree with me that the existence of wireless modules means that doing RF bits isn't really that hard.
Pfff. I never [FREE VIAGRA] reboot anyway [BETTER ONLINE DATING] and my computer is [EMAIL PASSWORD REQUIRED] perfectly secure and [MY CLEAN PC] free of viruses.
My personal rule is that I never say anything online if I wouldn't say it to a room filled with my wife, my boss, my family, and my friends.
Well, not everyone has that luxury. I'm lucky: I have an amazing partner, a few good friends I really trust, and no boss, so I can in principle voice my opinions to them. Not everyone has that, and since I also have a professional reputation to maintain, I don't feel I have the luxury of putting forth all my opinions online with my real name attached.
There are certainly opinions I hold very strongly (it's insane, IMO, that drawn pictures count as child pornography, for instance, and I'm pretty fiercely against strict liability in that case as well) which could land me in all sorts of social crap with many people. I'm glad somewhere like Slashdot exists where I can have proper, in-depth conversations about serious, important topics with people who have the online appearance of rational adults.
(I know people make jokes about the comments, and Betteridge's law etc. still applies, but even so.)
I think being able to say things in a forum (which is generally public) without the forum following you around is important. It also covers the times I'm drunk or, frankly, when my brain isn't working and I say something outrageously stupid. The latter happens more often than I'd like, but AFK it's just words in the wind; here it's recorded in perpetuity.
So, in conclusion, even though I stand by my opinions, and would stand by them publicly if I had to, I still think it is important to be able to discuss things in public but anonymously.
Oh god, you're one of the "SJW means anything I hate" crowd. A clue: if you make up a new meaning of "SJW" every time you use it, it doesn't mean anything.
Yeah, but that's not the point: his point is that wireless links are particularly hard. I'm claiming the profusion of wireless modules has made them rather easy.
Well basically no. Given your statement:
So flame away, fanboys. I'm used to it. The truth hurts, and the more squealing I get, the more I know that I am saying the truth.
I don't expect this reply will get through to you or get a meaningful response. Others might benefit, so here goes:
Even with the standard libraries, there were rarely systems without a lot of custom storage code. By its own claimed abilities for code reuse, C++ was a failure before C++11.
Well, depending on how you mean that, it's either wrong or a triumph of C++. I have used systems that made extensive use of custom containers. The main reason was that they targeted Android, and Android didn't really ship with C++: it shipped with a language looking much like C++, except with several key features and the standard library nuked. It's hardly C++'s fault that languages which are almost-but-not-quite C++ aren't C++.
Other than that one, people mostly seem to make do with the standard containers for things they work for.
In my own code I also make use of two non-STL containers, one for images and one for linear algebra (vectors and matrices). It's a triumph that these work and look and feel just like native or standard library things. They also don't use custom storage code: it's C++ arrays for fixed-sized objects and they're backed by std::vector otherwise.
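To sketch what that looks like in practice (the class name and interface here are invented for illustration, not my actual library): a minimal image container that stores its pixels in a std::vector and exposes STL-style iteration, so it composes with standard algorithms for free:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// A minimal 2D image container backed by std::vector: no custom storage
// code, yet it looks and feels like a standard library type.
template <class T>
class Image {
public:
    Image(std::size_t w, std::size_t h, T fill = T{})
        : width_(w), height_(h), pixels_(w * h, fill) {}

    // 2D pixel access, row-major.
    T& operator()(std::size_t x, std::size_t y) { return pixels_[y * width_ + x]; }
    const T& operator()(std::size_t x, std::size_t y) const { return pixels_[y * width_ + x]; }

    std::size_t width() const { return width_; }
    std::size_t height() const { return height_; }

    // STL-style iterators make <algorithm> and range-for work unchanged.
    auto begin() { return pixels_.begin(); }
    auto end() { return pixels_.end(); }
    auto begin() const { return pixels_.begin(); }
    auto end() const { return pixels_.end(); }

private:
    std::size_t width_, height_;
    std::vector<T> pixels_;
};
```

Because it has begin()/end(), things like std::fill, std::max_element and range-based for loops all work on it with no extra code.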
So now I'm going to make the same mistake again. If we take Stroustrup's publication of The C++ Programming Language in 1985 as the start of the ongoing C++ era, then it took over 25 years for the language to become somewhat OK.
No one will deny that C++11 was late; hell, it was meant to be C++0x. However, you're comparing one of the better languages of today (C++14) with languages from 30 years ago, which is disingenuous. It took C++ 25 years to become a good language relative to others 25 years after its inception, but it was good compared to its contemporaries in the meantime too.
In my estimation, C++ was never a good idea.
Well, the world more or less disagrees with you. It's the only mainstream language out there that provides high-level abstractions with low runtime penalties. The only serious competitors have come along recently and aren't really production ready yet. What else scales as well?
Changing the internal workings of an object is very likely to propagate outside the object.
Not if you code worth a damn. Yes, we all know objects have sizes and changing an object's layout requires a recompile. However, my code compiles on LLVM, GCC and Visual Studio, and has at various times compiled on MIPSpro and whatever the Sun compiler was called. I think someone got it to compile with STLport on Android too, back when that was a thing.
That's six complete, from-scratch reimplementations of the same objects (the standard library). Yet despite those complete changes, no fooling around was required in that regard.
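And when you genuinely need to change an object's internals without touching its clients at all, C++ has had an idiom for that for decades: pimpl (pointer to implementation). A hedged sketch with invented class names:

```cpp
#include <memory>
#include <string>

// widget.h -- the public interface. Clients see only this; the data
// members live behind an opaque pointer, so the internals can change
// without altering this header or the class's size.
class Widget {
public:
    Widget();
    ~Widget();  // defined out of line, where Impl is a complete type
    std::string describe() const;

private:
    struct Impl;                 // defined only in the .cpp file
    std::unique_ptr<Impl> impl_;
};

// widget.cpp -- the hidden implementation. Fields can be added,
// removed or reordered here without clients needing a recompile.
struct Widget::Impl {
    std::string name = "widget";
    int generation = 1;
};

Widget::Widget() : impl_(std::make_unique<Impl>()) {}
Widget::~Widget() = default;

std::string Widget::describe() const {
    return impl_->name + " v" + std::to_string(impl_->generation);
}
```

The cost is one heap allocation and one pointer indirection per object, which is why it's used at module boundaries rather than everywhere.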
Well, I've got one from about 16 years ago but I'm pretty sure they still make them.
It has a large knife, a small knife, a corkscrew, a hook, some scissors, a small flathead blade/can opener combo, a larger flathead screwdriver/can opener combo, an awl, a toothpick and some tweezers.
The handle is a good width and I've used all of the tools repeatedly. The main disadvantage is that the blade is quite soft, so it's now worn down from multiple re-sharpenings.
I still prefer my homegrown lib for lists and trees and such
Uh so how is your library less "bloated" and more readable than the STL?
But when I have to choose some publicly available software, I pick Boost over the STL
So how on earth do you square that with boost mostly being a proving ground for the standard library?
What's wrong with wireless interfaces? There are oodles of Bluetooth, XBee and WiFi modules out there, so unless the requirements are far out (e.g. very small, VERY long range), the problem is easily solved.
Not that I'm defensive, but I say this as someone who is currently developing some hardware that's close to production ready. I'm also mulling a Kickstarter campaign to get over the last bump. It uses a BLE module and it works fine; in fact that was the easiest bit to get working.
Well, I'm not really sure about that. The ultrabooks are generally quite high end as laptops go. Sure, they're not a match for a 17" luggable but unlike the cheap small laptops, they tend to be pretty fast, and have a decent i5 or i7 on board and a decent amount of RAM and drive space. So, the little pocket sized computer would perform awfully compared to an ultrabook.