In Scandinavia we have to use a Java applet called BankID to log in to our bank accounts. Over the past few months this has become REALLY frustrating for people who don't know what Java is. Even technicians with a basic understanding of computers have trouble keeping Java up to date (they don't know where to download it, and therefore accidentally download something they shouldn't), and all of them end up infected with that Oracle search toolbar malware.
Pandas http://pandas.pydata.org/ is another great tool for data analysis. It builds on NumPy and is highly optimized, with critical code paths written in C.
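A tiny sketch of what that looks like in practice (assumes pandas is installed; the column names and data are made up):

```python
# Group-and-aggregate with pandas: the groupby/mean machinery runs
# through NumPy and C-optimized code paths under the hood.
import pandas as pd

df = pd.DataFrame({
    "host": ["a", "a", "b", "b"],
    "latency_ms": [12.0, 18.0, 7.0, 9.0],
})

# Mean latency per host: a -> 15.0, b -> 8.0
mean_latency = df.groupby("host")["latency_ms"].mean()
print(mean_latency)
```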
If you are testing whether the person can program or not, then it's fine. But if you're looking for only the cream of super software developers, then something as minor as the person's emotional state that day can have a huge impact on the result.
For me personally, understanding the actual question is the most difficult part. Some people find bizarre mathematical puzzles fun. I prefer puzzles from the real world, like how to get two systems to talk to each other.
There are many reasons!
The FreeBSD Handbook / consistent documentation
However, FreeBSD doesn't excel at everything; for example, Java support is far from production-ready. Another thing I ran into recently: monitoring a lot of files for changes was slow and didn't scale at all, because kqueue needs one file descriptor per file it monitors for filesystem changes. Linux, OS X, and even Windows have scalable, working solutions for this.
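A quick way to see why fd-per-file watching hits a wall: with kqueue's EVFILT_VNODE you must hold one open descriptor per watched file, so the watch count is capped by the process fd limit. This sketch just prints that limit (Unix-only, uses the stdlib `resource` module):

```python
# kqueue-style watching needs one open fd per watched file, so the
# number of watchable files is bounded by RLIMIT_NOFILE. inotify on
# Linux instead multiplexes many watches over a single descriptor.
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"max open fds (soft limit): {soft}")
# Watching 100k files the kqueue way would need ~100k open fds,
# which on a default limit like 1024 fails almost immediately.
```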
Not only have you created an amazing tool, it is open source and the best part...it's actually well documented! Christmas came early this year!
HAMMER can do deduplication with minimal memory requirements; even with only 512 MB of RAM the system stays responsive and fast. HAMMER deduplication doesn't take the hard performance hit that ZFS does: ZFS dedups data in real time, while HAMMER does it offline with a cron job.
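A rough sketch of the difference on the command line (pool and filesystem names are made up; see zfs(8) and DragonFly's hammer(8)):

```sh
# ZFS: dedup is inline -- once enabled, every write hashes the block
# and consults the dedup table, which wants to live in RAM.
zfs set dedup=on tank

# HAMMER: dedup runs offline, typically from a cron job.
hammer dedup-simulate /home   # estimate the dedup ratio first
hammer dedup /home            # then deduplicate out-of-band
```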
I'm not using BTRFS yet, but since send & receive in BTRFS is similar to the ZFS send & receive implementation, you can do really cool things like a superquick backup of a gigantic PostgreSQL database.
The workflow is as follows:
Snapshot the filesystem with PostgreSQL data
Send the snapshot to your backup server
So if I run PostgreSQL on Windows, can I be sure VSS executes psql -c "select pg_start_backup('hourly',true);" before creating the snapshot?
My FreeBSD PostgreSQL backup looks like this and runs hourly.
now=`date '+%Y-%m-%d_%H'`
prev=`date -v-1H '+%Y-%m-%d_%H'`
psql -c "select pg_start_backup('hourly',true);"
zfs snapshot tank/pgsql@$now
psql -c "select pg_stop_backup();"
zfs send -R -i tank/pgsql@$prev tank/pgsql@$now | ssh backup@hpbackup zfs receive -Fdu tank/backup/pgsql
You can do a similar thing on Linux, as BTRFS now supports send and receive.
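A hedged sketch of what the BTRFS equivalent might look like (subvolume paths, snapshot names, and the backup host are made up; see btrfs-send(8) and btrfs-receive(8)):

```sh
# Read-only snapshot of the subvolume holding the PostgreSQL data
# (wrap in pg_start_backup/pg_stop_backup as in the ZFS script)
btrfs subvolume snapshot -r /data/pgsql /data/pgsql@now

# First run: full send. Later runs pass -p <parent snapshot>
# so only the changed blocks travel over the wire.
btrfs send -p /data/pgsql@prev /data/pgsql@now | \
    ssh backup@host btrfs receive /backup/pgsql
```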
You forgot step 7 for Windows: click OK for administrator mode, then next, next, next, and finally make sure to uncheck those extraordinary browser toolbars.
Been there myself. There are so many good looking nuts out there, a dangerous world for a peaceful man.
I work with IPTV and VOD; we have 4 PB of data running on FreeBSD and ZFS, replicated off-site with the send & receive features that come with ZFS. Since we mostly deal with large media files, we have even reversed the replication direction: if the master storage needs to go down for maintenance, the off-site storage becomes the master. At the moment we're looking into using HAST, which will make it even easier to switch which storage site should be the master.
Backwards compatibility. Heck, I still play C64 games! Old games are still fun!
No internal optical disc drive. Make it optional by offering an external device.
Noise > performance. I don't want to hear that box.
Today's Xbox 360 controller is fine; I don't want to buy a new controller just because it comes with an extra button.
I run my own cloud network storage business. Everything is encrypted on the client side, and there is no cheating (à la Bitcasa, which claims it manages to deduplicate encrypted data). Sure, you can upload raw data that you want to share, for example, but you should know that someone else then has the possibility to read and abuse that data.
So I would say the data is safe in our cloud. Sure, we can see how much disk space you're using, but that's pretty much it.
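This is also why honest client-side encryption and server-side deduplication don't mix. A toy illustration (NOT real crypto -- just an XOR keystream built from SHA-256, with made-up names): two users encrypting the same file with their own keys and nonces produce unrelated ciphertexts, so the server has nothing to deduplicate.

```python
# Toy stream cipher for illustration only: XOR the data with a
# SHA-256-derived keystream. Do not use this for real encryption.
import hashlib
import os

def keystream_encrypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """XOR data with a keystream derived from key+nonce+counter."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(
            key + nonce + counter.to_bytes(8, "big")
        ).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

plaintext = b"identical file contents" * 100

# Two users, two random keys/nonces, one identical file...
c1 = keystream_encrypt(os.urandom(32), os.urandom(16), plaintext)
c2 = keystream_encrypt(os.urandom(32), os.urandom(16), plaintext)

# ...yield ciphertexts that share no structure the server can dedup.
print(c1 != c2)
```

Bitcasa-style "dedup of encrypted data" schemes get around this by deriving the key from the file itself (convergent encryption), which is exactly the property that lets the provider learn who stores which file.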