Over a year ago, they released a 75MB full backup of the site that contained all of the torrents. Does anyone know if a more recent one exists for others to build upon?
File Name: https://github.com/justicezyx/...
They claimed copyright on a 1-byte file called README. This is ridiculous.
As Luke on phoronix points out, "Webhosts should block Cyveillance, PicScout, etc. None of those automated copythug bots respect robots.txt, and all of them can be construed as violating the TOS of any website that posts a demand that they stay away. One website (https://dcdirectactionnews.wordpress.com) has posted a legal notice that every access by PicScout could cost them $10,000 in liquidated damages, essentially a reverse "Getty Letter" against them. I suspect Cyveillance is about to get added to that notice, along with all their clients.
GitHub should post similar terms, and if they control the server they can also block these bots directly. So should this forum, phoronix itself, and as many websites as possible to shut down these parasites. PicScout in particular uses so much bandwidth that some smaller websites have incurred significant extra data costs until they blocked PicScout."
The problem with the BRG battery results is that they rely on LG's "optimizations," which only work when you're doing things like browsing the web. When you need to push the device, like playing 3D games, all bets are off. If the benchmarks showed the G3 looping a N.O.V.A. 3 demo, I'd be impressed, but they don't.
The real issue here is that even my G2 slows down in heavy fights in FPS games and doesn't have enough battery life. With the G3, they increased the number of pixels that must be rendered by nearly 2x: (2560×1440)÷(1920×1080). More pixel switching is a double whammy, since the LCD takes more power and the GPU takes more power to push those extra pixels. So unless the G3's GPU is MORE than 2x as powerful and the memory bandwidth is MORE than 2x the G2's, you're going to end up with a SLOWER phone. And even if LG has done it, I'd much rather see that extra GPU power used to make better-looking games and have a phone with longer battery life.
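The "nearly 2x" figure comes straight from the panel resolutions; a quick sanity check of the arithmetic:

```python
# Pixel counts of the two panels (resolutions as quoted above)
g3_pixels = 2560 * 1440  # LG G3, QHD
g2_pixels = 1920 * 1080  # LG G2, 1080p

ratio = g3_pixels / g2_pixels
print(round(ratio, 2))  # → 1.78, i.e. ~78% more pixels to render per frame
```

So the GPU and memory subsystem need roughly 1.78x the throughput just to hold the same frame rate, before any battery savings.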
So who cares? I care, because it's a shitty engineering trade-off.
I have an old Radeon X1950 PRO in a guest/spare PC. While it's getting long in the tooth, it's still good enough for some StarCraft 2 and Dota 2 action with friends. Unfortunately, I have to boot to Windows 7 to get decent performance. The kernel devs are always changing the driver interface, so the last time I was able to use the proprietary drivers was around Ubuntu 6. Now in Linux my only options are the buggy, glitchy drivers Phoronix described, or booting to Windows. The hardware specs were released. If, after 8 years, the open source drivers are still buggy and slow, they will never be as good as the proprietary ones. What Linux needs is a stable driver interface like Windows has.
Thanks for the tips. I never considered using Xfwm with LXDE. I may give it a shot after experimenting with it myself first. In the past, I've always found that the further I strayed from the main Ubuntu/RHEL distros, the quirkier and buggier the OS gets. The user was excited about trying Linux, and I really wanted to give him a solid experience that rivaled XP. If Mint/Mate doesn't work, I'll give Xubuntu a shot.
This was exactly my experience when "upgrading" an old 512 MB Core Duo laptop to Linux. GNOME 2 was too heavy and LXDE was lacking features. My first try was with LXDE, but Openbox does not offer the option to move windows without drawing their contents (Bug 3342). As a result, window operations are painfully slow, which was a major downgrade from XP's user experience.
After trying both the nouveau and proprietary drivers, I ended up using the much heavier Mate (GNOME 2) based Mint. Mate has the option to disable window contents while dragging, but with Mint, just a few Firefox tabs get the HD thrashing, which is worse than XP and much worse than LXDE. If the user complains, I'll give XFCE a shot.
It is ridiculous to put such a high-resolution display on a tiny screen. I just recently upgraded from a 720p Nexus 4 to a 1080p Nexus 5. I have great vision, and side-by-side, I can't tell the difference between the two screens for fine text or pictures. While this phone is a great value, the battery life is terrible and games run no better than on its predecessor. If I had a choice, I'd much rather have the N5 with my old N4's 720p screen.
It is decisively beaten by pretty much every graphics card over $100. You can't game at 1080p or use madVR with maximum settings on Iris Pro.
To be competitive on the desktop, Intel needs something about as powerful as a Radeon HD 7850 or GeForce GTX 650 Ti Boost. As of now they aren't even close.
I built my desktop with a 3.4GHz Core i5-3570K Ivy Bridge. Anyone telling you the HD Graphics 4000 is "good enough" for gaming is full of shit. Even on my low-res 1200x1080 monitor, most games struggled to hit 30 FPS at anything above the lowest detail level. When I got into Dota 2, that was the final straw. I caved in and bought a Radeon HD 7850 for $150. The difference is night and day. Integrated graphics are still garbage, worthless for anything beyond Angry Birds.
I dual boot to Linux and have a decent Steam library. The only thing I'll give Intel is that they do make decent open source drivers that perform nearly as well in Linux as in Windows. The AMD open source drivers are terrible for gaming. They get 30-80% of the proprietary drivers' FPS and have major issues with micro-stuttering. And yes, I use the dev drivers from the edgy PPA along with all the tweaks like the SB backend. They still suck.
Apple doesn't need to cheat, because the last phone that was slower than its predecessor was the iPhone 4. Ever since then, every successor has had a faster GPU while rendering the same number of pixels, and therefore outperforms on the benchmarks and in battery life. Above 300 PPI, you are just wasting battery life and hurting performance to display pixels the human eye can't even resolve. I wish more Android manufacturers had the guts to follow Apple's engineering wisdom here.
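The PPI figure follows directly from a panel's resolution and diagonal size. A minimal sketch of the calculation; the example specs are approximate figures from memory, not from the original comment:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Approximate panel specs (assumed for illustration):
print(round(ppi(1136, 640, 4.0)))   # → 326  (iPhone 5, right at the ~300 PPI line)
print(round(ppi(2560, 1440, 5.5)))  # → 534  (LG G3, well past it)
```

By this measure, the QHD phones push far beyond the ~300 PPI threshold the comment argues is the point of diminishing returns.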
As a 12+ year Linux user, I have to give Shuttleworth some rope to hang or prove himself. For example, back in the Gnome 2.x to 3.x transition days, the Gnome panel was broken for widescreen devices like LCD monitors and netbooks. Unity turned out too bloated for my taste, but I fully understand his frustration with Gnome. In the end, for heavyweight desktops, I prefer Unity over Gnome 3.
PulseAudio is fine for playing music, but a real PITA for many hardcore gamers, myself included. I found latency was terrible with Wine + PA, and later saw the developers had issues with PA too. After countless hours lost trying to debug PA issues, I lost all respect for Poettering. The only worse sound server I've encountered is AudioFlinger, and at least that has the excuse of being optimized for battery life over latency. So like Shuttleworth, I'm skeptical about any of Poettering's work.
Now to the meat of the debate, Mir. It's clear X11 is fundamentally broken for modern desktops/GPUs. It needs to die, and I don't care whether it is replaced by Mir or Wayland. I have been hearing about Wayland for years now, and only after Mir was announced did I start to hear about it actually reaching a usable state. I wish they'd work together, but maybe a little competition will help us all to finally rid Linux of X11.
I'm deeply disappointed that this issue was decided on philosophical instead of technical merits. If Clang were superior to GCC in the majority of benchmarks, then I would support this decision. But that's not the case: GCC still leads in most benchmarks and can be an order of magnitude faster when the popular OpenMP library is used. Sadly, BSD users are the losers here.
Agreed. All the important edits I've made were from a desktop, where I could properly research and cite my changes. I could only see this being useful for minor edits and for people in poorer countries who only have smartphones. However, people who can only afford to get online with a smartphone will probably have more urgent issues than editing Wikipedia. I've also seen quite a bit of vandalism from high school addresses. Making it easier for bored teenagers in class to graffiti Wikipedia may not be the best idea.
I always thought Alastair Reynolds' "Revelation Space" would make a great series of movies. Let's see how Hollywood's "Ender's Game" turns out.
While this post is a valuable addition to Drew's analysis, I feel it's not really a rebuttal at all.