
I also don't post, so I don't know the specifics, but looking at headers, most (half?) of the posts do have an accompanying nzb file, to the point that some search engines rely exclusively on it to build their reports. Yet even with posters providing a manifest (I've called it that too), sometimes you need to do a raw header search, because not all posters provide one.
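For anyone who hasn't looked inside one, the "manifest" is just XML: a list of files, the groups they were posted to, and the message-id of every segment. Here's a minimal sketch of reading one with Python's stdlib (the namespace is the usual newzbin DTD; "example.nzb" is a hypothetical file name):

    import xml.etree.ElementTree as ET

    # Namespace used by nzb files (from the newzbin DTD).
    NS = {"nzb": "http://www.newzbin.com/DTD/2003/nzb"}

    def summarize(path):
        """Print each file in the nzb with its segment count and size."""
        tree = ET.parse(path)
        for f in tree.getroot().findall("nzb:file", NS):
            groups = [g.text for g in f.findall("nzb:groups/nzb:group", NS)]
            segments = f.findall("nzb:segments/nzb:segment", NS)
            size = sum(int(s.get("bytes", "0")) for s in segments)
            print(f"{f.get('subject')!r}: {len(segments)} segments, "
                  f"{size / 1e6:.1f} MB, posted to {groups}")

    summarize("example.nzb")  # hypothetical input

That's the whole format: everything a downloader needs to fetch the articles directly, without ever pulling headers.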
Most newzbin competitors either do NZB aggregation or do direct usenet searches. Newzbin was the only one (that I know of) that did both, and did them well, with a consistent UI that worked across two very distinct environments.
The minimum working requirements are rather high: a working news feed for headers to keep the database in sync, plus editors to manually create reports (which means a lot of community participation). They probably can't go any lower w/o changing their model radically. It would take a lot of development to do it differently (short of starting from scratch without a newsfeed, which is what most competitors do).
Probably the weakest point was the editor system. While you could throw money (hardware) at the technical issues of the newsfeed and database, the creation of reports remained a manual task done by volunteers, and the lack of timely report creation was a reason users often cited when leaving the service. Maybe if more of the report creation had been automated a few years ago, it wouldn't have lost so many subscribers (then again, many subscribers left just because of the newzbin 1 legal issues, so the investment in automating report creation might never have paid off).
Still, the raw speed of usenet and its set-it-and-forget-it nature are so much better than torrents. Torrents take babysitting to make sure you get them right (and you have to keep them around and seeding after you're done if you want to be a good citizen and keep the ecosystem working). With nzbs, you just choose them once, the selection takes seconds, and you're watching content in minutes (and there are many automation tools that blow RSS out of the water).
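To give a feel for that automation: most indexers expose a newznab-style search API that returns RSS, and downloaders like SABnzbd accept an nzb URL through their own API, so the whole pipeline is a couple of HTTP calls. A rough sketch (the host names and API keys are placeholders; check your indexer's and SABnzbd's docs for the exact parameters):

    import urllib.parse
    import urllib.request
    import xml.etree.ElementTree as ET

    # Placeholders -- substitute your own indexer and downloader.
    INDEXER = "https://indexer.example/api"
    SABNZBD = "http://localhost:8080/sabnzbd/api"
    INDEXER_KEY = SAB_KEY = "changeme"

    def search(query):
        """newznab-style search; yields (title, nzb_url) from the RSS reply."""
        url = INDEXER + "?" + urllib.parse.urlencode(
            {"t": "search", "q": query, "apikey": INDEXER_KEY})
        with urllib.request.urlopen(url) as resp:
            rss = ET.parse(resp)
        for item in rss.getroot().iter("item"):
            yield item.findtext("title"), item.findtext("link")

    def enqueue(nzb_url):
        """Hand the nzb URL to SABnzbd's queue (its addurl API mode)."""
        url = SABNZBD + "?" + urllib.parse.urlencode(
            {"mode": "addurl", "name": nzb_url, "apikey": SAB_KEY})
        urllib.request.urlopen(url).read()

    for title, link in search("some show 720p"):
        enqueue(link)  # SABnzbd fetches, verifies, repairs, and unpacks
        break          # first hit is enough for this sketch

Run something like that on a schedule (or let a tool like Sickbeard do exactly this for you) and new content just appears: no seeding, no babysitting.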
There is also the liability issue. With torrents, depending on local laws, you're usually liable because you're transferring data to others. With a distributed system like usenet, most legal precedents place the liability on the poster (good luck finding him/her); you're just catching something that's already out there, not taking any further action. It detaches providing something from consuming it.
BTW, Sickbeard can also work with torrent files, but I don't know how much automation it supports.
It's not just you, and not just the usenet archive. It's getting harder to find stuff, even when you know it's out there, and sometimes it's even harder when you're looking for specific keywords (it's like you're working against the grain). Between platitude-only and text-void web sites, Flash, social media noise, and ad-driven algorithms, content is becoming harder to distinguish from irrelevant posts and spam. There is also a strong trend toward showing recent results rather than relevant ones, which only makes things harder when you're looking for something specific.
Hope you find your postings...
You can use addintools Classic Menu for Office. It isn't too expensive ($22 to $35, depending on the edition), and last I checked, it can be deployed by GPO using an MSI. After paying $130 to $500 or more for Office, it might feel like adding insult to injury (you'd expect MS to provide an optional menu alternative, at least with Office 2007). Yet it is an affordable way to increase your productivity if you can't stand the ribbon (or maybe a way to let the users in your organization who aren't comfortable with the ribbon transition gradually).
I agree. It is a gross over-simplification to make this type of technical decision solely based on ideology.
Organizations are going to pay either MS for a (debatably) better product, or technicians to bridge the gaps in other solutions. We have to look at the whole cost of ownership and the gains and losses in productivity. The case can be made that governments and non-profits should use FOSS exclusively, but they also have to be accountable for the productivity of their employees given their specific workflow (something businesses should be more aware of). Potentially wasting man-hours by forcing an organization to use a solution that might not fit its needs, based only on ideology, is far worse than paying a commercial company for proprietary SW.
I like FOSS SW, and I try to use it and support it when I can (which isn't as often as I'd like). However, I'm tired of the false equivalences that get made when two products are considered equivalent because they do the same thing, w/o any regard to how well they do it. We as technical users tend to just install something and move on: we're not always around to see how our users have to deal with the technical decisions we made for them.
Don't get me wrong. MS often leaves a lot to be desired, you sometimes have to wonder what they were thinking (though sometimes you have to take it with a grain of salt and give it a try, and you might be pleasantly surprised), and it can take them a long time to react and make things right. But they seem to be trying, and hitting the mark more often than not (especially with Office).
On a tangent: IMO, the ribbon in Office 2007 was awful, it took a lot to get used to, and it was understandable to refrain from upgrading (and it was a good opportunity for competitors to close the gap and gain market share). However, Office 2010 is far more polished, and the ribbon finally makes sense (mostly the drop-downs with common action items). I still go back to the documentation to find old and trusted menu shortcuts from old versions of Office, but I can see how Office 2010 makes life easier for most users, and especially for newbies.
I wish I could mod you funny
The software base alone for the Windows OS is a primary reason to continue using an old version, even if the next MS release is a flop.
A trend I've noticed in Windows SW is that many commercial titles have stalled, and their offerings look pretty much as they did in 2001. Many commercial developers have been chasing web products for so long that they have let their desktop counterparts wither and look antiquated (and their web offerings are still not fully fleshed out, or even feature-complete enough to compete with the desktop versions).
This, combined with VDI (and cloud services in general), makes a case in some organizations for ditching the Windows client and just running the legacy apps in a RemoteApp window through VDI in the cloud. Then it doesn't matter what OS you run; you can always reach your legacy apps.
What MS needs is to give developers a reason to develop native apps for its platform again (as opposed to iOS or Android), so it can extend its reach past the desktop in a meaningful way. Otherwise, the platform will be relegated to back-end legacy apps that can be run remotely.
The Win32s API and the Windows compatibility layer in OS/2 were a serious threat to MS dominance at the time. They offered app developers a measure of compatibility with the present (OS/2 and NT) and with the yet-unreleased Win95, and they could have stopped MS's push to get Windows to 32 bits in its tracks. If Win95 had taken longer, it would have made sense for more apps to migrate over to OS/2 Warp (or to Win32s, and run on all three operating systems).
What MS lacks right now is a unifying development environment that spans both form factors. Many Windows apps never migrated to WinMo (and WinPhone) because the API was too crippled and it was too difficult. And now the
The DOS 4 flop was pretty bad (most users stayed with DOS 3.3 for the longest time), but it also made DOS 5 and 6 look like gold when they came out, and made it harder to make the case for OS/2, which seemed like too much bloat and closer to DOS 4.
I think of current globalization as -nearly- a return of slavery. Right now, outsourcing to India and China seems cheaper than automation because the initial investment is low. Even if, in the long term, automation within the USA and EU for their own markets would be cheaper and lead to sustained growth and better quality, businesses tend to make only short-term decisions and won't even consider automation.
It is also reminiscent of the aborted Rome/Greece industrial age (No Industrial Revolution in Ancient Greece?): it was cheaper to keep slaves than to invest in building machines that would run on steam power. Right now, even though we have the technology, it is considered good business practice to outsource labor to low-tech markets because the startup cost is low, with no regard for quality or long-term viability.
That's the competitive advantage that businesses are giving up when they replace their hard-earned and paid-for local systems in favor of the new and shiny Cloud.
Then companies become shells geared towards sales. The only thing these businesses will do is figure out ways to sell the re-branded generic products.
As customers realize that most providers are pretty much the same, there will be more and more pressure to lower prices, which will force these same businesses to figure out ways to be competitive (cheaper and distinctive). Some will figure out ways to combine cloud offerings and/or their own proprietary tech, and might come up with new distinctive offerings. Most will be bypassed and the market will consolidate (i.e., they will go out of business, since the only one doing the work is the Cloud provider).
MS is just jumping on the bandwagon. Windows is perceived as "cloud late", so it is fighting the perception.
Cloud concepts can be useful tools in the IT arsenal. But we have to remember that not everything is a nail, and right now we're still in the phase where managers think their cloud hammer is good for everything.
Cloud services make a lot of sense for retail, some manufacturing, and store fronts. They make no sense for specialized service and office workers. Trying to use them for everything is the proverbial square peg in the round hole. Users haven't realized the discrepancy yet. The question is, how long will it be before the reaction and correction begin?
Indeed, having Cloud services complement and augment a local infrastructure can be a good idea. However, most public cloud providers aren't doing this, and are bypassing the local infrastructure completely (assuming one exists in the first place). Hybrid and private clouds can make a lot of sense, but managers don't make the distinction.
And the question of productivity isn't being addressed. Many cloud systems are severely limited in performance, to the point that they can't compete with local systems; there is no 1:1 equivalence between LAN and Cloud. But business managers just hear "Cloud" and can't wait to jump. And then they become trapped in low-performance closed systems, and it takes a major event for them to roll back and rebuild their local systems.
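The gap is easy to measure before committing. A quick sketch comparing median round-trip times (both URLs are placeholders; point them at a LAN app server and its cloud counterpart):

    import time
    import urllib.request

    # Placeholder endpoints -- swap in your LAN server and its cloud twin.
    ENDPOINTS = {
        "local": "http://lan-server.local/ping",
        "cloud": "https://app.cloud-provider.example/ping",
    }

    def median_rtt_ms(url, samples=20):
        """Median round-trip time of a small GET, in milliseconds."""
        times = []
        for _ in range(samples):
            start = time.perf_counter()
            urllib.request.urlopen(url).read()
            times.append((time.perf_counter() - start) * 1000)
        times.sort()
        return times[len(times) // 2]

    for name, url in ENDPOINTS.items():
        print(f"{name}: {median_rtt_ms(url):.1f} ms median round trip")

A LAN round trip is typically a millisecond or less; a cloud round trip is tens of milliseconds, and a chatty app pays that difference on every single call.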
Your good nature will bring you unbounded happiness.