Somewhat ironic that my first memory of "freedom" is being locked in a large padded room with 100 kids and John Wayne. Still, it worked out great from a social POV: everyone shopped on Saturday morning because the shops were closed Saturday afternoon and all day Sunday, so after "shopping with the kids", the kids got to burn off their energy and mum and dad got a quiet afternoon to restore theirs.
It could very well be that Microsoft has decided to give something away without expecting anything in return.
Of course they get something in return. They get everybody on the WinRT APIs so that they get 30% of all software sales for Windows. That's worth way more than a Windows license.
They also get OneDrive subscriptions to increase your storage and Microsoft Office subscriptions and they get you searching Bing and they get you buying Skype minutes and they get you buying Surface Tablets and they get you buying movies and music and they get you buying Music Subscriptions and they get you subscribing to Xbox Live and they convince developers to develop for WinRT/Store since there are tons of customers now and that boosts Windows 10 on phones and that attacks the ipad and....
Microsoft profits handsomely by giving you Windows 10 for free. The sooner people get off of Win7 the sooner the App Store cash cow starts getting milked.
So Windows 10 takes Windows 8.1 and spends almost all of its development resources on keyboard and mouse... but you aren't going to upgrade from *Windows 8.1*, which is worse with mouse and keyboard, because you're afraid that Windows 10, Microsoft's return to the mouse-and-keyboard paradigm, is too touch-oriented?
I can't think of a single application which actually tries to swap memory off of PCI-E, because you not only have to load the new memory, you also then have to *reload* the old memory. That's especially true with something like raytracing, where you never know which memory is going to be needed for the next ray. You would be pathetically, uselessly slow. So OK, fine, technically "that's not how GPU memory management works," but back in the real world there are few practical applications where a single task stays useful once memory swapping comes into play.
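As a back-of-envelope sketch of why that swap is so costly (the bandwidth figures below are rough assumptions for a PCIe 3.0 x16 card with GDDR5-class memory, not measurements):

```python
# Rough arithmetic: moving a working set over PCIe vs reading it from VRAM.
# All bandwidth numbers are ballpark assumptions, not measured values.
pcie3_x16_gbs = 16.0     # ~16 GB/s practical PCIe 3.0 x16 throughput
gddr5_gbs = 336.0        # ~336 GB/s on-card bandwidth (980 Ti-class card)
working_set_gb = 6.0     # a full 6 GB card's worth of data

t_pcie = working_set_gb / pcie3_x16_gbs   # time just to move it across the bus once
t_local = working_set_gb / gddr5_gbs      # time to stream the same data from VRAM

print(f"PCIe transfer: {t_pcie:.3f} s, local VRAM read: {t_local:.3f} s")
print(f"roughly {t_pcie / t_local:.0f}x slower over the bus")
```

And that's one transfer in one direction; an access pattern that keeps faulting (like incoherent rays) pays it over and over.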
Seriously, if you're downloading from a third-party mirror, why would you not check the binary's hash against the original? I mean, why would anyone even use Sourceforge for this in the first place? The official website has the official versions, and whatever distro you're using has screened versions in their repos.
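Mechanically, checking a hash is only a few lines. A minimal sketch in Python, where the file name is a stand-in and the "published" hash is computed locally just to keep the example self-contained (in reality you'd copy it from the project's official site):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Hash a file in chunks so large downloads don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Simulate a "download" so the example runs anywhere; the name is a placeholder.
with open("gimp-setup.bin", "wb") as f:
    f.write(b"example payload")

published = sha256_of("gimp-setup.bin")  # in reality: pasted from the project page
if sha256_of("gimp-setup.bin") == published:
    print("hash OK")
else:
    print("MISMATCH - do not run this binary")
```

On a Linux box, `sha256sum <file>` does the same thing in one command.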
Where is the official website? The GIMP is easy; Google knows that it originated at gimp.org. But a search also brings up GIMP at Softonic, 'gimpshop', CNet, and TechRadar -- all of which have probably added malware. If the program were more obscure, finding the correct link would be more difficult.
It would be nice to have one site that served trustable downloads for shareware and open-source code. Sourceforge used to be that site.
Sourceforge used to be the one site I trusted not to contain adware and viruses, because it was near-impossible to add those without the OSS community noticing. Now they're fucking with the code after community review.
What sites still exist that I can trust? Sometimes I need to download apps and code, like when I'm loading up a new PC. Are there any remaining software/shareware sites that do *not* stuff their downloads full of malware?
Sure, it performs exactly the same until you run out of memory, and then its performance goes to zero since it can't do anything. You're getting 12GB of RAM with the Titan X vs the 6GB on the Ti. The Titan is essentially a cheap Tesla for compute-heavy tasks like Adobe Premiere or 3D rendering. I wish, though, that Nvidia would embrace the Titan's position and enable GRID virtualization. AMD has enabled at least GPU pass-through on their entire high-end line. I guess virtual GPUs are still so niche that it's not a high priority. But with the Xbox One running 2 virtual OSes on the same box, it would be great if Microsoft embraced virtualization more broadly.
ATI/AMD has had shit drivers since the 90s. This whole "Nvidia is stabbing us in the back!" doesn't hold water since they've *never* had reliable products.
And why would the accounting department or HR consult with IT before purchasing accounting or sales or HR software? My fiancee is a school registrar. She understands the software she uses very well and is an expert within the limited domain of school registration and billing software. Why would she call her school's contracted IT company to consult before picking out a hypothetical new replacement piece of software to do her job?
Fucking developers, I swear. The fact you even know what Linux is makes you such an outlier and you don't even know it. Technology benefits more than just companies that "make great visual effects"
Fucking IT, I swear. The fact that you know how to manage a server, fix a RAID or set up a router doesn't make you an expert on every fucking aspect of technology in the universe. I'm incredibly technologically proficient, but there is no reason my fiancee would consult with me on what software to use to do her job. Ideally she would design her own software; in the real world she would find a developer and have them make software that meets her needs, but most likely she'll just have to shop on the market for a commercial product already available. None of those scenarios, though, needs any input from IT.
From what the article says, the CEO wasn't trying to get out of paying anything to the workers. The company was asking to be allowed to pay installments so they could avoid bankruptcy.
Let me translate that: the company was nearly broke, and the employees were given a choice: become *investors* and potentially get their money through an installment plan, assuming the company didn't go bankrupt before the plan was complete, for presumably little to no return, or take their owed severance.
I can't believe that employees who were fired might not choose to invest in their former employer who just fired them... especially when they were fired because the business was already going down.
The existing investors were also smart enough to cut their losses and liquidate. If the profit-seeking investors who presumably believed in the company weren't willing to invest, then why the fuck would a laid-off worker choose to invest in the company's future?
Basically, the internet is trolls all the way down.
I don't have anything to add, but I thought that should be posted again.
Oh...you're a moron.
Oh, you're not IT, you say. View it as an epithet, do you? Well then hope against hope that your colocation service fixes the glaring security holes you leave in the dev servers you shift into prod.
I don't view IT as an epithet; I view it as a specific skillset that we don't need full-time in house. IT is about being an expert at OS, network and database management. If we want to deploy OpenStack, we call our contract IT company. If our file server goes down, we call IT. If we're seeing a performance bottleneck in our network, we call IT.
Everybody else though is focused on a completely different task, making great visual effects. To do that we write tools to assist artists, streamline workflow and automate time consuming tasks.
If I have trouble with a Linux box, I call a kickass IT guy who knows Linux backwards and forwards. If I need someone to streamline the workflow for managing a VFX sequence with 800 assets and evolving character rigs, while ensuring that an animator can transfer their animation to a new rig, I'm not going to IT; I'm going to a technical artist who has deep domain knowledge of both character animation and rigging.
If that developer decides that they need a database to track animations between versions they will probably develop on a database on their local workstation. When they're happy and want to move it to production then we meet with IT, tell them "We'll have 40 users with about 10,000 requests per minute." They'll recommend hardware or say that an existing server can handle it and deploy a production ready database. They'll ensure it fits in with our existing security policies, firewalls, access rights etc and then handle maintenance and backup.
Just because someone touches a computer doesn't make them "IT". Not because IT is an insult but because it would redefine the role too broadly. Would you call a technical animator who works on developing fluid simulations an IT person? No.
Similarly, someone who works on the Unreal Engine's source isn't in "IT"; they're a developer. They are working on very specific problems unique to computer graphics or audio or AI or animation, etc. The person, though, who ensures that developer has the infrastructure they need isn't someone to be looked down on; they're just in a very different role. However, when that developer says that they need 10TB of shared storage at 400MB/s for 5 users, then you call IT. That's specifically *not* working two jobs; that's using people where they are most productive. I see the hierarchy as such:
Physicists - Develop principles.
Fabrication Experts - Use principles of physics to create better chips.
Chip Designers - Design processors which can "do work".
Fundamental Software Developers - Write the software to expose the hardware to regular developers. (OSes, drivers, file systems, runtimes, networking stacks, compilers, databases, etc.)
IT - Deploys and maintains the hardware designed by the chip designers and the software written by the systems engineers doing the low-level fundamental work.
Developers - Those who write functional software to solve specific problems.
Users - People who use the software.
As I see it, a fab engineer should understand physics; a chip designer should understand the limitations of fabrication; a systems engineer should understand chip design; IT should understand drivers and other low-level systems engineering; and developers should know how to do limited deployments of their development environment.
Users should ideally be able to write tools to solve their problems.
But to use the obligatory car analogy, I'm not going to call a civil engineer to consult on how to tune the steering on a race car. I will call them and have them design a great race track to drive the car on, but their role is one of deploying and repairing infrastructure for cars to drive on, not to design the cars except where the two overlap by necessity.
The ideal first target is probably memory. That is a circuit that is made from the same few elements banged out billions of times. If you can make a crystal out of memory elements, then you would be able to have enormous memory densities. You could have a mole of bits for a few hundred grams of material.
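To put rough numbers on "a mole of bits" (a back-of-envelope sketch; the one-bit-per-element assumption is mine):

```python
# Arithmetic behind "a mole of bits", assuming one bit per memory element.
AVOGADRO = 6.022e23          # elements per mole

bits = AVOGADRO              # one bit stored per element
bytes_total = bits / 8
zettabytes = bytes_total / 1e21   # 1 ZB = 10**21 bytes

print(f"a mole of one-bit elements holds about {zettabytes:.0f} ZB")
```

That's on the order of 75 zettabytes, which dwarfs any present-day storage device.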
The barriers are enormous. We will have to re-invent every part of a circuit at smaller scales. The main barrier is probably getting the money to do the research, because it will take many decades before we start getting any money back, whereas if we improve the packing density of silicon circuits by (say) 10%, we get huge savings worldwide straight away.
There are other possible products. It would be a lot easier to make a molecular equivalent of tape. The tape might be made of square molecules such as porphyrins, with some magnetic component at the centre, and reactive groups at the corners so it forms into a ribbon or tape with sprockets at the edge. This tape would assemble itself. We would then have to make a reader, but that might be possible without full molecular circuitry. This is not as neat as the solid-state molecular circuit solution, but things like this might be useful stepping stones on the way.