Open Source

Ask Slashdot: What's The Best Place To Suggest New Open Source Software? 213

dryriver writes: Somebody I know has been searching up and down the internet for open source software that can apply GPU pixel shaders (HLSL/GLSL/Cg/SweetFX) to a video and save the result out to a video file. He came up with nothing, so I said, "Why not petition the open source community to create such a tool?" His reply was, "Where exactly does one go to ask for new open source software?"

So that is my question: Where on the internet can one best go to request that a new open source software tool that does not exist yet be developed? Or do open source tools only come into existence when someone -- a coder -- starts to build the software, opens the source, and invites other coders to join the fray?

This is a good place to discuss the general logistics of new open source projects -- so leave your best answers in the comments. What's the best place to suggest new open source software?
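For what it's worth, the inner loop of the tool being requested is easy to sketch. Below is a rough CPU-side approximation in Python/NumPy; a real tool would compile the user's HLSL/GLSL and run it on the GPU (via an OpenGL binding, for instance) and would pipe frames through ffmpeg for decoding and encoding. The inversion "shader" here is purely a hypothetical stand-in, not any existing project:

```python
import numpy as np

def invert_shader(frame: np.ndarray) -> np.ndarray:
    """A stand-in 'pixel shader': invert RGB, the NumPy analogue of
    GLSL's `vec3(1.0) - color`. A real tool would run user-supplied
    HLSL/GLSL on the GPU instead of doing this on the CPU."""
    return 255 - frame

def process_frames(frames, shader):
    """Apply a per-pixel shader to each decoded frame. In a full tool,
    `frames` would come from an ffmpeg rawvideo pipe and the results
    would be piped back into ffmpeg for encoding to a video file."""
    return [shader(f) for f in frames]

# Tiny demo on a synthetic two-frame, 2x2-pixel "video":
clip = [np.zeros((2, 2, 3), dtype=np.uint8),
        np.full((2, 2, 3), 255, dtype=np.uint8)]
out = process_frames(clip, invert_shader)
```

The point of the sketch is that the video plumbing is the easy part; the real work in such a tool would be hosting a GPU shader pipeline and exposing the shader parameters to the user.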
Programming

Author of Swift Language Chris Lattner is Leaving Apple; We're Interviewing Him (Ask a Question!) (swift.org) 338

Software developer Chris Lattner, who is the main author of LLVM as well as Apple's Swift programming language, is leaving Apple, he said today. From a post: When we made Swift open source and launched Swift.org we put a lot of effort into defining a strong community structure. This structure has enabled Apple and the amazingly vibrant Swift community to work together to evolve Swift into a powerful, mature language powering software used by hundreds of millions of people. I'm happy to announce that Ted Kremenek will be taking over for me as "Project Lead" for the Swift project, managing the administrative and leadership responsibility for Swift.org. This recognizes the incredible effort he has already been putting into the project, and reflects a decision I've made to leave Apple later this month to pursue an opportunity in another space. We're delighted to share that we are interviewing Lattner, who says he's a "long-time reader/fan of Slashdot." Please leave your question in the comments section. Lattner says he'll talk about "open source (llvm/clang/swift/etc) or personal topics," but has requested that we do not ask him about Apple, which is understandable.

Update: Lattner is joining Tesla.
Education

Ask Slashdot: What's The Best Job For This Recent CS Grad? 259

One year away from graduating with a CS degree, an anonymous reader wants some insights from the Slashdot community: [My] curriculum is rather broad, ranging from systems programming on a Raspberry Pi to HTML, CSS, JavaScript, C, Java, JPA, Python, Go, Node.js, software design patterns, basic network stuff (mostly Cisco) and various database technologies... I'm working already part-time as a system administrator for two small companies, but don't want to stay there forever because it's basically a dead-end position. Enjoying the job, though... With these skills under my belt, what career path should I pursue?
There are different positions as well as different fields, and the submission explains simply that "I'm looking for satisfying and rewarding work," adding that "pay is not that important." So leave your suggestions in the comments. What's the best job for this recent CS grad?
Toys

Ask Slashdot: What's The Most Useful 'Nerd Watch' Today? 232

He's worn the same watch for two decades, but now Slashdot reader students wants a new one. For about 20 years I've used Casio Databank 150 watches. They were handy because they kept track of my schedule and the current time. They were very cheap. They required very little maintenance, since the battery lasted more than a year and the bands lasted even longer. Since they were waterproof, I didn't even have to take them off (or remember where I put them!) They were completely immune to malicious software, surveillance, and advertising. However, their waterproof gaskets have worn out, so they no longer work for me. Casio no longer makes them or any comparable product (their website is out of date).
Today's watches include everything from heart rate monitors to TV remote controls, and Casio even plans to release a new version of their Android Wear watch with a low-power GPS chip and mapping software. But what's your best suggestion? "I don't want a watch that duplicates the function of my cell phone or computer," adds the original submission -- so leave your best answers in the comments. What's the most useful nerd watch today?
Programming

Ask Slashdot: How Would You Deal With A 'Gaslighting' Colleague? 427

An anonymous reader writes: What's the best unofficial way to deal with a gaslighting colleague? For those not familiar, I mean "bullies unscheduling things you've scheduled, misplacing files and other items that you are working on and co-workers micro-managing you and being particularly critical of what you do and keeping it under their surveillance. They are watching you too much, implying or blatantly saying that you are doing things wrong when, in fact, you are not...a competitive maneuver, a way of making you look bad so that they look good." I'd add poring over every source-code commit, and then criticizing it even if the criticism is contradictory to what he previously said.
The submission adds that "Raising things through the official channels is out of the question, as is confronting the colleague in question directly as he is considered something of a superstar engineer who has been in the company for decades and has much more influence than any ordinary engineer." So leave your best suggestions in the comments. How would you deal with a gaslighting colleague?
Software

Ask Slashdot: Why Are Some Great Games Panned and Some Inferior Games Praised? (soldnersecretwars.de) 145

dryriver writes: A few years ago I bought a multiplayer war game called Soldner: Secret Wars that I had never heard of before. (The game is entirely community maintained now and free to download and play at www.soldnersecretwars.de.) The professional reviews completely and utterly destroyed Soldner -- buggy, bad gameplay, no single-player mode, disappointing graphics, server problems and so on. For me and many other players who did give it a chance beyond the first 30 minutes, Soldner turned out to be the most fun, addictive, varied, satisfying and multi-featured multiplayer war game ever. It had innovative features that AAA titles like Battlefield and COD did not have at all at the time -- fully destructible terrain, walls and buildings, cool physics on everything from jeeps flying off mountaintops to Apache helicopters crashing into Hercules transport aircraft, to dozens of trees being blown down by explosions and then blocking an incoming tank's way. Soldner took a patch or three to become fully stable, but then was just fun, fun, fun to play. So much freedom, so much cool stuff you can do in-game, so many options and gadgets you can play with. By contrast, the far, far simpler -- but better looking -- Battlefield, COD, Medal of Honor and Counter-Strike war games got all the critical praise, made the tens of millions in profit per release, became longstanding franchises and are, to this day, not half the fun to play that Soldner is. How does this happen? How does a title like Soldner, which tried to do more new stuff than the other war games combined, get trashed by every reviewer, while far less innovative and fun-to-play war games like BF, COD and CS sell tens of millions of copies per release and get rave reviews all around?
Advertising

Ask Slashdot: Is Computing As Cool and Fun As It Once Was? 449

dryriver writes: I got together with old computer nerd friends the other day. All of us have been at it since the 8-bit/1980s days of Amstrad, Atari, Commodore 64-type home computers. Everybody at the meeting agreed on one thing -- computing is just not as cool and as much fun as it once was. One person lamented that computer games nowadays are tied to internet DRM like Steam, that some crucial DCC software (e.g. Photoshop) is now available only as a rental, and that many "basic freedoms" of the old-school computer nerd are increasingly disappearing. Another said that Windows 10's spyware aspects made him give up on his beloved PC platform, and that because of this he will use only Linux and Android devices from now on, gaming on consoles instead of a PC. A third complained about zero privacy online, internet advertising, viruses, ransomware, hacking, crapware. I lamented that the hardware industry still hasn't given us anything resembling photorealistic realtime 3D graphics, and that the current VR trend arrived a full decade later than it should have. A point of general agreement was that big tech companies in particular don't treat computer users with enough respect anymore. What do Slashdotters think? Is computing still as cool and fun as it once was, or has something "become irreversibly lost" as computing evolved into a multi-billion dollar global business?
Books

What's the Best Book You Read This Year? 338

The year is almost over. It's time we asked you about the books you read over the past few months. Which ones -- new or old -- were your favourite? Please share just one title in the comments section (and, if you would like, the rest in parentheses). Also, which books are you looking forward to reading in the coming weeks?
Chrome

Slashdot Asks: Why Are Browsers So Slow? (ilyabirman.net) 766

Designer Ilya Birman writes: I understand why rendering a complicated layout may be slow. Or why executing a complicated script may be slow. Actually, browsers are rather fast at doing these things. If you studied programming and have a rough idea about how many computations are made to render a page, it is surprising the browsers can do it all that fast. But I am not talking about rendering and scripts. I am talking about everything else. Safari may take a second or two just to open a new blank tab on a 2014 iMac. And with ten or fifteen open tabs it eventually becomes sluggish as hell. Chrome is better, but not much so. What are they doing? The tabs are already open. Everything has been rendered. Why does it take more than, say, a thousandth of a second to switch between tabs or create a new one? Opening a 20-megapixel photo from disk doesn't take any noticeable amount of time; it renders instantaneously. Browsers store their stuff in memory. Why can't they just show the pixels immediately when I ask for them? [...] Unfortunately, modern browsers are so stupid that they reload all the tabs when you restart them. Which takes ages if you have a hundred tabs. Opera was sane: it did not reload a tab unless you asked for it. It just reopened everything from cache. Which took a couple of seconds. Modern browsers boast about their rendering and script execution performance, but that's not what matters to me as a user. I just don't understand why programmers spend any time optimising for that while Chrome is laughably slow even by ten-year-old standards. Do you agree with Birman? If yes, why do you think browsers are generally slow today?
IT

Ask Slashdot: How Should I Furnish (And Secure) My Work-From-Home Office? 303

"If someone gave you a big chunk of change to build a small one- or two-room office, what would you do?" asks long-time Slashdot reader darkpixel2k, as he plans to build a small office out in his backyard. My plan is to trench CAT6 from our ISP fiber demarc over to the ~12x20 building, wire the structure up for network and power, and furnish it with a small rack, UPS, switch, router, a desk, whiteboard walls, a wireless access point, and an air conditioner for the summer heat... While I have the "big picture" idea in my head, I don't really have a grasp of the fine details that would make it a comfortable work environment... Should I put down carpet and one of those plastic mats for chairs? A friend suggested I wire up speakers so I don't have to listen to my terrible laptop speakers, and a large flat-screen TV so I can display dashboards and statistics.

Lastly, physical security is somewhat of an issue. While everything is insured, downtime of a few days or weeks due to meth heads would be a huge impact to the company and also on my paycheck. I was talking with the local company that builds small office-like structures, sheds, and barns, and they said they can "double up" the 2x4s to strengthen the walls and make a stronger door, but I need to supply my own lock. Should I use some off-the-shelf lock from a big-box hardware store? Should I install a digital lock?

There's more details in the original submission -- but it's also a lot of fun to speculate about what you'd do with a big chunk of change to build your own work-from-home office. So leave your best answers for darkpixel2k in the comments. How should he furnish (and secure) his work-from-home office?
Christmas Cheer

Ask Slashdot: What's The Best Geeky Gift For Children? 204

Everyone's suggesting gifts to teach the next generation of geeks about science, technology, engineering, and math. Slashdot reader theodp writes: In "My Guide to Holiday Gifts," Melinda Gates presents "a STEM gift guide" [which] pales by comparison to Amazon's "STEM picks". Back in 2009, Slashdot discussed science gifts for kids. So, how about a 2016 update?
I've always wanted to ask what geeky gifts Slashdot's readers remember from when they were kids. (And what geeky gifts do you still bitterly wish some enlightened person would've given you?) But more importantly, what modern-day tech toys can best encourage the budding young geeks of today? Leave your best answers in the comments. What's the best geeky gift for children?
Movies

Slashdot Asks: Would You Like Early Access To Movies And Stop Going To Theatres? 341

It appears many major stakeholders in the movie industry want to bring new titles to you within days, if not hours, of when they hit cinemas. Earlier this year, we learned that Sean Parker is working on a service called "Screening Room" -- an idea reportedly backed by Peter Jackson, Steven Spielberg and JJ Abrams -- to bring movies home the same day they show up in theaters. Apple seems interested as well: it is reportedly in talks with Hollywood studios to get iTunes rentals of movies that are still playing on the big screen. Earlier this month, Bloomberg reported that several studios are exploring the idea of renting new movies for $25 to $50 just two weeks after they have hit cinemas.

None of these deals has materialized yet, of course, and it needs to be pointed out that several movie companies have discarded such ideas before, because they know that by offering you new titles so early they stand to lose all the sales of overpriced cold drinks and snacks at the theatre. There are also piracy concerns: if a movie is available early, regardless of the DRM tech these companies deploy, good-enough footage of it will crop up on file-sharing websites almost immediately.

But leaving all those aspects aside, would you be interested in getting new titles just hours or a week or two after they hit the cinemas? Would you want to end the decades-long practice of going to a theater?
Books

Ask Slashdot: Have You Read 'The Art of Computer Programming'? (wikipedia.org) 381

In 1962, 24-year-old Donald Knuth began writing The Art of Computer Programming, publishing three volumes by 1973, with volume 4 arriving in 2005. (Volume 4A appeared in 2011, with new paperback fascicles planned for every two years, and fascicle 6, "Satisfiability," arriving last December). "You should definitely send me a resume if you can read the whole thing," Bill Gates once said, in a column where he described working through the book. "If somebody is so brash that they think they know everything, Knuth will help them understand that the world is deep and complicated."

But now long-time Slashdot reader Qbertino has a question: I've had The Art of Computer Programming on my book-buying list for just about two decades now and I'm still torn...about actually getting it. I sometimes believe I would mutate into some programming demi-god if I actually worked through this beast, but maybe I'm just fooling myself...

Have any of you worked through or with TAOCP or are you perhaps working through it? And is it worthwhile? I mean not just for bragging rights. And how long can it reasonably take? A few years?

Share your answers and experiences in the comments. Have you read The Art of Computer Programming?
Programming

Ask Slashdot: Has Your Team Ever Succumbed To Hype Driven Development? (daftcode.pl) 332

marekkirejczyk, the VP of Engineering at development shop Daftcode, shares a warning about hype-driven development: Someone reads a blog post, it's trending on Twitter, and we just came back from a conference where there was a great talk about it. Soon after, the team starts using this new shiny technology (or software architecture design paradigm), but instead of going faster (as promised) and building a better product, they get into trouble. They slow down, get demotivated, have problems delivering the next working version to production.
Describing behind-schedule teams that "just need a few more days to sort it all out," he blames all the hype surrounding React.js, microservices, NoSQL, and that "Test-Driven Development Is Dead" blog post by Ruby on Rails creator David Heinemeier Hansson. ("The list goes on and on... The root of all evil seems to be social media.") Does all this sound familiar to any Slashdot readers? Has your team ever succumbed to hype-driven development?
Robotics

Slashdot Asks: Will Farming Be Fully Automated in the Future? (bbc.com) 278

The BBC has a report today which, citing several financial institutions and analysts, claims that in the not-too-distant future our fields could be tilled, sown, tended and harvested entirely by fleets of co-operating autonomous machines by land and air. An excerpt from the article: Driverless tractors that can follow pre-programmed routes are already being deployed at large farms around the world. Drones are buzzing over fields assessing crop health and soil conditions. Ground sensors are monitoring the amount of water and nutrients in the soil, triggering irrigation and fertilizer applications. And in Japan, the world's first entirely automated lettuce farm is due for launch next year. The future of farming is automated. The World Bank says we'll need to produce 50% more food by 2050 if the global population continues to rise at its current pace. But the effects of climate change could see crop yields falling by more than a quarter. So autonomous tractors, ground-based sensors, flying drones and enclosed hydroponic farms could all help farmers produce more food, more sustainably, at lower cost. What are your thoughts on this?
Programming

Slashdot Asks: Are You Ashamed of Your Code? (businessinsider.com) 280

Programmer and teacher Bill Sourour wrote a post last week called "Code I'm Still Ashamed Of," where he recounts a story in which he was hired to write code for a pharmaceutical company. Little did he know at the time, he was being "duped into helping the company skirt drug advertising laws in order to persuade young women to take a particular drug," recaps Business Insider. "He later found out the drug was known to worsen depression and at least one young woman committed suicide while taking it." Sourour was inspired to write the post after viewing a talk by Robert Martin, called "The Future of Programming," who argues that software developers need to figure out how to self-regulate quickly as software becomes increasingly prevalent in many people's lives. Business Insider reports: "Let's decide what it means to be a programmer," Martin says in the video. "Civilization depends on us. Civilization doesn't understand this yet." His point is that in today's world, everything we do -- buying things, making a phone call, driving cars, flying in planes -- involves software. And dozens of people have already been killed by faulty software in cars, while hundreds of people have been killed by faulty software during air travel. "We are killing people," Martin says. "We did not get into this business to kill people. And this is only getting worse." Martin finished with a fire-and-brimstone call to action in which he warned that one day, some software developer will do something that will cause a disaster that kills tens of thousands of people. But Sourour points out that it's not just about accidentally killing people or deliberately polluting the air. Software has already been used by Wall Street firms to manipulate stock quotes. "This could not happen without some shady code that creates fake orders," Sourour says. We'd like to ask what your thoughts are on Sourour's post and whether or not you've ever had a similar experience.
Have you ever felt ashamed of your code?
Networking

Ask Slashdot: Could A 'Smart Firewall' Protect IoT Devices? 230

To protect our home networks from IoT cracking, Ceaus wants to see a smart firewall: It's a small box (the size of a Raspberry Pi) with two ethernet ports you put in front of your ISP router. This firewall is capable of detecting your IoT devices and blocking their access to the internet, only and exclusively allowing traffic for the associated mobile app (if there is one). All other outgoing IoT traffic is blocked... Once you've plugged in your new IoT toaster, you press the "Scan" button on the firewall and it does the rest for you.
This would also block "snooping" from outside your home network, and of course, keep your devices off botnets. The original submission asks "Does such a firewall exist? Is this a possible Kickstarter project?" So leave your best answers in the comments. Could a smart firewall protect IoT devices?
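To make the proposal concrete, here is a toy sketch of the policy such a box might generate after its "Scan" step: one allow rule for each device's associated cloud endpoint (if any), followed by a catch-all drop for that device. The device list, addresses, and nftables-flavored rule strings below are all illustrative assumptions, not output from any real product:

```python
# Hypothetical IoT devices discovered by the firewall's "Scan" step.
# Each may have one vendor endpoint its mobile app talks through;
# everything else the device sends toward the internet gets dropped.
devices = [
    {"mac": "aa:bb:cc:dd:ee:01", "name": "toaster",   "allowed_host": "203.0.113.10"},
    {"mac": "aa:bb:cc:dd:ee:02", "name": "lightbulb", "allowed_host": None},
]

def rules_for(device):
    """Emit nftables-style rule strings for one device: permit traffic
    to its single associated endpoint, then drop everything else it
    originates. (Rule syntax is simplified for illustration.)"""
    rules = []
    if device["allowed_host"]:
        rules.append(
            f'ether saddr {device["mac"]} ip daddr {device["allowed_host"]} accept'
        )
    rules.append(f'ether saddr {device["mac"]} drop')
    return rules

ruleset = [r for d in devices for r in rules_for(d)]
```

A real device would also have to learn the endpoint automatically (by observing the device's first connections, or from a vendor database), which is where most of the engineering effort would go.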
Portables (Apple)

Slashdot Asks: Which Windows Laptop Could Replace a MacBook Pro? 315

Last month, Apple unveiled new MacBook Pros, featuring an OLED Touch Bar, Touch ID, and an all-new form factor that shaves off roughly 3mm in thickness. There are three base versions of the 13-inch MacBook Pro with Intel Core i5 processors and 8GB of memory (upgradable to 16GB RAM and dual-core Intel Core i7 processors) for $1,499, $1,799 and $1,999. The base model 15-inch MacBook Pro comes with Core i7 processors and 16GB of memory for $2,399 and $2,799. Of course, adapters and AppleCare support are sold separately. The new laptops are great for Apple users -- but what about Windows users? Is there a Windows laptop that matches the new MacBook Pro in terms of build quality, reliability, and performance? Jack Schofield via The Guardian attempts to help Patrick, who is looking for a PC that matches Apple's new offerings as closely as possible. "I use my Mac for all the usual surfing, watching videos, listening to music and so on," Patrick writes. "I also use Adobe Photoshop pretty heavily and video-editing software more lightly." Schofield writes: The Dell XPS 13 and 15 are the most obvious alternatives to MacBooks. Unfortunately, they are at the top of this price range. You can still get an old-model XPS 13 (9350) for $950, but that has a Core i5-6200U with only 4GB of memory. The latest 9360 version has a 2.5GHz Core i5-7200U, 8GB of memory and a 128GB SSD for $1,050. If you go for a 512GB SSD at $1,150, you're only saving $420 on a new 2.0GHz MacBook Pro. HP's Spectre x360 range offers similar features to Dell's XPS range, except that all the x360 laptops have touch screens that you can rotate to enable "tent" (e.g. for movie viewing) or tablet operation. The cheapest model is the HP Spectre x360 13-4126na. This has a 13in screen, a Core i5-6200U processor, 8GB of memory and a 256GB SSD for $1,050. You can upgrade to an HP Spectre x360 13-4129na with better screen resolution -- 2560 x 1440 instead of 1920 x 1080 -- plus a 2.5GHz Core i7-6500U and 512GB SSD for $1,270.
Again, this is not much cheaper than a 2.0GHz MacBook Pro 13. You could also look at the Lenovo ThinkPad T560, which is a robust, professional 15.6in laptop that starts at $800. Do any Slashdotters have any comparable Windows laptops in mind that could replace a new MacBook Pro?
IT

Slashdot Asks: Is Paperless Office a Dream? (betanews.com) 260

In a new report by Danwood, which surveyed 1,000 office workers, almost half said that they print something every day, and 84 percent said printing things on paper at work was an "important aspect of work." In the past, we have seen a growing trend at many workplaces of things moving increasingly digital, implying strongly that our reliance on paper must be shrinking as a result. From a report: Danwood even cites recent IDC research which says 49 percent of businesses expect their print volumes to increase over the next two years. Eight in ten (80 percent) of respondents say they need paper documents to get their job done. "Despite a move to digitization, organizations remain reliant on print," says Danwood CEO Wes Mulligan. "Businesses are mindful of unnecessary waste when it comes to physical documents, but print and digital will continue to coexist in today's organizations. The easiest way to strike a balance is to look at ways that you can better integrate paper and digital processes to have a real impact on efficiency, productivity and cost reduction." What do you guys think? Will we ever hit a stage where paper has a minimal footprint, if any, at workplaces?

Update: Reader argStyopa shares his views on why paper is here to stay, and for good: (1) Paper is portable and readable in all circumstances. I don't need to fire up a reader, connect to Wi-Fi, turn on a laptop, whatever: here's your piece of paper, read it.
(2) Paper is durable and fixed-format: if I put a paper in a file and come back 10 or even 100 years later, barring catastrophe, it'll still be there. The vagaries of non-cloud storage, and (for the cloud) the evolution of e-storage and e-doc formats, mean that even if I HAVE the file, I might not be able to read/open it. I have enough trouble now opening 25-year-old docs from my college days, plunked out on a Mac SE.
(3) It's harder to edit paper: simply put, e-docs are easier to fake, generally.

Social Networks

Ask Slashdot: Should Web Browsers Have 'Fact Checking' Capability Built-In? 240

Reader dryriver writes: There is no shortage of internet websites these days that peddle "information", "knowledge", "analysis", "explanations" or even supposed "facts" that don't hold up to even the most basic scrutiny -- one quick trip over to Wikipedia, Snopes, an academic journal or another reasonably factual/unbiased source, and you realize that you've just been fed a triple dose of factually inaccurate horsecrap masquerading as "fact". Unfortunately, many millions of more naive internet users appear to frequent sites daily that very blatantly peddle "untruths", "pseudo-facts" or even "agitprop-like disinformation", the latter sometimes paid for by someone somewhere. No small number of these more gullible internet users then wind up believing just about everything they read or watch on these sites, and in some cases cause other gullible people in the offline world to believe in them too. Now here is an interesting idea: What if your internet browser -- whether Edge, Firefox, Chrome, Opera or other -- was able to provide an "information accuracy rating" of some sort when you visit a certain URL? Perhaps something like "11,992 internet users give this website a factual accuracy rating of 3.7/10. This may mean that the website you are visiting is prone to presenting information that may not be factually accurate." You could also take this two steps further. You could have a small army of "certified fact checkers" -- people with scientific credentials, positions in academia or similar -- provide a rolling "expert rating" on the very worst of these websites, displayed as "warning scores" by the web browser. Or you could have a keyword analysis algorithm/AI/web crawler go through the webpage you are looking at, try to cross-reference the information presented to you against a selection of "more trusted sources" in the background, and warn you if information presented on a webpage as "fact" simply does not check out. Is this a good idea?
Could it be made to work technically? Might a browser feature like this make the internet as a whole a "more factually accurate place" to get information from? That's a remarkable idea. It appears to me that many companies are working on it -- albeit not fast enough, many would say. Google, for instance, recently began adding "Fact check" to some stories in search results. I am not sure how every participating player in this game could implement this in their respective web browsers, though. Then there is this fundamental issue: the ability to quickly check whether or not something is indeed accurate. There's too much noise out there, and many publications and blogs report on things (upcoming products, for instance) before they are official. How do you verify such stories? If the NYTimes says, for instance, that Apple is not going to launch any iPhone next year, and every website cites the NYTimes and republishes it, how do you fact-check that? And lastly, a lot of fake stories circulate on Facebook. You may think that's a problem. Obama may think it's a problem. But does Facebook see it as a problem? For all it cares, those stories are still generating engagement on its site.
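The crowd-rating half of the submitter's idea is mechanically trivial; a browser could average per-user accuracy votes for a URL and show the kind of warning described above. The sketch below illustrates just that aggregation step. The 0-10 scale, the threshold, and the wording are all invented for illustration, and the genuinely hard problem of verifying the votes and the underlying claims is untouched here:

```python
def accuracy_warning(votes, threshold=5.0):
    """Average per-user accuracy votes (each 0-10) for a URL and build
    a hypothetical warning string of the kind the submitter imagines.
    Returns None when the site scores above the threshold (no warning)."""
    if not votes:
        return None
    score = sum(votes) / len(votes)
    if score > threshold:
        return None
    return (f"{len(votes):,} internet users give this website a factual "
            f"accuracy rating of {score:.1f}/10. Information here may "
            f"not be factually accurate.")

msg = accuracy_warning([3, 4, 4])    # low-scoring site: warning shown
ok = accuracy_warning([9, 8, 10])    # high-scoring site: no warning
```

Even this toy version hints at the open questions: who gets to vote, how votes are weighted, and how a brigade of coordinated voters is kept from gaming the score.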
