Network

Ask Slashdot: Best Way To Isolate a Network And Allow Data Transfer? 226

Futurepower(R) writes: What is the best way to isolate a network from the internet and prevent the intrusion of malware, while allowing carefully examined data transfer from internet-facing computers? One example of complete network isolation: each user has two computers, sharing a monitor and keyboard through a KVM switch or using two monitors and two keyboards. The internet-facing computer could run a very secure version of Linux. Any data to be transferred to that user's computer on the isolated network would perhaps pass through several Raspberry Pi computers running Linux, each using a different method of checking for malware. Windows computers on the isolated network could be updated using Autopatcher, so that there would never be a direct connection with the internet. Why not use virtualization? Virtualization does not provide enough separation; there is always the possibility of vulnerabilities. Do you have any ideas for improving the example above?
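To make the Raspberry Pi relay stage concrete, here is a minimal sketch of a one-way transfer script that only forwards files that pass a malware scan. It assumes ClamAV's clamscan is installed, and the directory paths are placeholders rather than a recommendation of any particular layout:

    #!/usr/bin/env python3
    # Toy one-way relay: forward a file to the isolated side only if a scan passes.
    # Assumes ClamAV's clamscan is on the PATH; the directories are placeholders.
    import shutil
    import subprocess
    import sys
    from pathlib import Path

    INBOX = Path("/srv/transfer/inbox")    # written to by the internet-facing side
    OUTBOX = Path("/srv/transfer/outbox")  # read by the isolated side

    def scan_and_forward(path: Path) -> bool:
        # clamscan exits 0 when no malware is found, 1 when a file is infected.
        result = subprocess.run(["clamscan", "--no-summary", str(path)])
        if result.returncode == 0:
            shutil.move(str(path), str(OUTBOX / path.name))
            return True
        path.unlink()  # reject: delete the file (a real setup might quarantine it)
        return False

    if __name__ == "__main__":
        for f in INBOX.iterdir():
            if f.is_file():
                ok = scan_and_forward(f)
                print(f"{f.name}: {'forwarded' if ok else 'rejected'}", file=sys.stderr)

A chain of such relays, each running a different scanner as the submitter suggests, would simply repeat this step with a different scan command at each hop.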
Privacy

Ask Slashdot: How Do You Prepare For The Theft Of Your PC? 262

A security-conscious Slashdot reader has theft insurance -- but worries whether it covers PC theft. And besides the hassles of recreating every customization after restoring from backups, there's also the issue of keeping personal data private. I currently keep important information on a hidden, encrypted partition so an ordinary thief won't get much off of it, but that is about the extent of my preparation... What would you do? Some sort of beacon to let you know where your stuff is? Remote wipe? Online backup?
There are a couple of issues here -- privacy, data recovery, deterrence, compensation -- each leading to a different way of answering the question: what can you actually do to prepare for the possibility? So use the comments to share your own experiences. How have you prepared for the theft of your PC?
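On the "beacon" idea, a cron-able sketch is about as simple as it gets: report the machine's hostname and public address to a server you control. The report URL below is a hypothetical placeholder, and the ipify lookup is just one way of learning the public IP; dedicated tools such as Prey do this far more robustly:

    #!/usr/bin/env python3
    # Toy "where is my PC" beacon: phone home with hostname and public IP.
    # REPORT_URL is a placeholder for a server you operate; run from cron or
    # Task Scheduler so a thief who boots the machine reveals its address.
    import json
    import socket
    import urllib.request

    REPORT_URL = "https://example.com/beacon"   # hypothetical endpoint you control

    def current_public_ip() -> str:
        with urllib.request.urlopen("https://api.ipify.org") as resp:
            return resp.read().decode().strip()

    def send_beacon() -> None:
        payload = json.dumps({
            "hostname": socket.gethostname(),
            "public_ip": current_public_ip(),
        }).encode()
        req = urllib.request.Request(
            REPORT_URL, data=payload, headers={"Content-Type": "application/json"}
        )
        urllib.request.urlopen(req)

    if __name__ == "__main__":
        send_beacon()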
Yahoo!

Ask Slashdot: Advice For a Yahoo Mail Refugee 322

New submitter ma1wrbu5tr writes: Very shortly after the announcement of Verizon's acquisition of Yahoo, two things happened that caught my attention. First, I was sent an email that basically said "these are our new Terms of Service, and if you don't agree to them, you have until June 8th to close your account." Subsequently, I noticed that when working in my mailbox via the browser, I kept seeing messages in the status bar saying "uploading..." and "upload complete". I understand that Y! has started advertising heavily in the webmail app, but I find these "uploads" disturbing. I've since broken out a POP client, downloaded 15 years' worth of mail, and am going through it to ensure there are no other online accounts tied to that address. My question to Slashdotters is this: "What paid or free secure email service do you recommend as a replacement, and why?" I'm on the hunt for an email service that supports encryption, has a good privacy policy, and doesn't have a history of breaches or of allowing snooping.
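For anyone else pulling down an archive the same way, the whole job fits in a short poplib script. The server and port below are the commonly documented Yahoo POP settings, the credentials and output directory are placeholders, and Yahoo may require an app-specific password for third-party clients:

    #!/usr/bin/env python3
    # Minimal POP3 download sketch: save every message as a raw .eml file.
    # Host/port are the commonly documented Yahoo settings; credentials and
    # the output directory are placeholders.
    import poplib
    from pathlib import Path

    HOST, PORT = "pop.mail.yahoo.com", 995
    USER, PASSWORD = "you@yahoo.com", "app-password-here"
    OUT = Path("mail_archive")
    OUT.mkdir(exist_ok=True)

    conn = poplib.POP3_SSL(HOST, PORT)
    conn.user(USER)
    conn.pass_(PASSWORD)

    count, _size = conn.stat()
    for i in range(1, count + 1):
        _resp, lines, _octets = conn.retr(i)
        (OUT / f"{i:06d}.eml").write_bytes(b"\r\n".join(lines) + b"\r\n")

    conn.quit()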
Businesses

Ask Slashdot: What Are Some 'Best Practices' IT Should Avoid At All Costs? (cio.com) 348

snydeq writes: From telling everyone they're your customer to establishing a cloud strategy, Bob Lewis outlines 12 "industry best practices" that are sure to sink your company's chances of IT success: "What makes IT organizations fail? Often, it's the adoption of what's described as 'industry best practices' by people who ought to know better but don't, probably because they've never had to do the job. From establishing internal customers to instituting charge-backs to insisting on ROI, a lot of this advice looks plausible when viewed from 50,000 feet or more. Scratch the surface, however, and you begin to find these surefire recipes for IT success are often formulas for failure." What "best practices" would you add?
Hardware

Ask Slashdot: What Would Happen If You Were To Put a Computer Inside a Fridge? 181

dryriver writes: This is not asking what would happen if you were to place your iMac inside your kitchen fridge. Rather, what if the casing for a high-powered graphics workstation with, let's say, multiple CPUs and GPUs worked just like a small fridge or freezer, cooling your hardware down without using any CPU fans, liquid cooling, or the like. How much would such a fridge-casing cost to make and buy, how much electricity would it consume, how much bigger would it be than a normal PC casing, and would it be a practical solution to the problem of keeping high-powered computer hardware cool for extended periods of time? Bonus question: Is such a thing as a fridge-casing or "Fridgeputer" sold anywhere on the world market right now? Linus Tech Tips tackled this question in a video a couple of years ago, titled "PC Build in a Fridge - Does it Work?"
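As a rough back-of-the-envelope answer to the electricity question, consider the sketch below; the wattage and coefficient-of-performance figures are assumptions chosen for illustration, not measurements:

    # Rough energy estimate for a "Fridgeputer" (all figures are illustrative assumptions).
    heat_load_w = 600   # assumed sustained draw of a multi-GPU workstation, in watts
    cop = 2.0           # assumed coefficient of performance of a small compressor fridge

    compressor_w = heat_load_w / cop        # extra electrical power to pump the heat out
    total_w = heat_load_w + compressor_w    # PC plus its refrigeration
    kwh_per_day = total_w * 24 / 1000

    print(f"Compressor draw: ~{compressor_w:.0f} W")
    print(f"Total draw:      ~{total_w:.0f} W (~{kwh_per_day:.1f} kWh/day)")
    # Unlike a food fridge, the load is continuous, and cooling below ambient
    # risks condensation on the hardware -- two reasons off-the-shelf units are rare.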
Python

Ask Slashdot: Will Python Become The Dominant Programming Language? 808

An anonymous reader shares their thoughts on language popularity: In the PYPL index, which is based on Google searches and is supposed to be forward-looking, the trend is unmistakable. Python is rising fast while Java and others are declining. Combine this with the fact that Python is now the most widely taught language in universities. In fields such as data science and machine learning, Python is already dominating. "Python where you can, C++ where you must." Enterprises are following suit too, especially in data science, but also for everything else from web development to general-purpose computing...

People who complain that you can't build large-scale systems without a compiler likely over-rely on the compiler and are slaves to their IDEs. If you write good unit tests and enforce Test-Driven Development, the compiler becomes unnecessary and gets in the way. You are forced to provide too much information to it (also known as boilerplate) and can't quickly refactor code, which is necessary for quick iterations.
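For context, the workflow being described leans on tests like the following, which catch at test time the kind of mistake a compiler would flag at build time (the module and function names are invented for illustration):

    # tax.py -- illustrative module
    def total_price(net: float, vat_rate: float) -> float:
        """Return the gross price for a net amount and a VAT rate such as 0.19."""
        return net * (1 + vat_rate)

    # test_tax.py -- run with `pytest`; in the workflow described above, this
    # suite, not a compiler, is what catches regressions during refactoring.
    import pytest
    from tax import total_price

    def test_total_price_applies_vat():
        assert total_price(100.0, 0.19) == pytest.approx(119.0)

    def test_total_price_rejects_string_input():
        # Passing a string is the sort of mistake a compiler would flag at
        # build time; here the test surfaces it at test time instead.
        with pytest.raises(TypeError):
            total_price("100", 0.19)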

The original submission ends with a question: "Is Python going to dominate in the future?" Slashdot readers should have some interesting opinions on this. So leave your own thoughts in the comments. Will Python become the dominant programming language?
AI

Ask Slashdot: How Can Programmers Move Into AI Jobs? 121

"I have the seriously growing suspicion that AI is coming for us programmers and IT experts faster than we might want to admit," writes long-time Slashdot reader Qbertino. So he's contemplating a career change -- and wondering what AI work is out there now, and how can he move into it? Is anything popping up in the industry and AI hype? (And what are these positions called, what do they precisely do, and what are the skills needed to do them?) I suspect something like an "AI Architect", planning AI setups and clearly defining the boundaries of what the AI is supposed to do and explore.

Then I presume the requirements for something like an "AI Maintainer" and/or "AI Trainer", which would probably resemble something like the admin of a big data store, looking at statistics and making educated decisions about which "AI training paths" the AI should continue to explore to gain the skill required, and deciding when the "AI" is ready to be let loose on the task... And what about TensorFlow? Should I toy around with it, or are we past that stage already, and will others do AI setup and installation better than me before I know how this thing really works...?

Is there a degree program, or other paths to skill and knowledge, for a programmer who's convinced that "AI is today what the web was in 1993"? And if the AI of the future ends up tied to specific providers -- AI as a service -- then are there specific vendors he should be focusing on (besides Google)? Leave your best suggestions in the comments. How can programmers move into AI jobs?
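On the "should I toy around with TensorFlow" question, the entry-level experiment people usually mean is something like the sketch below, using the Keras API that ships with TensorFlow; the data is random noise and the tiny network is arbitrary, purely to show the shape of the workflow:

    # Minimal TensorFlow/Keras "hello world": fit a tiny classifier on fake data.
    # Purely illustrative -- the data is noise and the architecture is arbitrary.
    import numpy as np
    import tensorflow as tf

    x = np.random.rand(1000, 20).astype("float32")   # 1000 fake samples, 20 features
    y = (x.sum(axis=1) > 10).astype("int32")         # a made-up binary label

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(20,)),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(x, y, epochs=5, batch_size=32, verbose=0)

    print(model.evaluate(x, y, verbose=0))   # [loss, accuracy]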
AI

Ask Slashdot: What Types of Jobs Are Opening Up In the New Field of AI? 133

Qbertino writes: I'm about to move on in my career after a short "rethink and regroup" break, and have for quite some time been thinking about getting into perhaps a new programming language and technology, like Node.js or Java/Kotlin or something. But I have the seriously growing suspicion that artificial intelligence is coming for us programmers and IT experts faster than we might want to admit. Just last weekend I heard myself saying to a friend who was a pioneer on the web, "AI is today what the web was in 1993" -- and I think that is very true. So just 20 minutes ago I started thinking and wondering about what types of jobs there are in AI. Is anything popping up in the industry from the AI hype, and what are these positions called, what do they precisely do, and what skills are needed to do them? I suspect something like an "AI Architect" for planning AI setups and clearly defining the boundaries of what the AI is supposed to do and explore. Then I presume the requirements for something like an "AI Maintainer" and/or "AI Trainer," which would probably resemble something like the admin of a big data store, looking at statistics and making educated decisions about which "AI training paths" the AI should continue to explore to gain the skill required, and deciding when the "AI" is ready to be let loose on the task. You see, we -- AFAIK -- don't even have names for these positions yet, but I suspect that, just as in the internet/web boom 20 years ago, that is about to change *very* fast.

And what about TensorFlow? Should I toy around with it, or are we past that stage already, and will others do AI setup and installation better than me before I know how this thing really works? Because I also suspect most of the AI work for humans will be closely tied to services and providers such as Google. You know, renting "AI" the way you rent webspace or subscribe to bandwidth today. Any services and industry vendors I should look into -- besides the obvious Google, that is? In a nutshell: what work is there in the field of AI that can be done, and how do I move into it? Like, now. And what should I maybe get a degree in if I want to be on top of this AI thing? And how would you go about gaining skill and knowledge in AI today -- and I mean literally today? I know, tons of questions, but insightful advice is requested from the educated Slashdot crowd. And I bet I'm not the only one interested in this topic. Thanks.
Data Storage

Why Does Microsoft Still Offer a 32-bit OS? (backblaze.com) 367

Brian Wilson, a founder of cloud storage service Backblaze, writes in a blog post: Moving over to a 64-bit OS allows your laptop to run BOTH the old compatible 32-bit processes and also the new 64-bit processes. In other words, there is zero downside (and there are gigantic upsides). Because there is zero downside, the first time it could, Apple shipped with 64-bit OS support. Apple did not give customers the option of "turning off all 64-bit programs." Apple first shipped 64-bit support in OS X 10.6 Snow Leopard in 2009. This was so successful that Apple shipped all future operating systems configured to support both 64-bit and 32-bit processes. All of them. But let's contrast the Apple approach with that of Microsoft. Microsoft offers a 64-bit OS in Windows 10 that runs all 64-bit and all 32-bit programs. This is a valid choice of an operating system. The problem is that Microsoft ALSO gives customers the option to install 32-bit Windows 10, which will not run 64-bit programs. That's crazy. Another advantage of the 64-bit version of Windows is security. There are a variety of security features, such as ASLR (Address Space Layout Randomization), that work best in 64 bits. The 32-bit version is inherently less secure. By choosing 32-bit Windows 10, a customer is literally choosing a lower-performance, LOWER-SECURITY operating system that is artificially hobbled so it cannot run all software. My problem is this: Backblaze, like any good technology vendor, wants to be easy to use and friendly. In this case, that means we need to quietly, invisibly, continue to support BOTH the 32-bit and the 64-bit versions of every Microsoft OS they release. And we'll probably need to do this for at least 5 years AFTER Microsoft officially retires the 32-bit-only version of their operating system.
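The support burden Wilson describes begins with simply detecting which flavor of Windows a customer is running. A minimal sketch of that check (illustrative only, not Backblaze's actual installer logic):

    # Detect whether this Python process and the underlying Windows OS are
    # 32- or 64-bit. Illustrative only -- not Backblaze's installer code.
    import os
    import platform
    import struct

    process_bits = struct.calcsize("P") * 8   # pointer size of this process
    machine = platform.machine()              # e.g. 'AMD64' or 'x86' on Windows

    # On 64-bit Windows, a 32-bit process sees PROCESSOR_ARCHITEW6432 set.
    os_is_64bit = machine.endswith("64") or "PROCESSOR_ARCHITEW6432" in os.environ

    print(f"process: {process_bits}-bit, OS 64-bit: {os_is_64bit}")
    # A vendor shipping for both would pick the 32- or 64-bit build based on this.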
Media

Ask Slashdot: What Is Your View On Sloot Compression? (youtube.com) 418

An anonymous reader writes: A Dutch electronics engineer named Jan Sloot spent 20 years of his life trying to compress broadcast-quality video down to kilobytes -- not megabytes or gigabytes (the link in this story contains an 11-minute mini-documentary on Sloot). His codec, finalized in the late 1990s, consisted of a massive 370 MB decoder engine that likely contained some kind of clever system for procedurally generating just about any video frame or audio sample desired -- fractals or other generative approaches may have been used by Sloot. The "instruction files" that told this decoder what kinds of video frames, video motion and audio samples to generate were supposedly only kilobytes in size -- much like small MIDI files generating hugely complex orchestral scores by telling DAW software what to play. Jan Sloot died of a heart attack two days before he was due to sign a technology licensing deal with a major electronics company. The source code for the Sloot video compression system went missing after his death and was never recovered, prompting some to speculate that Jan Sloot was killed because his ultra-efficient video compression and transmission scheme threatened everyone profiting from storing, distributing and transmitting large amounts of digital video data. I found out about Sloot Compression only after watching some internet videos on "invention suppression." So the question is: is it technically possible that Sloot Compression, with its huge decoder file and tiny instruction files, actually worked? According to Reddit user PinGUY, the Sloot Digital Coding System may have been the inspiration for Pied Piper, the fictional data compression algorithm from HBO's Silicon Valley. Here's some more information about the Sloot Digital Coding System for those who are interested.
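To make the "huge decoder, tiny instruction file" idea concrete, here is a toy sketch of the architecture as described: a deterministic generator standing in for the decoder engine, driven by a few bytes of parameters per frame, much like the MIDI analogy. It only illustrates the shape of the claim and says nothing about whether such a scheme could reproduce arbitrary real-world footage:

    # Toy illustration of the architecture attributed to Sloot: a (conceptually
    # huge) deterministic "decoder" that synthesizes frames from a tiny
    # instruction record. Shape of the idea only -- not a working codec.
    import random
    import struct

    def decode_frame(seed: int, brightness: int, width: int = 64, height: int = 48):
        """Stand-in for the massive decoder engine: generate a frame procedurally."""
        rng = random.Random(seed)
        return [[(rng.randrange(256) + brightness) % 256 for _ in range(width)]
                for _ in range(height)]

    # A whole "clip" as a tiny instruction file: 6 bytes per frame (seed + brightness).
    instruction_file = b"".join(struct.pack(">IH", seed, b)
                                for seed, b in [(1, 10), (2, 40), (3, 90)])

    frames = []
    for off in range(0, len(instruction_file), 6):
        seed, brightness = struct.unpack(">IH", instruction_file[off:off + 6])
        frames.append(decode_frame(seed, brightness))

    print(f"{len(instruction_file)} instruction bytes -> {len(frames)} generated frames")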
Movies

What Are Some Documentaries and TV Shows That You Recommend To Others? 278

Reader joshtops writes: Wow, thanks for the overwhelming response to my previous post. I'm taking notes and intend to give all of the suggested books a go in the near future. If I may -- and I hope the editors approve of this -- could you also list some of your favorite TV shows and documentaries? Also, is there any show or documentary that you think changed or influenced your life, or at least your perception of a particular subject?
Government

Slashdot Asks: Is Trump's Blocking of Some Twitter Users Unconstitutional? (usatoday.com) 390

An anonymous reader shares an article: Some Twitter users say President Trump should not be able to block them on the social network. The president makes unprecedented use of Twitter, having posted more than 24,000 times on his @realDonaldTrump account to 31.7 million followers. His tweets about domestic and foreign policy -- and about media coverage of him and his administration -- have transformed Twitter into a public forum with free speech protections. That's the opinion of two Twitter users, who have the backing of the Knight First Amendment Institute. They are sending a letter today to the White House asking Trump to unblock them on his @realDonaldTrump Twitter account. Both users say they were blocked recently after tweeting messages critical of the President. Holly O'Reilly (@AynRandPaulRyan), whose Twitter account identifies her as a March for Truth organizer, said she was blocked on May 23 after posting a GIF of Pope Francis frowning at Trump captioned "this is pretty much how the whole world sees you." In the letter to Trump and the White House, the Knight First Amendment Institute's attorneys argue that Trump's Twitter account "operates as a 'designated public forum' for First Amendment purposes, and accordingly the viewpoint-based blocking of our clients is unconstitutional." In other news, Press Secretary Sean Spicer said today that "@realDonaldTrump's tweets are official White House statements."
Programming

Ask Slashdot: How Does Your Team Track And Manage Bugs In Your Software? 189

Slashdot reader jb373 is a senior software engineer whose team's bug-tracking methodology is making it hard to track bugs. My team uses agile software methodologies, specifically scrum with a Kanban board, and adds all bugs we find to our Kanban board. Our Kanban board is digital and similar to Trello in many regards, and we have a single list for bugs... We end up with duplicates and now have a long list to try to scroll through... Has anyone run into a similar situation, or do you do things differently in a way that works well for your team?
The original submission ends with one idea -- "I'm thinking about pushing for a separate bug tracking system that we pull bugs from during refinement and create Kanban cards for." But is there a better way? Leave your own experiences in the comments. How does your team track and manage bugs in your software?
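Whatever tracker the team lands on, one lightweight stopgap for the duplicates problem is fuzzy-matching a new bug title against the existing list before a card is created; a minimal sketch with made-up titles:

    # Flag likely-duplicate bug reports by fuzzy-matching titles before a new
    # card is added to the board. The threshold and sample titles are illustrative.
    from difflib import SequenceMatcher

    existing = [
        "Crash when saving a project with unicode filename",
        "Export button disabled after cancelling a dialog",
        "Login times out behind corporate proxy",
    ]

    def likely_duplicates(new_title: str, titles, threshold: float = 0.6):
        """Return (title, score) pairs whose similarity to new_title exceeds threshold."""
        scored = ((t, SequenceMatcher(None, new_title.lower(), t.lower()).ratio())
                  for t in titles)
        return sorted((p for p in scored if p[1] >= threshold),
                      key=lambda p: p[1], reverse=True)

    print(likely_duplicates("App crashes saving project with unicode file name", existing))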
Programming

Ask Slashdot: Is There a Way To Write Working Code By Drawing Flow Charts? 264

Slashdot reader dryriver writes: There appear to be two main ways to write code today. One is with text-based languages, ranging from BASIC to Python to C++. The other is to use a flow-based or dataflow visual programming language, where you connect boxes or nodes with lines. What I have never (personally) come across is a way to program by drawing classical vertical (top-to-bottom) flow charts. Is there a programming environment that lets you do this...?

There are software tools that can turn, say, C code into a visual flow chart representation of said C code. Is there any way to do the opposite -- draw a flowchart, and have that flowchart turn into working C code?
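Absent an off-the-shelf tool, the underlying idea is simple enough to sketch: represent the flowchart as a graph of nodes and walk it from the start box downward. The toy interpreter below (the node mini-language and the example chart are invented for illustration) executes such a chart directly; a real tool could walk the same graph and emit C source text instead:

    # Toy flowchart interpreter: a top-to-bottom chart expressed as a node graph,
    # executed directly. The node types and example chart are illustrative only.
    flowchart = {
        "start": {"type": "start", "next": "init"},
        "init":  {"type": "process", "next": "check",
                  "action": lambda s: s.update(n=1, total=0)},
        "check": {"type": "decision", "yes": "add", "no": "end",
                  "test": lambda s: s["n"] <= 5},
        "add":   {"type": "process", "next": "check",
                  "action": lambda s: s.update(total=s["total"] + s["n"], n=s["n"] + 1)},
        "end":   {"type": "end"},
    }

    def run(chart, entry="start"):
        state, node = {}, entry
        while True:
            spec = chart[node]
            if spec["type"] == "end":
                return state
            if spec["type"] == "decision":
                node = spec["yes"] if spec["test"](state) else spec["no"]
            else:                                  # start / process boxes
                if "action" in spec:
                    spec["action"](state)
                node = spec["next"]

    print(run(flowchart))  # -> {'n': 6, 'total': 15}, i.e. the sum 1+2+3+4+5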

Leave your best answers in the comments.
