
Comment Re:easily made up in peripherals. (Score 1) 490

Our company has definitely experienced a bit of this .... BUT this is also where I.T. needs to step in and set some ground-rules, if the budget is limited.

For example? I got tired of all the expenditures on (IMO inferior) "Magic Mice". So I did some research and found a couple of solid, reliable Bluetooth mice that could be used instead at far less cost. There's a Microsoft model that only costs about $30-35, for example. It doesn't support "gestures" by rubbing the top of it, but that's just really NOT necessary for any use-case I've seen a user come up with. It does have a scroll wheel which doubles as an assignable 3rd button, plus 2 regular buttons, and it's very conservative on battery usage too.

If someone wants that "Magic Trackpad" instead of a mouse? I don't have a problem with buying that one, IF they honestly can be more productive with it. This is no different than in the Windows world though, IMO. A trackpad will cost a premium for Windows too. It's a different way to interface with the machine, and if you really do a task regularly where the gesture support is helpful? One of these beats trying to do the gestures on top of a regular "Magic Mouse" any day.

We had to fight with some of our Creative professionals about the displays, as well. They were just SO certain that those nearly $1,000-each Thunderbolt Cinema displays were better than anything else. We cracked down on that and started physically removing a couple of them that they'd gone around I.T. and purchased for their department, substituting a matched PAIR of HP 24" IPS panel displays with anti-glare coated screens. After a few days working in a dual-display environment with glare-free panels offering the same or better color-calibration capabilities and similar resolutions? They really had no argument for wanting the Apple displays back. (And to be fair, those displays were kind of slick IF you had the right make/model of MacBook Pro laptop to pair with them, since they acted like a port replicator at the same time. But they replicated ports like FireWire 800, which are getting phased out in current machines!)

Comment Not quite a fair statement, but ..... (Score 1) 490

Yes, Macs are more friendly to users who aren't willing to learn more than the very basics of how to navigate a computer. They're far less likely to succumb to random malware/spyware/virus threats than Windows machines -- and in my experience as a regular Mac user, when they DO get infected? They tend to clean up more quickly and painlessly too. (E.g., they make a Mac version of Malwarebytes now, and it generally knows how to fully clean just about any of the Mac malware created to date. It runs quickly, does its thing, and after a reboot, chances are high that you're back to normal. There simply aren't the challenges the Windows world faces, with people constantly modifying existing malware into new variants that hide in different sub-folders, do different kinds of damage, etc.)

On the other hand? There's no good reason to claim you can somehow do more on a Windows PC, and/or a Mac is only appropriate for the most clueless of users.

You may have a personal hatred for Apple and possibly even for the design of Mac OS X ... but quite a few "power users" use them all day long, every day, to get real work done.

I work for a company that has close to a 50/50 split of Macs and Windows machines in use (we let employees choose which they prefer in most cases). It's really not a problem managing the mixed environment, other than a bit of extra work creating 2 sets of instructions with different screen-captures for Mac and Windows when you want to document something. As it stands today? The Mac actually makes it easier than a Windows PC to get a VPN connection going back to the office network. We use Cisco Meraki hardware, which doesn't provide any special "extra friendly" VPN client. You're just supposed to properly configure what's built into the OS. On the Mac side, that pretty much "just works". In Windows, there are still annoying quirks in how the OS resolves DNS across multiple network adapters that can create "gotchas" -- even when you use Windows 10. (For example, if you don't manually edit the "metric" values for each adapter, ensuring the VPN adapter has a lower metric than the rest -- which Windows treats as HIGHER priority -- like 15? Win 10 will stupidly try to send out DNS lookup requests over ALL the available adapters, instead of only going through the VPN tunnel when it's up.)
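For reference, the manual workaround I'm describing looks something like this from an elevated PowerShell prompt. Just a sketch -- the alias "Office VPN" is a placeholder for whatever your VPN connection is actually named, and remember a LOWER metric number means HIGHER priority in Windows:

```powershell
# List the adapters and their current (usually auto-assigned) metrics:
Get-NetIPInterface | Sort-Object InterfaceMetric |
    Format-Table InterfaceAlias, AddressFamily, InterfaceMetric

# Pin the VPN adapter to a low metric (e.g. 15) so Windows prefers
# it -- and its DNS servers -- whenever the tunnel is up:
Set-NetIPInterface -InterfaceAlias "Office VPN" -InterfaceMetric 15
```

You can do the same thing through the adapter's TCP/IPv4 properties GUI by unchecking "Automatic metric" and entering the value by hand.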

And especially with the new update mechanisms Microsoft now uses in Win 10? It's just creating a lot of needless havoc. For example, we have a number of Surface Pro 4s out in the field, and because Microsoft insists on pushing updates through at some scheduled time (defaulting to 3AM or something like that), it will leave the tablets in odd states at times. People leave their system on to go into "sleep" mode overnight, and when they come back in the morning? They may have a solid black screen and a seemingly unresponsive computer. Bingo... another trouble ticket gets put in, "high priority", for I.T. to troubleshoot. In reality, it can be something as simple as the Intel video driver getting an update pushed to it that needed a full reboot to start working correctly again. This is NOT something I've ever had issues with on the Mac side.

Comment AI -- FAR more hype than substance (Score 2, Interesting) 206

I'd argue that as far as I've seen, practically every single project or experiment labeled "AI" is really just fake intelligence.

In other words, you've cobbled together a mechanism so a query phrased in ordinary human language (spoken or written/typed) can be parsed out and searched in a useful way through extensive databases of information, with a sensible result spit back -- again, in a manner that mimics a human's way of communicating the result.

This is a pretty cool thing, as we've seen by how handy the "personal assistants" like Cortana or Siri can be on our smartphones.

But IMO, Hawking is talking about achieving a way to simulate the way a human brain actually thinks. That's something we're NOWHERE near doing successfully, and I'm not even sure it's realistic to pretend we could with today's computer technology.

For starters, it's becoming more and more clear that humans don't really file away tons of information in our brains like a computer does on a hard drive in a database. A big part of what we "remember" goes to "short term memory", meaning we'll try to keep it in our heads for a little while -- but as soon as it becomes something we don't need to recall again for a period of time, it starts fading away and eventually is forgotten. At the same time though? Our brain seems to make lots of other connections to these things. (Even though you forgot, say, an old phone number of a friend you haven't called in years? When you see the number again, you may recognize it from a list of other random phone numbers and remember that's the one you USED to remember. Computers don't do that.)

The entire concept of being "reminded" of something is pretty foreign to how binary computers compute... They either have or don't have information. They don't struggle to remember and occasionally recall things, and/or realize they used to know them when reminded.

Comment re: inkjet printers (Score 1) 307

Yeah, good point.... I admit I used a bad example with the printers. To be honest though, it's been quite a while since I bought one. I still own and use several older ones here, and in at least one of those cases, it actually did include a USB cable with it. But sure, the cost of a cable is relatively minimal and if they're going to make you buy it separately anyway -- no big deal to go with a USB-C type.

But flash drives are going to be a problem, as are plenty of specialty cables. (E.g., I have a USB-to-OBD-II programming cable that's needed to download custom tunes to one of my cars.) I guess you can use any of this stuff with USB-C to USB adapters or a hub that converts the connection -- but again, that's extra stuff you have to carry with you on a laptop, so not really an attractive option.

Comment Not happy at all for a "Pro" laptop from Apple.... (Score 4, Interesting) 307

I've been a long time Apple supporter, even going so far as to pay all the $$$'s for one of the late 2013 "trash can" Mac Pro workstations, shortly after it was released. (I did that only because I owned both a 2006 and 2008 Mac Pro tower before it, and both were excellent computers that I got years of daily use out of -- paying for themselves several times over with the work and entertainment value I got out of them. I figured I'd invest in the new direction Apple was taking things, with faith they'd make sense of what seemed at first to be kind of a step backwards in design and functionality.)

Well, unfortunately, what I'm seeing is a trend away from Apple catering at all to "power users" or "computer enthusiasts". Under Steve Jobs, at least their push towards minimalist styling/design was still well-balanced with giving the user what they really needed to get things done. (E.g., when Apple declared the 3.5" floppy was dead and removed it? The rest of the Windows PC world thought that was crazy. Yet the advent of Iomega Zip disks, SyQuest cartridges, dirt-cheap CD-R media, flash drives, SD and CF cards and more proved Apple was right. They were just pushing people a little further towards that "cutting edge" of tech, instead of sitting complacent in the middle of the "tried and true, but fading in usability" zone of technology. And when Apple decided to quit including optical drives in any of their systems? Again, some people threw fits, but it's ultimately proved to be the sensible solution. External CD/DVD/Blu-Ray players and recorders are cheap and easy to plug in if/when needed, and they don't bulk up or weigh down a computer when you DON'T intend to use one. It also means that when they break down, which they do fairly often with all their mechanical parts inside, they're easier to replace.)

With Thunderbolt? I feel like Apple tried, once again, to "skate to where they thought the puck was going to be" instead of to where it was. But that time, perhaps they took a chance and weren't quite right. Nonetheless, it wasn't really a big problem for users because it was only there in addition to plenty of other ports. The ability for Apple's Thunderbolt port to double as a "Mini DisplayPort" connector ensured people would use it with a dongle to attach extra monitors even if they never used it for anything else. And on higher end systems like my Mac Pro? It's actually quite useful since you pretty much need some kind of external drive enclosure to have a decent amount of storage space directly attached to the machine. There are a number of good options for multi-drive cabinets with Thunderbolt connections, and it provides great throughput without bottlenecking a USB bus.

But now, I feel like options are getting deleted just because Apple would prefer to have fewer configuration options to stock in their lineup, or because they're pushing change just for the sake of being different. (That whole elimination of matte vs. glossy displays is a great example, even if it still happened under Steve Jobs' watch. There was clearly a LOT of demand for anti-glare displays, yet Apple simply ignored it and told people, "Tough luck. We think you'll love our product enough to buy it anyway, so we don't care.")

This move to USB-C? I think the new standard is just fine for netbooks or "Ultraportables" where people are primarily concerned about how light and thin the machine is, and probably don't WANT to connect very much up to it. But it definitely has no business in a Macbook PRO laptop being sold any time this year ... not unless it's just there in addition to a couple of regular USB 3.0 ports. Otherwise, you're ignoring a universal standard that has no signs of dying yet. Go shop for a new inkjet printer and tell me how many have USB-C connections on them vs. traditional USB right now. Same for any digital cameras with connection cables.

Secondarily, I agree that this change means eliminating a connector (MagSafe) that really does offer a great feature that competing laptops never had. IMO, if Apple really wants to move things forward again, they should create a laptop that charges inductively from a charging transmitter you can place anywhere in the same room with it. Moving to charging via USB-C is a step sideways, if not backwards.

Comment It's also about one's mentality though .... (Score 1) 403

When you talk to some of these people who work full time for low wages, yet buy something like a high-end automobile on credit? You discover something interesting. Most of the time, it's not about them being so unable to do basic math that they don't realize they're "living above their means".

Rather, they're taking the attitude of, "Screw it.... If some lender is willing to let me get this, why not do it? Then I can drive something around I'm proud to be seen in and enjoy driving. If something happens and I can't make the monthly payment anymore? Oh well... let them come take it back from me. At least I got to enjoy using it while I had it."

In other words, they'd see YOU as the sucker for working as hard as they do, and still settling for driving around some 10 year old Toyota. I mean, YOU'RE the one playing into the hands of the bankers and the "system" -- all worried about hurting your credit score, instead of realizing that in the worst case scenario, you can just file bankruptcy, wipe away all the debts while hanging onto most of what you amassed up until then. Wait for 7 years and you're right back to where you were before with those scores and levels of "credit risk".

Comment Re:I like working. People are different. (Score 1) 403

Honestly, I think one of the sad things is that relatively few companies manage to make the work environment pleasant enough so people actually like (or at least don't dislike) working.

There are always going to be jobs to do that practically nobody enjoys.... But that's also part of the beauty of Capitalism. There will be people who do them anyway because they aren't skilled enough to move up to something better, and that may motivate them to learn those new skills. Others will be OK doing work they dislike because, for them, it's still acceptable to spend that portion of their lives on it so they have the income to do the things they DO like when they're not working.

But in general, I think MOST jobs can be made much more pleasant than they are. This is one of the things good management can help accomplish. It's often a matter of changing around a few rules to allow some of the "little things".... Let people listen to music while working, maybe? Or relax the dress code so jeans are acceptable, if that makes some of your workers happier. Organize small blocks of social time, perhaps? (One of my jobs was in a tall building, so they worked something out with the building management to let us have the rooftop area once a month to do a rooftop lunch.) Personally, I've never accepted a job for any length of time at a place where I hated going in. Even if the only thing a place had going for it was a group of co-workers I liked talking to, it had to offer at least something like that of value. Otherwise, no ... just quit and find a place that's a better fit for you.

Comment Re:Misleading results (Score 1) 403

Perhaps.... but at least in your example of "safety equipment and adaptations for people with disabilities" -- I'd say those are pretty well covered here in the U.S. today. In fact, some would say we overdo the "catering to folks with disabilities" with some of our legislation. (EG. Strict rules about how many handicapped parking spots you have to make available as a percentage of the total means in some establishments, you have row after row of them, unnecessarily wasting space and going unused.) And for some small businesses, they're forced to pay out large sums of money retrofitting older buildings to be wheelchair accessible as soon as they try to do any kind of update to the property that requires an inspection or permit -- even if they have no wheelchair bound employees at all, and no need for customers to access those parts of the building.

I'm not sure what types of safety-related things you imagine the U.K. has in place that the U.S. doesn't, which they'd be eager to remove to be more competitive, either. But from everything I've seen, safety is taken fairly seriously in the U.S. Organizations like OSHA impose all sorts of safety requirements, and even at the level of a person buying a used home, the homeowner's insurance often does a drive-by inspection and demands changes such as putting up railings on any outside steps, or else you risk getting your policy cancelled. The days of companies not caring at all about worker safety pretty much went out with the initial rise of the unions to power -- leading to those demands getting enshrined into federal law.

Comment re: quality applications (Score 1) 889

That's one argument you can make ... but my point is, this particular college also offered an opportunity where you get a full 6 weeks of vacation per year, starting with your first year of employment, and work weeks are 35 hours, not 40, despite receiving all the same health insurance and related benefits of any 40 hour per week job. The salary is definitely not measurably lower than what anyone else offers for the same type of work, given those facts.

If computer support is considered such a "low-level" type of work today (as some people on here claim), then surely it's within the abilities of some of these folks currently upset that their retail or fast food job won't pay them $15/hr. or more? Study to get an A+ certification and show some enthusiasm for wanting to work someplace, and you're essentially qualified.

But no... I guess it's easier to bellyache and claim we all need a basic income paid to us for doing nothing instead?

Comment UBI? Often discussed but premature, IMO ..... (Score 2, Interesting) 889

As a libertarian, many people expect I'm going to be completely against the concept of a UBI. However, that's not really the case. What I *am* against is the premature pushing of it on people in a still-functional Capitalist society, which amounts to an appeal to convert to Socialism.

The UBI makes complete sense as a way to handle the economy in a POST Capitalist world, which we're nowhere near ready to transition to. (Just because you have some fear-mongering about specific industries like trucking going away doesn't mean "all the jobs are gone".)

From what I've observed on the hiring end of the equation? There are actually a lot of decent-paying, respectable jobs out there that go unfilled for months because the quality of the applicants is pretty terrible. My wife recently helped interview over a dozen people for a computer support job at a local college, and she showed me the "best of the bunch" of 40 or 50 resumes they received. We were laughing at how bad several were, including misspellings and people who had NO clue how to sell themselves as having any useful skills.

When they finally did interview 4 or 5 people? One of them showed up an hour late. They agreed to reschedule him and give him another chance, only because he had an excuse about a traffic accident on the highway keeping him from making it on time. When he was due to come back, he waited until 10 minutes before the scheduled time to tell them he wasn't interested any longer. Another candidate was a woman who actually sounded like she had good credentials on paper and they were excited to possibly offer her a position, but she was so "ho hum" about the whole interview, they decided to move on. She not only made no effort to dress nicely for it, but when asked questions about what she did in her previous jobs, etc. -- she just gave really brief answers, acted like she was bored, and didn't do a thing to impress anyone.

I see evidence of a similar mindset in other areas too, including this uproar over a $15 minimum wage.... In reality? You should really be able to eliminate the "minimum wage" completely and it would make no real difference. Why? Because first of all, there's no one number anybody can quote you that *really* makes sense as THE proper starting wage to pay people that's "fair" instead of "unfair". Depending on where you live in the country, the cost of living is radically different, for starters.

A very small percentage of people in America actually work for the mandated minimum wage, and when they do? A big percentage of THOSE are people who earn tips - meaning it's almost not even fair to count them in those totals to begin with. What you wind up left with are a lot of people who don't really need a "living wage" in the first place. (For example, many of the mentally or physically handicapped people are already receiving SSI benefits, and can't earn much of their own income or they lose those. Yet they want to feel like productive members of society and get out of the house. So they'll accept very low paying jobs, doing such things as putting advertisements in envelopes. They don't really WANT a higher wage because it'd put them in a much worse situation overall than what they get without it.)

But people keep pushing for this with the mindset that by boosting the minimum by another $X per hour, that translates to "across the board" increases of about the same amount. And that, in turn, means they can do some minimally skilled job of limited real value to an employer but receive the type of pay you should really only get for doing something much more valuable to society. I don't believe that really works except in the short-term, before the overall economy has time to adapt to the changes (inflation).

Comment re: Rob Zombie (Score 1) 550

IMO, he gave a better answer/explanation for his request than many musicians do who have problems with phones.

Still? That whole attitude rubs me the wrong way. I'm from the generation who listened to White Zombie when it was new, and part of what gave that music its character was all the dubbed-in clips of sound f/x and people talking from old horror movies. In other words, he has the re-recording of pieces of other artists' work to thank for making his own music better.

But now, he takes issue with people recording his own performance.

Rock seems like it's dead to me, NOT because of people wanting to record parts of concerts they attend -- but because the core audience is older. I went to see Disturbed and Breaking Benjamin in concert recently. There just weren't that many of the energetic, partying college students and 20-somethings in the audience. You had far more middle-aged or older folks who got more excited about Disturbed's remake of Simon & Garfunkel's "The Sound of Silence" than anything else. As a 40-something myself, I'm not going to scream my lungs out and go crazy jumping around at a concert anymore. Not happening when it means it'll impact my ability to do my job the next morning or meet other commitments. I think many of the people going to ROCK concerts today are in a similar mindset. We still love the music and want to experience it live, but we're happy to sit on the lawn drinking a beer and maybe eating a slice of pizza or a burger from the concession stand while we take it all in. If you, as a performer, need the whole audience going crazy to validate what you're doing? I can understand that, but that's going to be an ongoing challenge when you perform a type of music that appeals to an audience that's maturing and aging.

Comment Not a fan of this stuff .... (Score 1) 550

Personally? I support the right of musicians, comedians, and other performers to make these sorts of demands of their audience. As soon as you buy a ticket to any show, you've agreed to all sorts of contractual obligations (as printed in the fine print on the back that few people bother to read). So this is really just one more requirement to add to the list of things you can't do (like bringing in coolers).

BUT, I'd probably refuse to pay for a show with this rule in place. I know it drives some people crazy seeing others trying to take photos or video of a live performance with their phones. But in this day and age? That's part of the experience people are paying for. I know when my wife and I went to see Van Halen last year, for example? We both grabbed video recordings of Eddie doing his guitar solo. As far as I'm concerned, that's one of those moments of "rock history" worth preserving. How many more years of monster guitar playing does the guy have left in him? (And for that matter, how many more times will Van Halen perform live with David Lee Roth?) For what we paid to see it, I feel like getting to take home a little piece of the concert to replay later for friends isn't too much to ask.

Yes, it's stupid trying to record a whole show. All you're going to do is waste the money you spent to see it live so you have poor quality audio and relatively poor video of it that nobody will ever sit through and watch again. But making me put my phone in a locked bag until it's all over? That's a bit much.

Comment That's FUD .... (Score 1) 365

In most instances where you have kids "jumping out in front of cars", you're talking about parking lots or side roads where the kids didn't look both ways before trying to cross. It's not something that should really ever happen on a highway or interstate, where you've usually got fences, a thick tree line, or other barriers blocking easy access to the road on foot.

The car will probably try to slow down as much as possible, if not come to a complete stop. At most, it would bump the kid(s) but not seriously injure them. Perhaps it would even swerve to avoid them while slowing down, since it would know it wasn't going to flip the vehicle at that relatively slow speed.

Comment I think these predictions are questionable .... (Score 5, Interesting) 272

First off, the person who commented that it's "predictable when newer isn't better" is correct. Right now, I.T. technology is stagnant. IMO, that's not a bad thing either. What's happening is, just as many people are using computers and electronic devices as ever -- but the market has matured. There's a very low level of "techno lust" for the latest and greatest, because truthfully, what's already readily available is good enough for everything people want to do with their machines right now.

There's finally some sanity in corporate America, where technology is getting replaced on a schedule that reflects the time-frame it takes for the old gear to physically wear out, instead of demands for 2-3 year replacement cycles just because "the new thing is already old, since it can't run the cool new OS and new versions of apps X and Y".

And really, HP's situation is a different/unique one in the current round of layoffs. HP split their company into two, not long ago, realizing they had to jettison the "boat anchor" of printing/imaging so it didn't weigh down profits of the rest of the business. The HP doing all the layoffs now is the one holding the bag selling printers and scanners. They've also tried to acquire other printer manufacturers like Fujitsu, but ultimately, they're selling a product that's gone from a "must have" companion purchase with every new PC to a niche need. Their investment in 3D printing didn't appear to pan out either. (I think that's turned out to be a niche hobbyist interest, since it's still a very slow process to print a 3D object, the size of said object is really limited, AND it won't realize its full potential until there are much larger collections of downloadable projects to print. If all the major manufacturers of appliances offered a way to print repair parts on their "support" web sites, for example? Then you'd see 3D printing really take off.)

But claiming the people who get laid off have no future in I.T.? That's FUD, plain and simple. The trend to cloudify everything is still strong, but I've worked in the field long enough to say I'm confident it's going to trend back the other direction in the next decade or so. If you just look at the ridiculous number of data breaches in the news in the last year or two, you quickly see the problems with concentrating huge amounts of customer data in one place. But hacking aside, cloud services amount to giving up direct control over big chunks of your business operations. When one of these services has an outage, you can't do anything but sit around and hope they actually provide some status updates to pass along to your users and to management. Everyone's first question is "When will it be back up?" and you're stuck shrugging and saying, "Beats me! We're at the mercy of the provider." When this happens enough times, companies start demanding more accountability and control. And you tend to get locked in to these services too. So if they raise prices or change pricing structures, you can't do much besides pay the new, higher bill (or go through a huge, unplanned project to migrate all the data elsewhere and retrain everyone on how to access it). I'm not sure how a cloud provider decides someone with many years of general I.T. experience -- including administering servers, troubleshooting networks and supporting staff -- wouldn't have skills applicable to working for their business anyway. But cloud is an overrated buzzword, either way.

Comment Wow .... Just ..... wow .... (Score 1) 78

I was just complaining about an earlier news item posted on Slashdot about a Facebook corporate chat client they were apparently going to sell as competition to apps like Slack. And now THIS?

I'm even a regular Facebook user and I have to say I hope this thing crashes and burns!

There are SO many options out there to handle "internal social media". Where I work, we've been using Salesforce "Chatter" for this stuff for years.
WHY would anyone decide Facebook is the optimal thing to re-purpose as an Intranet site of sorts, for employees to share content?

Again, they're very late to the party and offering something that nobody should really need - hoping your company buys it anyway based on the familiar branding alone.
