Comment I wear a watch, so obviously (Score 1) 381

I am interested in watches.

Whether or not a smart watch is worth it is an open question. If one can give me something I actually think I need, then sure. I've outlined a list in comments on previous stories, for the quasi-trolls who were about to lash into me for being so general.

But I wear an automatic mechanical beater right now—specifically because it's virtually indestructible, represents only a minor investment (and thus financial risk), and requires no maintenance, attention, or battery-swapping. It's accurate to about 2 minutes per year, which means that about once a year I tune the time on it.

Most of the stuff that smartwatches are currently said to do I either don't care about (fitness tracking, health monitoring) or already do with a smartphone with far less hassle (bigger screen, more natural UI), so it'll be a stretch. But I'm open.

Comment Um, this is how it's supposed to work. (Score 3) 109

Journals aren't arbiters of Truth (capital T), they're just what they say they are: JOURNALS of the ongoing work of science.

Someone records that they have done X in a journal. Because said journal is available to other scientists, other scientists get to try to make use of the same notes/information/processes. If they are able to do so, they journal it as well. Get enough mentions in enough journals that something works, and we can begin to presume that it does.

If only one mention in one journal is ever made, then it is just another record in another journal of another thing that one scientist (or group of scientists) claim to have done.

Peer review is just there to keep journals from expanding to the point that there is too much for anyone to keep track of or read. It is emphatically NOT the place at which the factuality or truthfulness of notes/information/processes is established once and for all. That happens AFTER publication, as other scientists get ahold of things and put them through their paces.

Seriously, this is all exactly how it is supposed to work. I have no idea why there is such hoopla about this. There is nothing to see here. One group journaled something, other groups couldn't replicate it, they will no doubt reference this failure in future articles, and "what happened" is recorded out in the open for all of science. That expands our pool of knowledge, both of what consistently works in many situations and of what someone claims worked once in one situation but appears either not to work in the general case or to require more understanding and research.

Again, there is nothing to see here. Let's move on.

Comment Regardless of the accuracy of the numbers, (Score 1) 190

this would seem to be moot to me. Humans have only been here for the briefest of very *recent* moments, but we do have a particular interest in keeping earth habitable for *human* life.

Assuming your numbers are correct, it still doesn't do us any good to say that gosh, a few million years ago there was a lot more carbon dioxide, if for the purposes of *human* life a particular (and lower) level is necessary.

The goal is for us, not for the earth itself, to survive.

Comment Not tell time. (Score 1) 427

1) Monitor and keep a continuous chart of blood glucose, sleep cycles, blood pressure, pulse rate, and blood oxygenation. I don't even know if the tech is viable for all of these, but they'd interest me.
2) Be part of a payments system that actually gets traction out there. Let me import all of my cards of various kinds and then provide them wirelessly to others without having to pull out a card (and/or a phone with a specialized app).
3) Same thing, but hold all of my tickets for entry into events.
4) Connect to a voice-to-text service to enable personal logging/journal-keeping just by talking at it.
5) Find a way to operate clearly and reliably using gestures and voice recognition rather than touch input when desired.
6) Have built-in GPS and voice navigation.
7) Have a built-in high-resolution camera to enable convenient visual capture of information.
8) Do all of this in a cloud-based manner so that everything that the watch did/tracked was available from all of my other tech devices.
9) Have a between-recharges time measured in weeks.

I don't know, it would have to be pretty freaking fabulous. But there are some basic things that I *don't* care if a smartwatch does, and those are probably more telling. I absolutely do not care about doing these things on a smartwatch:

1) Calls
2) Web
3) Email
4) Facebook
5) SMS
6) Linking it to my phone via Bluetooth
7) Telling time

Number 6 in particular is a non-starter for me. Battery life on phones is already too short. And phones are the devices I use for web, email, and other informational tasks on the go because they (not a smartwatch) have screens suitable for reading and editing. I need them to last as long as possible, and I have no interest in duplicating their functions on a smartwatch. So I refuse to keep Bluetooth enabled on my phone all the time just to get some additional "watch" features.

It needs to be a "standalone" device in the sense that no other device is needed for it to operate normally, but a completely cloud-integrated device in the sense that I can access everything it does, and it can access everything I do on my other devices, over the network.

Number 7 is also pointedly interesting. I don't care if something on my wrist can tell time. Social "time" as a concept is more ambient than ever. Everything has a clock on it. Your computer. Your phone. Your thermostat. Your radio. Your car dash. Every ticket machine of every kind, from movies to transit to events. Public spaces and the sides of buildings and billboards and retail shop signs. I don't look at my wrist or my phone to know what time it is. I do a quick visual 360 and, in general, I find what I'm looking for wherever I happen to be. A "time-telling device" is frankly a bit 19th/early-20th century at this point.

Comment Ugh. I hate it when (Score 1) 284

the very ruthless and very rich tell us that money doesn't solve problems.

Well, you already got everyone else's money after being absolutely driven to do so for decades. Now you tell us that having tons of other people's money is no good anyway when it comes to the really important stuff. Hmm...

So rather than sitting on the pile of billions you've tied up after getting it from other people, just give it back if you've now learned that it didn't do all that much good in the first place. No? Well then, you're either a liar or a hypocrite.

Comment Seriously? (Yes, seriously.) (Score 4, Insightful) 466

I do this all the time in my line of work. Someone hands us a dump of old collected data: 2 million lines in a messy CSV file with dozens of columns. We benefit if we can make use of it, but we have to clean it up first.

It's a one-time job: script something up to clean the data and reformat it the way we need it so we can make use of it ourselves. Then toss the script away.

There's a big difference between "software engineering" and "coding." Coding skill is useful in all kinds of edge cases that mainstream applications don't handle, and when you're just looking to finish an edge-case task, you don't need to think about correctness or maintainability. You certainly don't want to pay a contractor $$$ to cook something up if it's only three dozen lines and will only be used for two weeks. For those cases, the "who cares if it works six months from now, we just need it to do something specific for us this afternoon" cases, you just need it to work once or twice, during the current project, as a part of the *process* of getting it done, not as an *asset* to monetize over time.
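
To give a sense of how little ceremony these jobs take, here's a rough sketch in Python of the kind of throwaway cleanup script I mean. The file and column names (dump.csv, customer_id, amount, date) are invented for illustration; the real ones would come from whatever mess lands on your desk.

    # Throwaway cleanup sketch: keep a few columns from a messy CSV,
    # normalize them lightly, and write a clean file. Deliberately not
    # "engineered" -- it only has to work for this one dataset, this week.
    import csv

    KEEP = ["customer_id", "amount", "date"]  # hypothetical column names

    with open("dump.csv", newline="", encoding="utf-8", errors="replace") as src, \
         open("clean.csv", "w", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=KEEP)
        writer.writeheader()
        for row in reader:
            # Drop rows missing the fields we actually need.
            if not row.get("customer_id") or not row.get("amount"):
                continue
            writer.writerow({
                "customer_id": row["customer_id"].strip(),
                # Strip currency symbols and thousands separators so the
                # number parses cleanly downstream.
                "amount": row["amount"].replace("$", "").replace(",", "").strip(),
                "date": (row.get("date") or "").strip(),
            })

Twenty minutes of work, used for a couple of weeks, then deleted.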

I totally get where he/she is coming from.

Comment And the benefit is... (Score 1) 209

what, exactly? Calendars are synthetic tools used to synchronize human activity. That is their one and only value. They do not exist in nature; nature synchronizes with itself without our intervention.

But we need a shared, common way to refer to particular dates in time so that we can refer to records and events retrospectively and arrange for future events prospectively—together, in a coordinated fashion.

So your proposal replaces one time-measurement system on which everyone is more or less on the same page, in which the representation of a particular moment in time is broadly accepted across a large swath of humanity...with another system in which, across that very same swath of humanity, a moment in time can be represented in multiple ways.

This would seem to reduce, not increase, the value of a calendar for all practical intents and purposes.

This proposal is most likely to catch on (well, let's be honest, it's never likely to catch on anywhere) in advanced industrial/post-industrial societies, where the resources and level of education needed to make use of it are in place. So you're proposing to introduce extensive new ambiguity in timekeeping into exactly the population that currently has the least ambiguity in timekeeping.

Again, seems ass-backward to me.

Submission + - Has Apple lost its cool? eBay shoppers seem to suggest the opposite (terapeak.com)

An anonymous reader writes: There's been a lot of buzz recently about Apple having lost its "coolness factor," but a new report from market research company Terapeak—one based on hard data—says that on eBay, in four major categories (tablets, phones, desktops, and laptops), Apple goods are currently doing as much business as all other manufacturers combined. That's shocking if true. Have the assertions of Steve Jobs' importance been greatly overstated, or is the Apple crash yet to come?

Submission + - The Science Behind A Time Lapse Night Sky

StartsWithABang writes: Recently, time-lapse photographer Thomas O'Brien put together his first video of the night sky, focusing on meteors and drawing on nine years of footage to do it. But the majority of what you're seeing in that video isn't meteors at all, which presents an amazing opportunity to showcase what you actually see (and why) in the night sky. Enjoy the science behind a time-lapse night sky.

Comment Re:MS should focus on winnable battles (Score 1) 379

Cute. But check out the Gartner numbers (amongst others) for the last few years.

Overall PC shipments were down 12% from 4Q12 to 4Q13, while Mac shipments were up nearly 30% over the same period. And while year-over-year PC shipments have been falling since 2011, Mac shipments have seen steady year-over-year growth for a decade.

In my corner of the SaaS world, it's clearly a Mac game now. I'm sure there are many areas where this is not the case. But the trend numbers are not good for Microsoft at all. Combined with the reception of Windows 8 (not just here but across the trade press and online generally) and the comparison of tablet unit shipments vs. PC unit shipments, I think MS is better off focusing on which MS products can be made mission-critical on other platforms, not on the continued dominance of their own platform.

Comment MS should focus on winnable battles (Score 3, Interesting) 379

The Windows battle is largely over, and they have lost.

On mobile devices, which are the most ubiquitous form of computing on the planet today, they are effectively out of the game for this round. Their only shot there is to become the next big innovator launching the next paradigm of computing—something that MS has never been able to do before.

In productivity computing, a decade ago it was still a Windows world, but I've seen shop after shop effectively go Mac in recent years. First the door is opened—and once employees and/or departments are able to opt for Macs to do their work, the balance goes from 90/10 Windows to 90/10 Mac in the space of one or two upgrade cycles. Apple significantly outpaced the PC industry overall in unit shipment performance over 2013 (particularly 4Q) and this matches what I'm seeing in business meetings across partnerships—senior reps from four companies are in the room and now the Windows guy is the odd guy out and everybody snickers a little. Or you're in a multi-hour videoconference on GoToMeeting and the one guy that's sharing a Windows screen rather than a Mac screen stands out like a sore thumb. It's the opposite of what you'd see over the '90s and '00s.

But Exchange and Office remain ubiquitous—more and more people in business are using a Mac, but their Mac is invariably outfitted with MS Office (because iWork simply doesn't compare) and their entire business lives are accessed from Outlook. Finding ways to better integrate mobile Android/iOS offerings into their Exchange/Office universe would open a natural space for strong growth and continued dominance in critical business infrastructure. The focus on Windows and hardware is a head-scratcher.

The most worrying thing for Microsoft is that over the last year or so I've started periodically receiving OpenOffice/LibreOffice/Google Docs/Drive word processing and spreadsheet documents. That never, ever happened during my first decade and a half in business (since about 1997), and now, suddenly, I've received about 20 such documents this year from people at five different companies—without anyone mentioning it or even apologizing ("Hope you can open this!").

I don't know if the investment required to make a plausible attempt at reversing Windows' downward slide in market position is worthwhile. I suspect it's far more important for MS to shore up and grow their Exchange/Office business. Nobody is really challenging them yet in this space, but if a viable competitor were to emerge, the forces and trends related to Windows now pull *away* from Microsoft platforms rather than irresistibly toward them.
