
Comment I wear a watch, so obviously (Score 1) 381

I am interested in watches.

Whether or not a smart watch is worth it is an open question. If they can provide me something that I think I need with it, then sure. I've outlined a list in comments on previous stories, for quasi-trolls who were about to lash out at me for being so general.

But I wear an automatic mechanical beater right now—specifically because it's virtually indestructible, represents only a minor investment (and thus financial risk), and requires no maintenance, attention, or battery-swapping. It's accurate to about 2 minutes per year, which means that about once a year I tune the time on it.

Most of the stuff that smartwatches are currently being said to do I either don't care about (fitness tracking, health monitoring) or currently use a smartphone for with far less hassle (bigger screen, more natural UI) so it'll be a stretch. But I'm open.

Comment Um, this is how it's supposed to work. (Score 3) 109

Journals aren't arbiters of Truth (capital T), they're just what they say they are: JOURNALS of the ongoing work of science.

Someone records that they have done X in a journal. Because said journal is available to other scientists, other scientists get to try to make use of the same notes/information/processes. If they are able to do so, they journal it as well. Get enough mentions in enough journals that something works, and we can begin to presume that it does.

If only one mention in one journal is ever made, then it is just another record in another journal of another thing that one scientist (or group of scientists) claim to have done.

Peer review is just to keep journals from expanding to the point that there is too much for anyone to keep track of or read. It is emphatically NOT the place at which the factuality or truthfulness of notes/information/processes are established once and for all. That happens AFTER publication as other scientists get ahold of things and put them through their paces.

Seriously, this is all exactly as it is supposed to work. I have no idea why there is such hoopla about this. There is nothing to see here. One group journaled something, other groups couldn't replicate it, they no doubt will reference this failure in future articles, and "what happened" is recorded out in the open for all of science, thereby expanding our pool of knowledge, both about what consistently works in many situations and of what someone claims has worked once in one situation but appears either not to work in the general case or requires more understanding and research.

Again, there is nothing to see here. Let's move on.

Comment Regardless of the accuracy of the numbers, (Score 1) 190

this would seem to be moot to me. Humans have only been here for the briefest of very *recent* moments, but we do have a particular interest in keeping earth habitable for *human* life.

Assuming your numbers are correct, it still doesn't do us any good to say that gosh, a few million years ago there was a lot more carbon dioxide, if for the purposes of *human* life a particular (and lower) level is necessary.

The goal is for us, not for the earth itself, to survive.

Comment Not tell time. (Score 1) 427

1) Monitor and keep a continuous chart of blood glucose, sleep cycles, blood pressure and pulse rate, and blood oxygenation. I don't even know if the tech is viable for these, but they'd interest me.
2) Be part of a payments system that actually gets traction out there. Let me import all of my cards of various kinds and then provide them wirelessly to others without having to pull out a card (and/or a phone with a specialized app).
3) Same thing, but hold all of my tickets for entry into events.
4) Connect to a voice-to-text service to enable personal logging/journal-keeping just by talking at it.
5) Find a way to operate clearly and reliably using gestures and voice recognition rather than touch input when desired.
6) Have built-in GPS and voice navigation.
7) Have a built-in high-resolution camera to enable convenient visual capture of information.
8) Do all of this in a cloud-based manner so that everything that the watch did/tracked was available from all of my other tech devices.
9) Have a between-recharges time measured in weeks.

I don't know, it would have to be pretty freaking fabulous. But there are some basic things that I *don't* care if a smartwatch does, and those are probably more telling. I absolutely do not care about doing these things on a smartwatch:

1) Calls
2) Web
3) Email
4) Facebook
5) SMS
6) Linking it to my phone via Bluetooth
7) Telling time

Number 6 in particular is a non-starter for me. Battery life on phones is already too short. And phones are the devices that I use for web, email, and other informational tasks on the go because they (not a smartwatch) have the screens suitable for reading/editing. I need them to last as long as possible, and I have no interest in duplicating their functions on a smartwatch. So I refuse to keep Bluetooth enabled on my phone all the time just to get some additional "watch" features.

It needs to be a "standalone" device in the sense of no other devices needed for it to operate normally, but a completely cloud-integrated device in the sense of "but I can access everything it does and it can access everything I do on my other devices over the network."

Number 7 is also pointedly interesting. I don't care if something on my wrist can tell time. Social "time" as a concept is more ambient than ever. Everything has a clock on it. Your computer. Your phone. Your thermostat. Your radio. Your car dash. Every ticket machine of every kind, from movies to transit to events. Public spaces and the sides of buildings and billboards and retail shop signs. I don't look at my wrist or my phone to know what time it is. I do a quick visual 360 and in general, I find what I'm looking for, wherever I happen to be. A "time-telling device" is frankly a bit 19th/early-20th century at this point.

Comment Ugh. I hate it when (Score 1) 284

the very ruthless and very rich tell us that money doesn't solve problems.

Well, you already got everyone else's money after being absolutely driven to do so for decades. Now you tell us that having tons of other peoples' money is no good anyway when it comes to really important stuff. Hmm...

So rather than sit on a pile of billions that you've tied up after getting it from other people, just give it back if you've now learned that it didn't do all that much good in the first place. No? Well then, you're either a liar or a hypocrite.

Comment Seriously? (Yes, seriously.) (Score 4, Insightful) 466

I do this all the time in my line of work. Someone hands us a data dump of 2 million lines in a messy CSV file with dozens of columns of old collected data. We benefit if we can make use of it--but we have to clean it up first.

It's a one-time job--script something up to process the data into cleaner data and format it as we need it to make use of it ourselves. Then, toss the script away.

There's a big difference between "software engineering" and "coding." Coding skill is useful in all kinds of edge cases that mainstream applications don't handle, and when you're just looking to finish an edge-case task, you don't need to think about correctness or maintainability. You certainly don't want to pay a contractor $$$ to cook something up if it's only three dozen lines and will only be used for two weeks. For those cases--the "who cares if it works six months from now, we just need it to do something specific for us this afternoon" cases--you just need it to work once or twice, during the current project, as a part of the *process* of getting it done, not as an *asset* to monetize over time.
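A throwaway cleanup script of the kind described above might look something like this minimal Python sketch (the column names, the tiny inline sample, and the "keep only what we need, drop incomplete rows" rules are all hypothetical stand-ins for whatever a real dump would require):

```python
import csv
import io

def clean_rows(raw_csv, keep_columns):
    """Parse a messy CSV dump, keep only the columns we care about,
    strip stray whitespace, and drop rows missing a required field."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    cleaned = []
    for row in reader:
        slim = {col: (row.get(col) or "").strip() for col in keep_columns}
        if all(slim.values()):  # skip rows where any required field is empty
            cleaned.append(slim)
    return cleaned

# A tiny stand-in for the multi-million-line dump
raw = """id,name,junk,amount
1,  Alice ,x,10
2,,y,20
3,Bob,z,
4, Carol ,w,30
"""

for r in clean_rows(raw, ["id", "name", "amount"]):
    print(r["id"], r["name"], r["amount"])
```

The point is exactly the one made above: no error-handling strategy, no configuration, no tests beyond eyeballing the output. You run it, you get your cleaned data, and you throw the script away.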

I totally get where he/she is coming from.

Comment And the benefit is... (Score 1) 209

what, exactly? Calendars are synthetic tools used to synchronize human activity. That is their one and only value. They do not exist in nature; nature synchronizes with itself without our intervention.

But we need a shared, common way to refer to particular dates in time so that we can refer to records and events retrospectively and arrange for future events prospectively—together, in a coordinated fashion.

So your proposal replaces one time measurement system on which everyone is more or less on the same page, in which the representation of a particular moment in time is broadly accepted across a large swath of humanity...by another system in which across that very same swath of humanity, a moment in time can be represented in multiple ways.

This would seem to reduce, not increase, the value of a calendar for all practical intents and purposes.

This proposal is most likely to catch on (well, let's be honest, it's never likely to catch on at all) in advanced industrial/post-industrial societies, where the resources and level of education to make use of it are in place. So you're proposing to introduce extensive new ambiguity in timekeeping into precisely the population in which there is currently the least ambiguity in timekeeping.

Again, seems ass-backward to me.

Submission + - Has Apple lost its cool? eBay shoppers seem to suggest the opposite (terapeak.com)

An anonymous reader writes: There's been a lot of buzz recently about Apple having lost its "coolness factor," but a new report from market research company Terapeak—one based on hard data—says that on eBay, in four major categories (tablets, phones, desktops, and laptops) Apple goods are doing as much business as all other manufacturers combined right now. That's shocking if true. Have the assertions of Steve Jobs' importance been greatly overstated or is the Apple crash yet to come?

Submission + - The Science Behind A Time Lapse Night Sky

StartsWithABang writes: Recently, time-lapse photographer Thomas O'Brien put together his first video of the night sky, focusing on meteors and using nine years of footage to do it. But the majority of what you're seeing in that video isn't meteors at all, but presents an amazing opportunity to showcase what you actually see (and why) in the night sky. Enjoy the science behind a time lapse night sky.

Comment Re:MS should focus on winnable battles (Score 1) 379

Cute. But check out the Gartner numbers (amongst others) for the last few years.

Overall PC shipments were down 12% from 4Q12-4Q13. Meanwhile, Mac shipments were up over the same period by nearly 30%. While year-over-year PC shipments have been falling since 2011, Mac shipments have seen steady year-over-year growth for a decade.

In my corner of the SaaS world, it's clearly a Mac game now. I'm sure there are many areas where this is not the case. But the trend numbers are not good for Microsoft at all, and when combined with the reception of Windows 8 (not just here but across trade press as well as online generally), and the comparison of tablet unit shipments vs. PC unit shipments, I think MS is better off focusing on what MS products can be made mission-critical on other platforms, not on the continued dominance of their own platform.

Comment MS should focus on winnable battles (Score 3, Interesting) 379

The Windows battle is largely over, and they have lost.

On mobile devices, which are the most ubiquitous form of computing on the planet today, they are effectively out of the game for this round. Their only shot there is to become the next big innovator launching the next paradigm of computing—something that MS has never been able to do before.

In productivity computing, a decade ago it was still a Windows world, but I've seen shop after shop effectively go Mac in recent years. First the door is opened—and once employees and/or departments are able to opt for Macs to do their work, the balance goes from 90/10 Windows to 90/10 Mac in the space of one or two upgrade cycles. Apple significantly outpaced the PC industry overall in unit shipment performance over 2013 (particularly 4Q) and this matches what I'm seeing in business meetings across partnerships—senior reps from four companies are in the room and now the Windows guy is the odd guy out and everybody snickers a little. Or you're in a multi-hour videoconference on GoToMeeting and the one guy that's sharing a Windows screen rather than a Mac screen stands out like a sore thumb. It's the opposite of what you'd see over the '90s and '00s.

But Exchange and Office remain ubiquitous—more and more people in business are using a Mac, but their Mac is invariably outfitted with MS Office (because iWork simply doesn't compare) and their entire business lives are accessed from Outlook. Finding ways to better integrate mobile Android/iOS offerings into their Exchange/Office universe would open a natural space for strong growth and continued dominance in critical business infrastructure. The focus on Windows and hardware is a head-scratcher.

The most worrying thing for Microsoft is that I've started periodically receiving OpenOffice/LibreOffice/Google Docs/Drive word processing and spreadsheet documents over the last year or so. That never, ever happened for the first decade and a half of my life in business (since about 1997) and now, suddenly, I've received about 20 documents like this this year from people at five different companies—without anyone mentioning it or even apologizing ("Hope you can open this!").

I don't know if the investment required to make a plausible attempt at reversing Windows' downward slide in market position is worthwhile. I suspect it's far more important for MS to shore up and grow their Exchange/Office business. Nobody is really challenging them yet in this space, but if a viable competitor were to emerge, the forces and trends related to Windows now pull *away* from Microsoft platforms rather than irresistibly toward them.

Comment Apple and Google provide me with a great deal of (Score 2) 394

freedom. I'd venture to say that I value the kind of freedom that they provide me more highly than I do the freedom from any one free software instance.

If I had to choose between "you never see the source, and have to pay big bucks, but Google continues to work as-is" and "no Google, but you get the source and the ability to control your own devices," I'd go for the first option in a second, because it enables me to do more things—not in theory (in theory, of course, the opposite is true) but certainly in actual everyday practice.

As I've outlined elsewhere in this discussion, but perhaps not so explicitly, the problem with RMS and the FSF is that they care only about one kind of freedom directly: the freedom to control one's own hardware and software.

This freedom is seen as being logically prior to all others. In a bizarre, historical sense, that may be true—if hardware and software had always been completely and entirely locked down, we wouldn't have the computing world that we have today.

At the same time, most people cannot and do not take advantage of this freedom, don't care much about it, and might even have a great deal of trouble imagining what it amounts to.

But the list of things that they are able to do thanks to Apple and Google that they couldn't do without Apple and Google is quite long and quite clear to them.

There's nothing wrong with wanting to protect openness, but FSF discussions always manage to carry this to the logical extreme: you shouldn't use Apple and Google because they may eventually, someday lead to the end of open computing (e.g. the end of Apple and Google). So, even though Apple and Google radically expand the list of choices that you have at every moment of your life relative to not having them, you should forego them and have neither. Not to worry, though—since you have the more fundamental freedom (the freedom to control your software and hardware), you can just remake Apple and Google!

This is not going to fly with the average consumer. They won't be getting a free and open Apple or Google anytime soon—and thus, not using Apple and Google represents a net loss of freedom for them.

The best analogy I can think of is that of dropping someone in the middle of a wilderness with no other people in it and over which no state has control, then flying away and yelling down to them as you depart, "Congratulations! You're the freest person on earth! Enjoy the rest of your life in the wilderness, where nobody will ever control you again! And don't worry—if you get bored or lonely, you can always build civilization anew, this time with More Freedom[TM]!"

For the wilderness explorer that likes a solitary existence (or, say, the RMS-styled software developer), this may all indeed be true. But most people would find this kind of freedom less desirable than, say, the freedom that comes with a management job, a million dollar bank account, and an apartment in a major city.

The wilderness explorer cries out, "But you're not free! You have to go to work! You have to use a bank! You have to pay the rent! You have to pay your taxes! A policeman could write you a ticket for any number of things!"

Everyone else says, "You poor thing—living in the wilderness all alone like that, with no amenities, no friends, and nothing to do!" (Think of RMS in his no-mobile-phone, nothing-but-Emacs, never-seen-another-email-client world.)

Which one is "freedom?" It's a silly question. They both are—or they both aren't. Because freedom isn't an objective quantity.

As I mentioned in another post, society doesn't come to us as an empty field of possibility. It comes to us with conventions and practices that are well-established and well-understood at any moment in time. These open up new possibilities for individual life—that is, in fact the benefit of "society" in the first place, and why we bothered to evolve the capability—it enables us to build New York City, or create an iPhone, or go to the Best Birthday Party Ever[TM].

But each of these conventional practices that is so liberating and desirable for most people also comes with built-in rules and norms. These are not optional, and they are not choices; the rules of participating in society are *what society is made of*. Birthday-party behaviors (smiles, pleasantness, wishes, singing, gift-giving, cake) are *both* rules and also *what the practice is made of*. To have the freedom to go to a birthday party, one must submit to the rules of birthday partying. To refuse to *do birthday party things* is to lose the ability *to birthday party*. The rules and limitations are intrinsic to the experience and its benefits; you cannot sacrifice the one without the other.

Similarly, to have the freedom to enjoy a mobile ecosystem or a powerful desktop environment, one must submit to the conventional and existing rules of practice and integral processes that currently constitute them. Legalities (DRM, copyright, and so on), for better or for worse, are the general formalizations of such rules as these possibilities for behavior exist right now.

Some, like RMS, argue that this isn't freedom, or that it directly leads to unfreedom, and thus, the right answer is to never go to another birthday party (after all, who knows where the incredible rule-making tendencies associated with such functions will end—the tyranny of society and social norms knows no limits; concede to this rule or to that one, and you'll eventually have to concede to another; the more you become dependent upon birthday parties for your well-being and social life, the less you'll be able to escape the accumulation of rules without radically altering your life in negative ways).

Most people, however, are much more interested in simply going to the birthday party than they are in living in the wilderness. They concede that they'll have to sing the song, put on a smile, issue well wishes, buy a gift, and eat some cake as conditions for engaging in the behavior. But they just plain don't mind. It may even be *why they go in the first place*.

Just as I appreciate Google and Amazon surveilling me, since it helps me to find the things that I want to find. It is part of how they work, and I value their ability to do this. It's a quid pro quo that we agree upon. To me, this relationship also represents a kind of freedom, and a more important one than my ability to hack the code (which I haven't done—in the submitting-a-patch or similar way—for at least a decade).
