Comment Consider Your User Base (Score 4, Insightful) 247

Anything you do that adds an additional step to an existing process that "appears" to be working perfectly fine will potentially earn you some enemies. Some of the people most likely to be frustrated by the process may also be in positions of great influence.

A noble cause, but its success depends a lot on the existing culture of your workplace.

Certainly, coming to the table with a well-thought-out argument in favor of this isn't a bad thing.

But if the culture is right, you should be able to bring this up casually with superiors, discuss it with them candidly, and THEN discuss putting together a formal document proposing a solution. If anything, they are better equipped than we are to evaluate the user needs of the workplace and give you ideas on how to pitch this to the rest of the business.

Comment Re:Really? (Score 1) 196

"The level of connectivity to things is what makes the difference."

I already made that distinction when describing the different deployments of IP cameras. They are perfectly capable of this and have been used in this way, utilizing automation protocols.

"IoT is defined as internet connected things talking to eachother, without needing a human or central server to poll them"

Many automation protocols are peer-to-peer and do not require polling. Some IP cameras can be deployed alongside other protocol-compliant devices in this manner. Again, as I said before, IoT is just a broad term for a specific type of deployment of devices that have been around long before IoT.

"IoT is "new" because it is neither a client, nor a server"

If that's what you are saying makes IoT new, then it is not new. There are already non-client/server home automation devices that integrate in a peer-to-peer fashion using home automation protocols.
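
To make that concrete, here is a minimal sketch of devices pushing state changes directly to each other over UDP multicast, so nothing polls them and no central server sits in the middle. The multicast group/port and the JSON event format are made up purely for illustration; this is not any particular home automation protocol, just the push-based, peer-to-peer pattern those protocols already implement.

    import json
    import socket
    import struct

    # Made-up multicast group/port and event format, purely for illustration.
    MCAST_GRP = "239.255.10.10"
    MCAST_PORT = 5007

    def announce(event: dict) -> None:
        """A device pushes its own state change to the group; nothing polls it."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
        sock.sendto(json.dumps(event).encode(), (MCAST_GRP, MCAST_PORT))

    def listen() -> None:
        """Any peer subscribes and reacts directly; no central server in the loop."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        sock.bind(("", MCAST_PORT))
        mreq = struct.pack("4sl", socket.inet_aton(MCAST_GRP), socket.INADDR_ANY)
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
        while True:
            data, addr = sock.recvfrom(4096)
            event = json.loads(data.decode())
            if event.get("type") == "motion":
                print(f"peer {addr[0]} reported motion; turning on the light")

    if __name__ == "__main__":
        # A motion sensor would call: announce({"type": "motion", "device": "sensor-1"})
        listen()

A motion sensor running announce() and a light controller running listen() talk to each other directly, which is exactly the non-client/server integration the existing protocols already provide.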

At one point you say it's a distinct concept because it's being applied outside of the home. Then here you claim it is new and distinct simply because of the lack of a client-server architecture.

"You are confusing the definition of the word with the use of the word."

No, I'm not. I clearly demonstrated my awareness that the strict definition of the phrase and its general usage are different. We simply differ in our opinions of whether that discrepancy is bad. You, sir, are confusing an understanding of that distinction with a difference of opinion about whether the distinction is potentially harmful. I understand the difference; I just think the huge gap between how the term is used and what it strictly means will turn it into a mushy buzzword that gets misused. Even in academia, I'm sure that if you asked for a strict definition of IoT, you would get vastly different answers tailored to whatever pet project each professor is working on.

The bottom line:
When there is a gap between the strict technical definition and the general usage, that gap gets filled with things that may or may not actually meet the technical definition. Thus you end up with parasites being unwittingly promoted, even though they have none of the actual benefits that true IoT would have.

It will be no less cringeworthy than hearing some non-techie rambling about cloud computing while lumping in things that are in no way part of that paradigm and thus carry none of its benefits.

"The IoT was first coined when someone talked about applying the home automation model to everything. Why not do that on a factory floor? In a car? For an entire city?"

Never once did I argue against the actual implementation of any of these things, not to mention that it has already been going on for a while in some factories, without any talking heads rambling on about IoT. To take your approach: you are confusing arguing about the legitimacy of the terminology with arguing against the actual implementation of a pre-existing concept under a new name.

Some of your other points would merit a response, but the above is just a sample of how this discussion is bound to go in circles. You are ignoring points I made, telling me I'm confused about things I have already demonstrated a clear understanding of, and trying to introduce entirely unrelated arguments. Any more of this just looks like two people talking at each other with their hands over their ears.

Comment Re:No, it's not even possible (Score 1) 181

The problem with this discussion is that y'all are interweaving two very different AI development paradigms. Not all AI is created with the goal of emulating human thinking. If anything, much of what we see as applied AI is intended to avoid the complexity of human decision making. I know the Post Office is old news these days, but its handwriting recognition was able to read handwritten addresses more accurately than humans.

Does this have anything to do with AI self-consciousness? Absolutely not. But when you start ignorantly citing the implementation details of different AI systems without acknowledging the purpose and goal of each design, you are arguing irrelevant facts.

Comment Re:Really? (Score 2) 196

You're certainly right. If anything, that's why they're bad: now those same suit-wearing people are spending money on anything and everything called "cloud," even though many of those things aren't within the strict definition of cloud computing and thus don't offer the actual benefits that true cloud computing does.

Comment Re:Really? (Score 4, Insightful) 196

Your effort to specify the internet-of-things as a well-defined set is noble, but I wouldn't give the term that much credit. It's already a mushy buzzword that spills over into other technologies and, despite anyone's best efforts, will never be used in any consistent manner. It overlaps with everything from home automation, to remote crowd sensing, to simple devices that act as their own servers.

Your definition takes things touted as part of the internet-of-things and places them outside of it. The thermostats being called part of the internet-of-things are nothing more than servers that you can connect to and control remotely, with some "smart" functions to make energy use more efficient. Many of them do not implement any standard home automation protocols that would allow the integration you speak of. In this respect they are just standalone servers you connect to with your phone or computer as a client.

Your definition basically narrows it down to things that communicate in a peer-to-peer fashion, no different from what existing home automation protocols do. "Internet-of-things" is just a buzzword popularizing what has already been possible for quite a while. Oh yes, your camera senses motion and triggers lights? Guess what, there's already a standard for that, and it predates the internet-of-things concept.

Additionally, IP cameras fall either inside or outside your definition of the internet-of-things depending on how you use them. Yes, they can act as standalone servers, no different from remotely accessible thermostats. But often you network them to a server and manage/monitor them remotely through that server; otherwise it would be maddening to access every single device separately.

Additionally, some support home automation protocols such as X10, which places them squarely within your definition of IoT, because that allows them to be integrated with other devices in exactly the way you describe. Some cameras are poor at motion detection, so you can rig the cameras' recording and notifications off a dedicated motion sensor device.
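
To make that rig concrete, here's a minimal sketch of the idea, with heavy caveats: the X10 event feed is simulated by reading lines from stdin (a real setup would read from whatever X10 interface you actually have, e.g. a CM11A on a serial port), and the camera's recording URL is a made-up placeholder, not any vendor's actual API.

    import sys
    import urllib.request

    # Placeholder endpoint; real cameras differ (this is not any vendor's actual API).
    CAMERA_RECORD_URL = "http://192.168.1.50/api/record/start"
    MOTION_SENSOR_UNIT = ("A", "3")  # hypothetical X10 house code / unit of the motion sensor

    def x10_motion_events(stream=sys.stdin):
        """Stand-in for a real X10 monitor feed (e.g. a CM11A on a serial port).
        Here we just parse lines like 'A 3 ON' from stdin for demonstration."""
        for line in stream:
            parts = line.split()
            if len(parts) == 3:
                house, unit, command = parts
                yield house.upper(), unit, command.upper()

    def start_recording():
        """Tell the camera to start recording via its (assumed) HTTP API."""
        urllib.request.urlopen(CAMERA_RECORD_URL, timeout=5)

    def main():
        # React to the dedicated motion sensor rather than relying on the
        # camera's own, often unreliable, motion detection.
        for house, unit, command in x10_motion_events():
            if (house, unit) == MOTION_SENSOR_UNIT and command == "ON":
                start_recording()

    if __name__ == "__main__":
        main()

Nothing fancy, and nothing that wasn't possible years before anyone called it IoT.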

IoT will fall into the same trap as cloud computing: the terminology will be vastly misused to market things that follow very different paradigms.

Comment Re:Really? (Score 1) 196

Oh, we get it; it's just kind of silly. IP cameras were around before this terminology was in use, and there was never any confusion within the surveillance industry about what made them distinct from traditional surveillance cameras, even among amateurs doing DIY setups. And the industry hasn't adopted the new terminology either, because an "Internet of Things camera" sounds ridiculous.

Comment Re:Maybe, maybe not. (Score 1) 652

"but that hardly makes it a saint."

I never said they were a saint. You are clearly illiterate, so I won't bother trying to have a productive discussion with you.

"I'm not going to bother listing the UNETHICAL things Google has done"

So basically you're just one more person joining the bandwagon with nothing substantial to back it up.

"Should I count how many times it has been to court -- and either LOST or settled"

Should I count the number of times judges have made extremely stupid rulings due to their ignorance of technology? How many have we seen on Slashdot alone? I would list them, but I already gave you some specific examples and you responded by refusing to cite anything that could remotely support your argument. I'm not going to waste any more time on you.

Comment Re:Spinning storage is king... (Score 1) 438

Even a thick terminal supporting remote GUIs like X requires less than 10 GB of space.

$170 gets you a 480 GB SSD: http://www.newegg.com/Product/...

If you are setting up terminals that take up 480 GB, then you are doing it wrong.

I have a 240 GB SSD with Windows, several variations of Linux VMs, and a Windows VM for isolated testing. Numerous repositories. SQL Server, a PostgreSQL server, and all the client tooling that goes with them. Numerous multi-GB games.

Even 480 GB is plenty for most amateur audio/video production if you are moving finished projects off to a NAS.

And as for the phone comment, show me a single microSD card that costs $170 and offers 480 GB of space.

Every aspect of your response is littered with stupidity.

Comment Re:Maybe, maybe not. (Score 1) 652

Let me clarify: a lot of other major players sell your information and/or give third parties access to it. This is how people end up with their picture in third-party Facebook ads. Facebook has enabled a system where, to do just about anything, you have to share information with a third party. Google, on the other hand, decides which ads are shown where, so they have no need to share your information outside their own system.

Now you're going to bring up Google sharing your information in response to government requests, but that's different because they have a legal obligation to do so, and they have done what they can within their legal power to fight it. Additionally, that is a very limited amount of sharing compared with what Facebook does by its own design and intent, and on a much larger scale, with a very large number of third parties.
