
Comment This is already the law! (Score 1) 347

There's a broad category of self-assessed taxes you're supposed to be paying: use tax and its ilk.

State laws almost always make it the purchaser's responsibility to account for this. Generally, they require that you pay the difference between the tax rate at the point of purchase and the rate in your state of residence.

Example 1: You purchase a good on the internet from out of state. You pay no (sales) tax on it at the time of purchase, but if you were to buy it in your state, you'd be assessed a 10% tax. You now owe your state a 10% tax.

Example 2: You live in one state, but purchase a car in another. Autos are taxed at 10% in your state, and at 5% in the state where you purchased it. The tax for the state of purchase IS collected, but you still owe the remaining 5% to your own state.
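To put that arithmetic in code, here's a minimal sketch (the function name and flat rates are mine, taken from the two examples above; real state rules vary, so treat this as illustration only):

    def use_tax_owed(price, home_rate, rate_paid_at_purchase):
        # You owe your home state the rate difference, floored at zero
        # (paying MORE tax elsewhere doesn't earn you a refund).
        return price * max(home_rate - rate_paid_at_purchase, 0.0)

    print(use_tax_owed(100.00, 0.10, 0.00))    # Example 1: 10.0
    print(use_tax_owed(20000.00, 0.10, 0.05))  # Example 2: 1000.0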

The reason this doesn't come up very often is that most of these sums are very, very small - certainly far smaller than the cost to investigate, gather proof, and litigate recovery of those sums. The states still want that money! They can't afford to spend tens of thousands of dollars to recover two dollars from an individual, but they're willing to spend a hundred thousand if it brings in millions. So they target specific retailers, because it IS cost-effective to force THEM to collect the sales tax as if they were in-state sellers. Thus Amazon now collects per-state tax.

The short version is that the state wants money, and it will keep creating laws until taking it is both legal and cost-effective (and easy). The same idea drives universal fingerprinting/ID and encryption backdoor initiatives: there are already laws that cover the situation, but they're too hard to enforce, so more laws get added simply to make the existing ones easier to enforce.

Comment Re:Lack of leg-room and ergonomics (Score 1) 68

I was surprised to find out that the most comfortable position for me at a desk for long periods of time is actually with the top of the desk more or less in line with the middle of my sternum. From shoulder to wrist, my arms are almost entirely horizontal.

No idea about the ergonomics of it, but I seem to be avoiding fatigue and staying comfortable, even if I usually have to lower the chair to its lowest setting.

Comment Standing desks provide no health benefits (Score 1) 68

Remember - studies show that standing desks may actually cause more, and more serious, health problems than sitting, and much more quickly - in weeks instead of months or years. We DO know sitting for long periods is bad, but, per a previous Slashdot story, there's no indication standing all day is any better.

I wrote about this here before too, but the summary is this: the idea that standing desks provide health benefits rests even more on well-intentioned, evidence-ignoring, feel-good rationalization than the anti-vaxxer movement does.

Just make sure if you're getting a standing desk like this or others, that you're not just buying into hype. Remember how it went when everyone raved about how awesome Ruby was? ... yeah.

Comment We already knew this (Score 1) 134

I posted about this last year, here https://ask.slashdot.org/comme...

The summary version is this:
    - Sitting too much is associated with health risks that take a long time to appear and are common to a sedentary lifestyle generally (so they may not be caused by sitting alone).
    - Standing too much is associated with health risks that appear fairly rapidly (relative to sitting).
    - We don't really know how much standing is enough to ward off the dangers of sitting.
    - We don't know how much standing is too much and will cause health problems of its own.

There's probably an optimal healthy point, but we don't have any studies showing where that point lies on average, much less how it should be adjusted for an individual. The only real advice to come out of this is to take a break and walk around every once in a while, and, outside of work, to maintain an active lifestyle with exercise and properly sized, nutritious meals.

Comment So we need a Ministry of Truth now? (Score 4, Insightful) 292

A quick perusal above shows where people's heads are at on the 'right to be forgotten':
    "We enjoyed that right until google came up. before that, everybody could simply be forgotten by moving to the next village"
    "Before that, if you wanted to be forgotten, you simply moved and adopted a new name."

        No, it was not a 'right' then; there was nothing in the law to provide it, nor was it considered an unstated right assumed by society.
        No, you were not forgotten; rather, new acquaintances were simply ignorant of your past.
        No, name changes were public record, and so were most criminal complaints. Not having a trivial way to search them does not make them inaccessible, and certainly not 'forgotten'.

Why target Google searches alone? Shouldn't someone need to go through the police records, newspaper archives (and any microfiche, for places still using it at the time of the offense), magazines, comedians' routines, and song lyrics (if the crime was public enough) - and any recordings thereof - to eliminate the references? As in 1984, you're going to need a whole department working 24/7 to censor or rewrite all the data there ever was if you're really pushing for 'forgotten' status.

Really though, this isn’t about a right. It’s about restriction of rights. What advocates of this restriction are really trying to do is eliminate access by society at large to public records. Since the very nature of public records is that they are publically accessible, they’re instead attacking the ability to search the records, in an attempt to make the data useless. Basically, it’s the same sort of political machinations you see in attempts to do end-runs around laws in US politics today: so called sanctuary cities deciding not to check the residency status of illegal aliens, or requiring state ID to vote to drive away minorities. It’s folks deliberately doing an end-run around the law.

What it really comes down to is this: if we're not supposed to do something, be it identify someone as an ex-convict or otherwise, then why can we do it through every other channel except a single one, singled out simply because of its current popularity and ease of use?

Comment Re:Odd title (Score 1) 237

My problem is with the strongly opinionated frameworks. You know, the ones where you use framework 'X' and now you have an 'X' website or an 'X' application? Where the framework's author makes the majority of the architectural and project-organization decisions. Sure, they make 80% of the common usage patterns easy, but that almost always results in making 10% of what's left hard and the remaining 10% nearly, if not outright, impossible.

In a world where the programmer makes the rules, this is fine - they can find some awkward workaround, like forcing a second login or putting validation logic only on the client side. Unfortunately, in the business world, the customer is the one making the rules, and the customer always wants 100% of the product working the way they want it to work.

These frameworks seem to flourish - at least until the next fad framework - thanks to their low barrier to entry and their focus on the intuitive space the article's author references.

Hipster coder, though? I don't know. Almost every dev I know has some interest in new languages, frameworks, libraries, and so on - their toy language or pet project. In fact, the biggest violators I see are experienced programmers who do know what they ought to, but don't want to spend time retreading that same tired ground when they just want to play around.

In fact, not having to reinvent that same solution is exactly what they're interested in.

That's why you'll find a Karaf server running a Camel app in a Microsoft shop, even though there's no one left at the company who knows OSGi. Or find an external build-script dependency in the form of a Rust script from out of nowhere. The people who should know better just want to play, and the easiest way to do that is to make it part of their day-to-day job.

What this article and my experience tell me is that, more than ever, we need technically competent managers who can evaluate, triage, and rein in individual developer output and provide focus.

At least, in a business environment, where product delivery, support, and maintenance are more important than playing around.

Comment Re:Specialization (Score 2) 237

I've noted that businesses with historically high growth and apparent long-term stability seem to be looking for "T-shaped" employees: people who know a little about a lot (a wide horizontal) and a lot about a little (a narrow vertical). Specialization is good, but you have to know enough to be able to respecialize in anything else quickly, which means dabbling in a bit of everything so that it's at least familiar.

Basically, a jack of all trades AND a master of one.

To use your analogy: you don't need to know how to rebuild the motor in order to drive the car, but you should be aware of all the parts and how they operate. That way, when something makes a noise, you not only know what it likely is, you know how to fix it (or whether it even needs fixing), or at least where to start looking for that information, even if you lack the tools or experience to do the job yourself.

Comment Re:Glueing things together is how I teach OO desig (Score 3, Interesting) 237

That's my experience as well, at least as of the mid-90s. Why use a mainstream, commercial language when you could use a theoretically complete one? One whose designers recognize that features like "output" and "input" are dirty exceptions that spoil the purity of your program when you're trying to get a good, hard proof using Hoare logic.

Better not to clutter up the syntax by reusing confusing, pre-existing symbols to form constructs, either - not when you can add whole new symbols requiring a special (APL) keyboard!

I remember a TA once challenging me over an algorithm I had written to operate iteratively rather than recursively - I had noticed the program would run out of memory on large data sets if I did it the other way. To him, recursion was theoretically perfect, and not using it was a personal affront. The fact that my code worked and his crashed after 4-5 minutes didn't matter.
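As a toy illustration of that lesson (a sketch in Python rather than whatever we used back then, and not the actual assignment): the recursive version exhausts the stack on a large input while the iterative one is fine.

    def length_recursive(items):
        # Elegant on paper, but every element adds a stack frame, and
        # Python's default recursion limit (~1000) kills it on big lists.
        if not items:
            return 0
        return 1 + length_recursive(items[1:])

    def length_iterative(items):
        # Same answer in constant stack space.
        count = 0
        for _ in items:
            count += 1
        return count

    data = list(range(100_000))
    print(length_iterative(data))  # 100000
    try:
        print(length_recursive(data))
    except RecursionError:
        print("recursive version exhausted the stack")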

To paraphrase a professor of mine, "There's absolutely no reason for a computer scientist to use a computer."

This sort of attitude was pretty rampant all throughout my college career, which was fairly startling given that I had already been employed in the industry for some years. There was no concept of real-world usage at all, or worse, some strange bias against it.

Comment There are a lot of strange allergies out there! (Score 3, Informative) 503

For example, every time I see or hear Donald Trump, or hear about his standing in polls, I experience waves of nausea, get headaches, become irritable, and have trouble thinking anything other than 'dark' thoughts.

I know a lot of people who have the same allergic reaction, and I think it's only fair that we make the US a Donald Trump-free zone, to end this sort of suffering.

Comment Re:Definitions (Score 1) 163

Their definition of male and female is explicitly 'sex' - that is, biological sex, as opposed to the social concept of gender. It's pretty clearly stated.

What they're attempting to do is find out whether there are any consistent trends within each sex that contrast with the other - trends that could be used to indicate a difference between the sexes. As they found none, speculating on what a trans woman's brain would look like, had they found such trends, is about as useful as speculating on the type of hat a Tyrannosaurus rex would prefer, if it had access to hats and the desire to wear them.

That is to say, completely pointless, on a number of levels.

Comment Wrong. (Score 1) 163

It doesn't prove that. It doesn't disprove it either. It makes no claims on that whatsoever.

Interestingly, this phrase shows up early in the article: "Nobody has had a way of quantifying this before," which indicates that previous claims were invalid - based on supposition and assumption rather than data. They were, at best, hypotheses, lacking any mechanism to test them.

Should you assume those previous claims were valid even without proof, I can see how you might conclude that we simply lack the tools to detect the differences. If you ignore the possibility that these things are shaped more by chemical, environmental, or cultural factors, and instead assume that brain structure contains an inherent representation of male-like vs. female-like brains, it's a fairly reasonable position.

Except that the article then goes on to point out that all the things previously claimed to indicate gender turned out to be unreliable: there was no inherent consistency or commonality between results, except when you zoom out far enough to see a generic trend (one that cannot be used to reliably identify a brain as 'male' or 'female'). Again from the article:

"Some modest disparities have been reported: On average, for example, men tend to have a larger amygdala, a region associated with emotion. Such differences are small and highly influenced by the environment."

So, the previous claims are considered invalid, and the assumption - assuming I'm not painting a straw man - requires them to be true and reliable. They are neither, and while that doesn't disprove the assumption, it does make it less likely to be true.

Personally speaking, I believe 'genderness', as it relates to social norms rather than biological ones, is predicated primarily on those same social norms and cultural values, with biology serving as a tertiary influence after cultural and environmental factors. If that were not the case, it would be unimportant for a trans individual to look and dress in a stereotypical, gender-specific way, except to serve as a label to others. There's nothing that says you're not a given gender unless you dress like your culture's stereotype of that gender - nothing except insecurity and social pressures. Nothing except what an individual allows inside themselves.

Is there really a problem with it being primarily a psychological state? It's equally valid, isn't it?

Comment This is an easy answer (Score 1) 169

Is there business value in retaining, training, and developing individuals who will become domain experts with cross-functional expertise and proficiency at working within your business structure?

        Well, of course. That's a rhetorical question.

The real question: is that value greater than the cost of retaining them vs. the cost of hiring multiple underpaid, low-quality workers, perhaps from another country?

This is going to differ industry to industry, and company to company. Those selling software as a service may find their apparent short-term advantage results in a severe disadvantage over the long term. Those releasing products may see great gains over that same long term, with a painful short term while their teams are brought up to date.

If any of these trends /are/ true on average, though, then the good companies will survive and the bad ones will go under in a socioeconomic version of natural selection.

In either scenario, there will still be companies that value long-term employees, and companies that don't.

Comment Re:15M (Score 1) 291

They've been saying this since the origin of the Luddites in the early 1800s!

Technology is a force multiplier. It allows one person to perform the work of many. Historically, instead of increasing unemployment, it has done the opposite, creating larger numbers of jobs, especially positions that can't exist without the large-scale economy technology provides. The ability to make a shirt by machine, thousands of times faster than by hand, necessitates buildings for the machines, a distributed sales force, logistics like transport and storage, advertising, and so on. Each technological advance may reduce the number of employees in that explicit role, but it increases the total number of jobs and the total hours of work required by an order of magnitude or better.

However, there's a limit. We haven't reached it yet, but it's there, looming in the distant future. At some point, we will reach peak technological assistance. Everything that can reasonably be automated will be. We'll literally have no need for all the people in the world to be working, much less the ability to employ them all.

Automated tellers at fast-food joints are not the limit, by the by. I'm guessing we're still hundreds of years away.

The hard part isn't getting there, though. The trick is going to be how we handle the run-up to this post-scarcity world. At some point in the interim, we'll still consider money to have a direct relationship to work done via services and goods, and thus to be necessary for comfortable living, but a significant minority will be unemployed and poor.

That 15-hour day you're referencing - that's going to be like the end times. It means either we have a revolution and start all over, or we jump the hurdle and enter some sort of Star Trek-like world, where we simply accept that not everyone needs to work in order to live comfortably.

Personally, I don't see it happening in a positive way. The average human just isn't wired to be altruistic, and altruism is about the furthest thing from what our governments are set up to provide.
