Unless a state passes a right-to-work law (California has not), "closed shops" are allowed under Federal law and typically required by union contracts. A "closed shop" agreement means that employees must be union members at the time of hire, or must join the union within a certain period. To conform with the First Amendment, employees who do not wish to pay for the union's "extra" activities (beyond collective bargaining for the employee's own bargaining unit) can opt out of full union membership and pay a reduced rate for the union's representation. The reduction is rarely large, which may surprise anyone who knows how much unions spend on political activity. Also, people who do opt out can no longer vote on what the union should negotiate for, and unions like to make them social outcasts, so there are strong incentives not to opt out.
Jimmy Hoffa was unavailable to comment for this story about the Teamsters' violent behavior and links to organized crime.
The money to pay for benefits will come out of the employee's paycheck one way or another. If the employer has to pay employees when they're not working, it means the employee's per-working-hour salary will be lower than it would be otherwise.
Drivers who would get "50% pay when ill" are probably not that likely to stay home, and most companies that have policies like that require a doctor's note when someone takes advantage of the policy (to deter people from abusing it). Making your employer part of your relationship with your doctor should be an option for the employee, not a requirement.
You do know that IEEE 754 defines, and two IBM mainframe architectures implement, decimal floating point formats and arithmetic, right? Some languages -- mostly to support financial applications -- provide data types that are defined in decimal floating-point terms.
Using these is uncommon and involves various tradeoffs, and the values are still encoded using bits, but the high-level semantics -- and in some cases the hardware instructions -- are explicitly base 10.
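For a concrete sense of the difference, Python's standard-library decimal module implements decimal arithmetic in software (following the same general decimal arithmetic model the IEEE 754 decimal formats are built on). A minimal sketch of why financial code cares:

```python
from decimal import Decimal

# Binary floating point cannot represent 0.1 exactly, so artifacts appear:
print(0.1 + 0.2)                        # 0.30000000000000004

# Decimal arithmetic keeps base-10 semantics -- what you typed is what
# you compute with -- which is usually what financial applications want:
print(Decimal("0.1") + Decimal("0.2"))  # 0.3
```

Same bits underneath, but the rounding and representation rules are defined in base 10.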
We must prosecute Galileo -- the academic consensus is clearly against his nutty heliocentric ideas.
We must exterminate defective races -- the science is clearly in favor of Aryan supremacy.
We must take to the courts and prosecute anyone who dares to insult our religion^Wscience!
In some cases, yes. In the other cases where I've noticed it, the turning car was already far enough out of the lane that I could drive past it while staying entirely within my lane.
Is the tradeoff in favor of this kind of system yet? I have it on my car, and the part that I have found useful is the flashing warning -- not the braking. Among other flaws, it tends to have false alarms on certain stretches of road (I'm not sure whether it is picking up signs or fences or something else, but it is almost a given at one spot on my commute home from work), and it gets close to the auto-braking threshold when a car in front of me is turning into a parking lot. (The system in my car has three levels: flashing a warning, pre-tensioning [tugging] seat belts, and automatically braking. A turning car sometimes triggers the pre-tensioning, which is distracting in itself.)
You are just addressing a different part of the 4A's limits than I am. Some things are not 4A searches. The government theory here is probably that Smith v. Maryland (1979) makes an IMSI catcher not a 4A search. Some things are 4A searches, but do not require a warrant to be reasonable -- if the police say "mind if I search your car?" and you say that's okay with you, they don't need a warrant. Other things are 4A searches, but require a warrant to be reasonable -- non-consensual searches of a home, absent some imminent danger, require a warrant. Other things would be considered searches under the 4th Amendment, but even a warrant cannot make them reasonable; but this category is so small that I don't know of any good examples (a lot of possible examples are more clearly prohibited by the Fifth Amendment's limits on compelled testimony against oneself).
My personal take is that use of an IMSI catcher is probably a 4A seizure that would need a warrant -- it disrupts the normal functioning of many phones in an area, temporarily disconnecting them from the cell phone network -- or alternatively it counts as a search because it scoops up so much data from so many people (similar to the "mosaic theory" that some circuits have recently approved).
Lots of 4A searches do not require warrants -- searches incident to arrest, custodial searches, searches with consent, and probably more. The warrant requirement only kicks in when a warrantless search would be "unreasonable" (violate a reasonable expectation of privacy, and such expectation is narrower than most non-lawyers would believe).
The single-session CD is supposed to come from an unsecured network. What good will putting it back there do an attacker?
I am not thinking small, I am thinking rational. You are assuming an "insan[e]" attacker, which is rather silly. I'm not claiming that single-session CDs will make a system unbreachable, or that you should try to. My claim is simply that using single-session CDs (in a controlled, hygienic way) makes the cost to breach a system much higher than the alternatives that were suggested (network and USB) -- not even that single-session CDs are always the right solution.
Do you have any idea what the error rate for manual data entry is? Typically about 0.5% of the entries will be wrong. Retyping information is a very error-prone process.
Do you have any idea that there are known good practices for checking entered data before committing to it? And that most people would want to apply this kind of check before kicking off a production run of just about anything, regardless of how the order was sent to the system?
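One standard practice for catching retyping errors is a check digit. The Luhn algorithm (the one used on credit card numbers) is a minimal example -- it catches every single-digit typo and most adjacent transpositions:

```python
def luhn_checksum_ok(number: str) -> bool:
    """Luhn check: doubles every second digit from the right, sums,
    and requires the total to be divisible by 10."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:      # every second digit from the right
            d *= 2
            if d > 9:
                d -= 9      # equivalent to summing the two digits
        total += d
    return total % 10 == 0

print(luhn_checksum_ok("79927398713"))  # True  (canonical valid example)
print(luhn_checksum_ok("79927398710"))  # False (last digit mistyped)
```

Add a check digit to order numbers and a fat-fingered entry gets rejected at the keyboard, long before it reaches the production system.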
What is it about this topic that makes people forget basic engineering practices?
If you have an air-gapped system, you don't let people plug either random USB devices or random Ethernet devices into it. You help enforce this by disabling USB ports, MAC-locking switch or router ports, making it clear that only specific authorized people can import data, and making sure those authorized few use hygienic practices. It's IT security, not brain surgery.
1) If you can define the protocol to be simple enough, and
2) if you can be sure that only the intended application will process the data stream on the secure side, and
3) if you actually test that application enough to be confident it is secure, and
4) if you can ensure that sensitive information will not (improperly) leak back down the other direction, and
5) if you use it often enough to pay for that development cost, and
6) if you can resist the pressure to add features or "generality" to the protocol that makes it more costly to ensure secure processing...
then maybe such a protocol makes sense. Maybe somebody somewhere has satisfied all those ifs, but I would suspect not. For your simplified example, it is probably cheaper -- and just as secure -- to have an operator enter the dozen or so keystrokes to order "produce x amount of class y steel" than to design, build, install and support a more automated method. Human involvement has the added bonus of (nominally) intelligent oversight of the intended behavior for the day.
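To illustrate condition 1, a sketch of how rigid such a protocol could be made -- the order format, quantity limits, and steel grades here are all invented for the example:

```python
import re

# One whitelisted message shape, nothing else:
#   "PRODUCE <1-999> TONS GRADE <A36|A572|304|316>"
ORDER_RE = re.compile(r"PRODUCE ([1-9]\d{0,2}) TONS GRADE (A36|A572|304|316)")

def parse_order(line: str):
    """Return (tons, grade), or raise ValueError for anything unexpected."""
    m = ORDER_RE.fullmatch(line.strip())
    if m is None:
        raise ValueError("malformed order line")
    tons = int(m.group(1))
    if tons > 500:                      # hypothetical plant capacity limit
        raise ValueError("quantity out of range")
    return tons, m.group(2)

print(parse_order("PRODUCE 40 TONS GRADE A36"))  # (40, 'A36')
```

A parser this small can be reviewed line by line, which is the whole point: there is almost nowhere for a crafted input to hide. The pressure in condition 6 is the pressure to grow it until that stops being true.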
The point of an air gap is to make data transfers much more controlled. Some can be crossed regularly (with appropriate control), and some should not. One should only adopt any security measure after a cost-benefit analysis. The depth and rigor of that analysis should be determined by the expected costs (ongoing/operational) and potential costs (from a successful exploit).
Thus, I said "If I really wanted to reduce exposure", not "Everybody should do this to reduce exposure". If the productivity costs are very high, you had better impose enough oversight to deter or catch any policy violations... or choose a security policy besides "air gap". My basic points stand: much more software regularly talks to a network than regularly reads from CDs, and the protocols involved are much more complex for network communications; and USB sits in between those two.
FWIW, industrial control instructions can be made much more regular than arbitrary data, making it easier to detect a compromise before it reaches its ultimate target. For example, if the usual file size is 1 MB, you had better have a good reason for it to suddenly be 3 MB. If you are really paranoid, you might have a format checker or sanitizer to act like a very application-specific antivirus.
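A sketch of that file-size heuristic; the 1 MB norm and the 25% tolerance are invented for the example -- in practice you would derive them from your own transfer history:

```python
import os

EXPECTED_SIZE = 1_000_000   # historical norm for these control files (~1 MB)
TOLERANCE = 0.25            # flag anything more than 25% off the norm

def size_looks_normal(path: str) -> bool:
    """Cheap anomaly check run before a file is imported across the gap."""
    size = os.path.getsize(path)
    return abs(size - EXPECTED_SIZE) <= EXPECTED_SIZE * TOLERANCE
```

It proves nothing on its own, but combined with a format checker it forces an attacker to fit a payload inside the expected shape of the data, which narrows their options considerably.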