The trick is to grasp that at high RPMs, it's not pressure that drives the work. A light touch will make more progress.
My mid-90s Dremel kicks ass.
A *NEW* use. Not the same old use but online now.
By that time there were millions of slaves in the U.S. and as you pointed out, they reproduced and even resulted in a surplus for the larger plantations. There was a lively internal slave trade at that point.
Actually, the war on poverty was working until the GOP insisted on surrendering.
And yes, businesses that mooch on the taxpayer to supplement their inadequate payroll are evil. They know damned well they are mooching off of people with a lot less than they already have.
We don't claim the car thief is blameless if you leave your keys in your car, do we?
As timeOday said, they cost about 10 years' wages for an equivalent free worker, so if the owner didn't keep them alive and well at least that long, it was a losing proposition.
So as despicable as the practice was, the modern practice is in some ways worse.
That's the "innovation" of modern forced labor. In the bad old days, slaves were quite expensive, so you had to provide food, clothing, shelter, and at least minimal healthcare.
The new, improved forced labor lets them pick up the slaves cheap, provide minimal food and shelter, and just let them die of overwork.
Both upsides were already easily solvable. Most distros' rc scripts already call a function to start a daemon. That function could easily have called a helper program to set up the cgroup and register on dbus to act as a controller for the group.
Meanwhile, at least Debian's rc scripts already listed dependencies in their headers, which were used to compute a start order. The same information could just as easily be used to generate a makefile and start services in parallel.
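A minimal sketch of that idea: given LSB-style Required-Start dependencies, a topological sort yields "waves" of services, where everything within a wave can be started in parallel (the same ordering a generated makefile would encode). The service names and dependencies here are made-up examples, not real Debian init scripts.

```python
# Sketch: derive a parallel start order from LSB-style "Required-Start"
# dependencies. Service names below are hypothetical examples.
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# service -> set of services that must be started first
deps = {
    "network": set(),
    "syslog": set(),
    "sshd": {"network", "syslog"},
    "cron": {"syslog"},
}

def start_waves(deps):
    """Group services into waves; each wave can be started in parallel."""
    ts = TopologicalSorter(deps)
    ts.prepare()
    waves = []
    while ts.is_active():
        ready = sorted(ts.get_ready())  # all services whose deps are met
        waves.append(ready)
        ts.done(*ready)
    return waves

print(start_waves(deps))
# → [['network', 'syslog'], ['cron', 'sshd']]
```

A real implementation would parse the `### BEGIN INIT INFO` header blocks to build the `deps` dict, but the ordering logic is no more than this.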
The problem is, now that the init process will be such a hairball of dependencies, it becomes harder to implement such solutions without seemingly unrelated bits breaking. For example, no reasonable person expects the GUI desktop to break if you switch out init (and no reasonable person creates such a dependency).
Or, you go with signed routes. That is, you use a public key system to prove that you have the right to broadcast a route for a particular subnet.
In practice, it will probably mean some router upgrades. No more router CPUs that would have been considered a bit underpowered for a calculator in the '90s. However, as an interim measure, it could be used to set some BGP filters to limit the potential damage.
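The filtering side can be sketched without any crypto at all. Assume the signatures on the route authorizations have already been verified; what the BGP filter then applies is just matching logic: is this (prefix, origin AS) announcement covered by an authorization, and does it match? The authorization table below is entirely made up for illustration.

```python
# Sketch of route-origin validation as a BGP filter would apply it.
# Signature verification is assumed done; this is only the matching step.
import ipaddress

# Each entry: (authorized prefix, max announced prefix length, origin AS)
# Hypothetical example data using documentation prefixes.
roas = [
    (ipaddress.ip_network("192.0.2.0/24"), 24, 64500),
    (ipaddress.ip_network("198.51.100.0/22"), 24, 64501),
]

def validate(prefix_str, origin_as):
    """Classify an announcement as 'valid', 'invalid', or 'unknown'."""
    prefix = ipaddress.ip_network(prefix_str)
    covered = False
    for roa_prefix, max_len, roa_as in roas:
        if prefix.subnet_of(roa_prefix):
            covered = True
            if prefix.prefixlen <= max_len and origin_as == roa_as:
                return "valid"
    # Covered by an authorization but wrong AS or too specific:
    # exactly the hijack/leak case a filter should drop.
    return "invalid" if covered else "unknown"

print(validate("192.0.2.0/24", 64500))    # valid
print(validate("192.0.2.0/24", 64999))    # invalid: wrong origin AS
print(validate("203.0.113.0/24", 64500))  # unknown: no covering entry
```

The interim-measure point falls out of the three-way result: a router too weak to verify signatures itself can still drop "invalid" announcements from a table computed elsewhere, while leaving "unknown" routes alone.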
The problem is, we're tipped over into corporatism where the net is controlled by a very few very large legal fictions that the courts insist are somehow people.
You worry about the bad old government censoring the net but forget to worry about the ISPs censoring the net.
I can't imagine why you think the overmetered network protects us from the market cornering legislation and the pompous asses. Without proper net neutrality, we get all of the above and nowhere to turn.
Can a robot learn right from wrong? Attempts to imbue robots, self-driving cars and military machines with a sense of ethics reveal just how hard this is
In an experiment, Alan Winfield and his colleagues programmed a robot to prevent other automatons – acting as proxies for humans – from falling into a hole. This is a simplified version of Isaac Asimov's fictional First Law of Robotics – a robot must not allow a human being to come to harm.
At first, the robot was successful in its task. As a human proxy moved towards the hole, the robot rushed in to push it out of the path of danger. But when the team added a second human proxy rolling toward the hole at the same time, the robot was forced to choose. Sometimes, it managed to save one human while letting the other perish; a few times it even managed to save both. But in 14 out of 33 trials, the robot wasted so much time fretting over its decision that both humans fell into the hole.
Winfield describes his robot as an "ethical zombie" that has no choice but to behave as it does. Though it may save others according to a programmed code of conduct, it doesn't understand the reasoning behind its actions. Winfield admits he once thought it was not possible for a robot to make ethical choices for itself. Today, he says, "my answer is: I have no idea".
As robots integrate further into our everyday lives, this question will need to be answered. A self-driving car, for example, may one day have to weigh the safety of its passengers against the risk of harming other motorists or pedestrians. It may be very difficult to program robots with rules for such encounters.
I appreciate the offer, but I'm really not qualified. My interest is of the avid armchair variety. As I understand it, the dialysate is the key to making it work. Previous experiments achieved some removal of urea but it wasn't adequate or it caused electrolyte imbalances. In all forms of dialysis, it's something that could easily be mixed up at home but for the requirement of a sterile solution for hemo or peritoneal dialysis.
Waiting to put on a black shirt?
Who is first on your extermination list?
If the civilian police could do the search, then they probably should have.
If the search was done on-duty, it used military resources and so should not be reported.