Comment Re:And it's already back up (Score 5, Informative) 186
There are other repos: https://github.com/rbrito/pkg-...
That's also incorrect. The lawyers don't allege they own the source code. They're arguing that the source code circumvents DRM. The supposed DRM is Youtube's "rotating cypher" that takes the video identifier string from the URL and returns the URLs of the actual video data. Unfortunately, the DMCA does not make a distinction as to the quality of any supposed DRM, so a court might side with the RIAA lawyers in this. This is more similar to the DMCA being weaponized to take DeCSS source code offline, which the MPAA argued violated the anti-circumvention provisions in the DMCA.
I think we can all agree that it's asinine to claim that the "rotating cypher" constitutes DRM. This is particularly true when Youtube freely serves up the source code to do this in the form of a Javascript file that is interpreted by a user's browser. To obtain the URLs with video data, youtube-dl simply uses a Javascript interpreter, performing the same functionality as a browser. In that sense, the argument is flimsier than the legal arguments against DeCSS, but a court might well still accept it.
I agree the lawyers' arguments are dubious. They are a little more complex than what you're saying, though.
Youtube URLs have a string that indicates which video is requested, e.g. https://www.youtube.com/watch?... indicates that DLjJwW1lFxI is the video identifier. This is then passed to a JavaScript function that converts that string to the URL to obtain the actual video files. This is the "rotating cypher" that the complaint refers to.
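As a rough sketch (this is not youtube-dl's actual code), the first step of isolating the video identifier from a watch URL can be done with nothing but the standard library:

```python
# Hypothetical illustration of pulling the video identifier out of a
# YouTube watch URL. The real youtube-dl extractor is far more involved;
# this only shows the "v" query-parameter step.
from urllib.parse import urlparse, parse_qs

def video_id(url: str) -> str:
    # The watch URL carries the identifier in the "v" query parameter.
    query = parse_qs(urlparse(url).query)
    return query["v"][0]

print(video_id("https://www.youtube.com/watch?v=DLjJwW1lFxI"))  # DLjJwW1lFxI
```

The identifier then gets fed through Youtube's Javascript to produce the URLs of the actual media files, which is the part the complaint calls a "cypher."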
There is the precedent of DeCSS source code being taken down with DMCA complaints, which is probably the most analogous situation. The DMCA does not require that "encryption" be well designed or strong; its mere presence is enough to make circumvention unlawful. But the DeCSS analogy doesn't quite work, either. The main issue is that Youtube actually serves the code required to "unlock" the "cypher" they're using, in the form of a Javascript file that's executed in the browser. As I understand it, youtube-dl simply locates the relevant function using a regex, then uses a Javascript interpreter to execute it, performing the same functions that would occur in a browser. Instead of using third-party code like DeCSS, youtube-dl actually uses the code served by Youtube.
It's absurd to characterize Youtube's URL obfuscation scheme as a cypher, but that reasoning might hold up in court. That said, any browser extension that monitors content requests can find the same URLs; the logging feature of uBlock Origin, for example, provides the same information. I suppose the RIAA could argue that such extensions are infringing too, since they also block ads, but I'd hope that courts would view them as security tools.
The other issue is that the source code includes test cases to download works copyrighted by the RIAA. This wasn't a good choice and the lawyers have seized upon this to argue that youtube-dl is a tool intended for piracy. It would have been better to choose some public domain content for the test cases. It shouldn't matter, except that the lawyers might well be successful at convincing a judge that youtube-dl is intended for piracy.
It's unfortunate that the developers were clumsy about the test case. They've resisted feature requests to break actual DRM schemes like Widevine. They've also refused to add support for sites whose primary purpose is to distribute pirated works. Given the care that they've taken to avoid attracting the attention of lawyers in other situations, they should have chosen better test cases. That's an unforced error and would make it riskier to file a DMCA counter-notice.
While I agree that nobody is entitled to a degree, administrators are generally very concerned with student retention rates. One of the reasons for this is that funding and accreditation are increasingly being linked with retention and graduation rates. Taken to the extreme, maximizing retention rates would effectively turn schools into degree mills. In this case, the role of the advisor or the first year experience office is to reach out to the student and try to find out if there's an issue with study habits, mental health, or something like that where they're able to provide assistance. In many cases, the students decline the help or it doesn't lead to a meaningful change in behavior. The students are still subject to academic probation and dismissal. This is just a mechanism to try to intervene during the semester, when it's still possible for students to avoid receiving low grades. Universities are implementing programs like these because retention rates influence funding, accreditation, and perhaps their marketability to prospective students.
As for TopHat, I don't use it in my classes, so I'm not totally sure how it works. I assume the code periodically changes to limit the ability of students to text the code to others who aren't in the classroom.
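If it works anything like a standard time-based one-time code, and that's purely my speculation, a rotating classroom code might look like this minimal sketch (the secret and rotation period are invented for illustration):

```python
# Speculative sketch of a rotating attendance code, assuming a
# TOTP-like scheme (I don't know TopHat's actual implementation).
import hashlib, hmac, struct, time

SECRET = b"classroom-shared-secret"   # hypothetical per-class secret
PERIOD = 60                           # rotate the code every 60 seconds

def attendance_code(t: float) -> str:
    counter = int(t // PERIOD)
    digest = hmac.new(SECRET, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Standard dynamic truncation: pick 4 bytes based on the last nibble.
    offset = digest[-1] & 0x0F
    value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return f"{value % 1_000_000:06d}"  # six-digit code shown on screen

# Codes match only within the same rotation window:
now = time.time()
assert attendance_code(now) == attendance_code(now)
```

A code like this expires quickly, which would limit how useful it is to text to someone outside the room.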
I agree the cost of iClickers is egregious, and I'd really like to move away from them. I've heard of an open source app called Qlicker, which seems to have much of the same functionality as iClicker. I don't teach every semester because my appointment is primarily research-based. I did recently ask about teaching again in the fall of 2020 and/or spring of 2021, and indicated that I wanted to find an alternative to iClickers because of the expense. It looks like Google Forms is an easier alternative to Qlicker that I wasn't aware of previously, so this is very helpful for me. Thanks!
I completely agree that the lecture needs to be relevant and presented with enthusiasm. This should be a given for any lecture. Due to the nature of my appointment, I tend to teach a lot of large classes with mostly non-majors. Many of them are going into a very different field, but the material still needs to be relevant. I'm a meteorologist, by the way.
Rather than simply teach theory and ideas, I try to spend more time applying the theory to current or recent weather. The site weather.cod.edu has a wealth of weather data. Asking the students to look at the forecast models on their laptops and apply the lecture concepts is an opportunity for both active learning and to make the material relevant to the students. I recognize that not every topic lends itself to being taught in this manner, but the general principle about active learning still applies.
I believe students will find the lectures more interesting if I incorporate more active learning and spend less time just lecturing. I suspect there are primarily two reasons that large lecture classes have attendance issues: 1) the perception that the instructor won't notice who's skipping class and 2) they tend to be mostly passive learning. Even if I'm teaching relevant material and presenting it with enthusiasm, and that should be a given, students are going to get bored and lose focus if there's a long stretch of passive learning. I think students are more likely to attend if they feel they're actually learning in class. In my experience, one of the best ways to accomplish that is to increase the amount of active learning, hence the in-class exercises I was talking about. Your experiences may vary, but I've generally received good feedback from students about more active learning in class.
I agree to a point that faculty aren't the problem -- but only to a point. I have an appointment at the University of Nebraska and I'm speaking from my experience at this institution. I'm not aware of this technology being installed here, and I hope it never happens.
In another comment, I mentioned how iClickers can be used to enforce attendance with some rather draconian measures. Many instructors use a different system called TopHat to enforce attendance. There are many similarities to iClickers. At the start of the lecture, a code is displayed on the screen. Students enter the code in an app to verify that they're in the class. It is entirely voluntary for instructors to use iClickers or TopHat in their classes. Part of the reason these tools are being developed is in response to instructors' complaints about low attendance. In regard to larger scale surveillance like what's described in the article, those decisions are made by administrators. However, it's likely that many faculty either support the use of those systems or are unwilling to speak up against them.
We do have a system called MyPlan, in which instructors are encouraged to raise "flags" for students who miss class, fail to turn in assignments, or do poorly on an assignment or exam. If three flags are raised for a student, their advisor is notified and is expected to reach out to the student. Instructors can also directly request that an office at the university (First Year Experience) directly contact a student. Despite the name of the office, they provide assistance to all students. The problems with MyPlan are that 1) many instructors just aren't aware of it and 2) instructors have to be willing to participate in the system. Because three flags have to be raised before a student's advisor is notified, it typically requires that multiple instructors raise flags about a student.
I also don't think analytics are necessarily useful for determining when there's a problem. We use a course management system called Canvas. It provides some analytics like how much time a student has spent viewing the course, the number of pages and documents they've accessed, and ranks them with respect to other students in the class. I haven't found it to be useful in identifying which students need additional help. I am skeptical that tracking library or cafeteria visits reveals much useful information, either.
The administrators care about metrics like retention rate. If faculty don't want students to be tracked, they need to voluntarily participate in systems like MyPlan, to let advisors know when there are problems with attendance or academic performance. Faculty also shouldn't resort to draconian measures to enforce attendance in their classes. Unfortunately, I do think faculty are somewhat responsible for this issue.
Similar to the system described in this story, there are iClicker remotes that require a physical presence in the room. And there's an iClicker app, which can use the GPS on students' phones to require they be near the classroom in order to be present. Clickers are really useful for active learning, asking students to discuss questions in small groups and then vote. It's also a really good way for instructors to know in real-time if students are understanding the lecture.
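A hypothetical sketch of the kind of proximity check such an app could run, using the haversine great-circle distance (the 75 m radius is an invented parameter, not anything iClicker documents):

```python
# Hypothetical sketch of a GPS proximity check a clicker app could run:
# haversine distance between the phone and the lecture hall.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters on a spherical Earth.
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def in_classroom(phone, room, radius_m=75.0):
    # Count the student present only if the phone is within radius_m meters.
    return haversine_m(*phone, *room) <= radius_m
```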
Unfortunately, they are often used as a cheap way just to enforce attendance. As an instructor, I don't require students to check in, and I warn students about the privacy implications. While I want my students to attend class, I despise this approach.
I prefer the approach of having in-class exercises, which students do in a group. Sometimes the exercises require students to look up data online and solve a couple of problems in class. Other times, I'll provide the data in my slides or in handouts. The TAs and I walk around and talk to the students while they're working on the problems.
The questions mimic what they get on the homework. For that matter, I reuse in-class exercises on exams, with the exact same questions and data. The students submit their assignment to Canvas as a group, and I give credit for participation. Once the in-class exercise is done and the TAs and I have made our rounds, I have students vote on the questions with their clickers so I can discuss the answers with the entire class.
The participation points are a very significant part of the students' grades. In the most recent class I taught, in-class participation was worth 25-30% of the final grade. Because the questions are very similar to the homework problems and can also appear on exams, anyone who skips class is at a big disadvantage.
This does effectively mandate attendance. But it respects the privacy of the students. It also promotes active learning. Students tend to get a lot more out of these types of lectures than just passively watching the instructor talk about slides. Even if the instructor can't get to every part of the room in a very large lecture class, the general approach still scales to classes of that size.
There's no need to adopt the draconian measure of tracking students with their phones. If you have to track students that way to get them to attend class, you're doing something very wrong as an instructor.
Shielding Earth might not be as implausible as you expect. One idea that's been seriously proposed is to deploy a magnetic deflector at the Earth-Sun Lagrange point 1 (L1). NASA proposed a similar idea to shield Mars from the solar wind and prevent its atmosphere from being continually stripped away. Another idea is to place a swarm of mirrors at L1 to reduce incoming solar radiation and mitigate climate change from human activity. Although L1, L2, and L3 are unstable, it's very feasible for spacecraft to orbit L1 and L2. It would also be possible to orbit L3, but its practical value is quite limited.
The USDM map is updated during weekly shifts that run from Monday to Wednesday. Some shifts are worked at the National Drought Mitigation Center (NDMC) in Lincoln, NE, by employees of the University of Nebraska-Lincoln; other times the map is updated elsewhere by USDA or NOAA employees.
There are five categories of drought ranging from D0 (abnormally dry) to D4 (exceptional drought), and they're clearly defined based on observations. Despite this, the USDM map is more arbitrary than many might think. The USDM site lists a variety of indicators for what constitutes each drought category. One challenge is which category to select when different indicators fall into different categories. Another is how to update the map when there's a rapid change in conditions. For example, if an area in D4 receives several inches of rain in a few days, USDM authors are reluctant to reduce the drought category too much in a single week.
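As a rough illustration, a single indicator's percentile ranking (relative to the historical record for that location) could be mapped onto a category like this. The thresholds below are approximate, and the real map blends many indicators plus expert judgment:

```python
# Rough sketch of mapping one indicator's percentile onto a USDM
# category. Thresholds are approximate; the actual process weighs
# many indicators and a lot of expert judgment.
def drought_category(percentile: float) -> str:
    if percentile <= 2:
        return "D4"   # exceptional drought
    if percentile <= 5:
        return "D3"   # extreme drought
    if percentile <= 10:
        return "D2"   # severe drought
    if percentile <= 20:
        return "D1"   # moderate drought
    if percentile <= 30:
        return "D0"   # abnormally dry
    return "none"

print(drought_category(4))  # D3
```

The hard part is exactly what a single function like this hides: what to do when precipitation says D2 but soil moisture says D4.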
There's also the issue of what to do in areas in between observations, where it's somewhat subjective how to draw the contours for the drought monitor. Some local regulations and forms of aid for those impacted by droughts are directly tied to USDM categories. There can be a lot of money involved, and those who have money at stake will lobby the USDM authors to update the map in a way that's beneficial to them.
While reports are supposed to be made to state climatologists, who then pass the information along to the USDM author for that shift, that's not always how it works. Sometimes the USDM authors will receive lots of calls directly from various people in a particular county or region, lobbying for the map to be updated in a way that benefits them. I've heard of USDM authors getting lots of calls from farmers in particular counties, in a coordinated effort to get the drought category raised. I believe that some federal assistance becomes available at the D2 threshold, so often these calls are lobbying for the drought category to be raised to D2. If there isn't other data from that particular area, it's subjective and up to the USDM author for that shift how to proceed.
I've never updated the USDM and I don't work at NDMC, but I know people who do. I'm glad I'm not responsible for updating the map, because the shifts can be quite long if there are a lot of updates, and people can become pretty angry if the USDM author doesn't update the map the way those people want it updated.
Everything you've said is false. I work at UNL and I know people who actually work shifts to update the US Drought Monitor (USDM). I'm not involved with that work, but I've learned quite a bit about what drought is and how the USDM is created.
Drought is based on conditions relative to climatological normals for that particular location. Climate is generally averaged over 30-year periods, so droughts are abnormally dry conditions relative to the average over the past 30 years. While the current D4 (exceptional drought) conditions are around the Four Corners area, which is generally arid, that's just where it happens to be abnormally dry now. You can look back over the drought monitor archive and you'll see drought conditions in many other areas.
Drought occurs when conditions are abnormally dry. Deserts exist where it's normally dry. In any location, water shouldn't be allocated in ways that are unsustainable. The High Plains are semi-arid, but they're not a desert. Agriculture in that region is driven by extracting water from the Ogallala Aquifer at rates far faster than the aquifer can be recharged. The best options are to bring water from other areas, which can be expensive, or to limit water use in a way that's more sustainable.
When water is brought in from other locations, it's referred to as an aqueduct rather than a pipeline, and such things do exist. For example, Los Angeles gets a substantial amount of water from the Los Angeles and Colorado River aqueducts. The Los Angeles aqueduct is 419 miles long, so water is being transported over quite a distance. The original poster is simply recommending a much more extensive aqueduct system to help alleviate droughts. It's reasonable, provided water isn't being transported from other areas in an unsustainable manner.
And no, not all deserts have been deserts for thousands of years. Sometimes that change happens over shorter time scales, though certainly beyond the 30 year definition of climate. For example, the Sandhills of western Nebraska are now semi-arid, with grass growing in sandy soil. Several hundred years ago during the Medieval Warm Period, western Nebraska was quite a bit drier, and the Sandhills were a desert with active sand dunes. Conditions are wetter now, just several hundred years later, and the dunes are stabilized by the grasses. Transition in and out of desert conditions doesn't necessarily require thousands of years.
Unlike you, I have mostly quit Facebook. I'm convinced that social media use is generally unhealthy, especially when Facebook is using algorithms to prioritize news feed content and maximize the amount of time a user spends in the app.
There are two ways to look at this. Using a VPN gives you the option of limiting your ISP's ability to track you, but at the expense of possibly allowing the VPN service to track you instead. There are plenty of popular VPN services that track users, and anyone using a free service, whether a VPN or otherwise, should assume the service is monetizing users in other ways. If Facebook's other abuses didn't make you quit, this probably won't, either; it's neither surprising nor inconsistent with other free VPN services.
The other way to look at this is as an antitrust issue. Facebook claims a massive number of users, almost a third of the world's population. While they're undoubtedly inflating those numbers, there's no denying their market share. This sure seems like anticompetitive behavior that runs afoul of the Sherman Antitrust Act. That should actually be the bigger issue here: if Facebook used anticompetitive tactics in identifying and acquiring competitors, that seems like a strong legal basis for splitting Facebook into smaller companies.
It's possible that the term "encrypted" is being used loosely to encompass the process of salting and hashing passwords.
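For reference, salting and hashing, as opposed to encrypting, looks something like this sketch using PBKDF2 from the Python standard library (the iteration count is an arbitrary but plausible choice):

```python
# Sketch of salting and hashing (not encrypting) a password, which is
# what sites should store. Uses PBKDF2-HMAC-SHA256 from the stdlib.
import hashlib, hmac, os

ITERATIONS = 200_000  # arbitrary but reasonable work factor for this sketch

def hash_password(password, salt=None):
    salt = salt or os.urandom(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
assert verify("correct horse battery staple", salt, digest)
assert not verify("wrong password", salt, digest)
```

The point of the salt is that two users with the same password still get different digests, and precomputed rainbow tables become useless.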
For users, the problem is that it's hard to know whether any particular site is using good security practices to keep data secure. I use a password manager (mSecure) that runs locally on my phone, and generate unique random passwords for each site. That way, a breach like this wouldn't allow my data to be compromised on other sites where I might otherwise have reused a password. I don't upload the data from mSecure anywhere, though I keep backups on SD cards. The data and backups are stored with 256-bit Blowfish encryption and a unique passphrase. I know, there's a single point of failure, where all my passwords are stored in one place and protected by a single passphrase, and phones aren't particularly secure. But if I moved the password manager to a laptop or desktop computer, I wouldn't as readily have access to my passwords when I need them. It's relatively convenient, simple to use, and it seems better than many of the alternatives.
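Generating a unique random password per site, which is the core of what a password manager does, is straightforward; here's a minimal sketch (the character set and length are arbitrary choices, not what mSecure uses):

```python
# Minimal sketch of per-site random password generation, the way a
# password manager does it. Uses the cryptographically secure secrets
# module rather than random.
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*"

def new_password(length=20):
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

pw = new_password()
# With 70 symbols and 20 characters, collisions and guessing are
# astronomically unlikely.
assert len(pw) == 20
```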
Unfortunately, there's no way for a user to know which sites are secure. It seems like everything should be treated as highly vulnerable, and users should protect themselves accordingly.