
Comment Re:Ugh (Score 1) 274

Have you tried monk fruit?

It's available as a liquid or as a highly refined powder. Tastes a lot like honey to me.

I'm phasing out all my other sweeteners except sucralose and monk fruit where I can. I'm only keeping the sucralose to use in very limited amounts because it tastes the most like sugar, and I can add a packet to the monk fruit to shift the taste from honey-like toward a purer sugar sweetness.

Comment Re:Uhh huh, sure they are... (Score 4, Interesting) 406

It makes sense when you realize they're using off-the-shelf walkie-talkies instead of encrypted radios in many areas, and other older, less capable equipment everywhere. The funding for upgrading and maintaining their forces was diverted to the oligarchy, and they're making do with whatever they can find. The drones built from off-the-shelf components are a good example as well.

I mean, does it make sense for the Russian military to discuss battle plans over open radio bands when the Ukrainians can listen in? Heck no. Of course they have encrypted radios - just not enough of them, and the ones they have are of different, incompatible types. So for units of one type in one area to talk to incoming units of a different type, they use open-air radio because it's the only way they can communicate. Better equipment exists; they just don't have it. The same likely goes for this GPS: it was cheap and it worked, so they used it.

Troops are leaving behind expired rations, expired med kits, paper maps from the '90s, and other archaic gear. We're not talking expired by a few months, either - expired by decades. Much of the heavy equipment is decades old, some of it dating back to WWII.

But their biggest problem is that their supply lines are badly degraded. Couple that with equipment designed for nuclear-war tactics and generals who have no experience with this type of warfare, and the effect is crippling. They first thought Ukraine would fold under intimidation, then thought the fight would go like Syria. It's been mistake after mistake, and they're now lobbing cluster bombs and lots of older, terribly imprecise missiles at the region.

Ukraine has pushed Russia back to the disputed areas, but the war will go on until Russia has had enough. The borderlands are much easier for Russia to resupply and to hold. Barring a nuclear launch, I don't see Russia keeping an inch of Ukraine, and it could quite possibly lose Crimea as well. The Ukrainians are in no mood to negotiate after what Russia has done to their people.

Comment Re:Agree to disagree. (Score 1) 279

The "higher quality digital masters" are almost always 2K - because that's the output resolution of the preferred digital camera used in most productions. It's also the preferred master for 35mm film to digital master transfer. Special effects that are added later are in 720p - because that's the default for rendering them for every major Hollywood blockbuster due to the time it takes to render them. Later, they're up-scaled by a very advanced method, but it's still nowhere near 4K when completed.

Even the blockbusters shot on actual film are almost always converted to a 2K digital master that is then up-scaled to 4K.

For instance, Iron Man 2 was shot on 35 mm film. It was transferred to a 2K digital master, 720p special effects were added, and the whole thing was up-scaled to 4K with 10-bit HDR, then compressed and sold as a 4K UHD (Ultra HD) Blu-ray.
You'll find Blu-ray after Blu-ray is the same story.
https://thedigitalbits.com/ite...

Now imagine how much CGI was in Avengers: Endgame - and remember, all of that CGI was rendered at 720p. If you buy the 4K Blu-ray, you're watching mostly 720p animation enhanced (mostly interpolated) to 4K, then compressed for release (which adds artifacts).

10 Cloverfield Lane was shot in 6K - really about 4K of usable resolution given the sensor area - but it was converted to a 2K digital intermediate to have its 720p CGI added... which was then up-scaled and color-corrected for 4K, then compressed for the 4K UHD Blu-ray. So even though it was shot in 6K, it's really an up-scaled 2K as well.
https://thedigitalbits.com/ite...

Alien's 40th-anniversary release is one of the very few genuine 35mm-to-4K transfers for the theatrical cut (the Director's Cut portion is likely up-sampled from 2K; the release includes both).
https://thedigitalbits.com/ite...

There just isn't much genuine 4K content out there for movies and TV. You can sometimes find it, but it's very much true that nearly everything labeled 4K has been up-scaled from 2K at best.
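To see why up-scaling can't add detail, here's a minimal sketch in Python using the Pillow library; the file names and frame sizes are just illustrative assumptions:

# Sketch: 2K -> 4K upscale with Pillow (pip install Pillow).
# "frame_2k.png" is a hypothetical source still at DCI 2K (2048x1080).
from PIL import Image

frame = Image.open("frame_2k.png")
upscaled = frame.resize((4096, 2160), Image.LANCZOS)  # interpolate to DCI 4K
upscaled.save("frame_4k.png")
# Four times the pixels, but each new pixel is just a weighted average of
# the originals - interpolation can't recover detail that was never captured.

Studios use far fancier scalers than Lanczos, but the principle is the same: the information ceiling is set by the 2K master.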

Comment Re:Headline doesn't understand how mining works (Score 1) 166

The mining farms pay very close attention to that value, because if the cost of mining the very next coin is higher than the coin is worth, they will shut down their operation and liquidate. Not right away, of course, but once they've decided the new normal isn't profitable.
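Here's the back-of-the-envelope version of that decision in Python (every number below is a made-up illustration, not real market data):

# Hypothetical daily profitability check for a mining farm.
hash_rate_ths = 100                   # farm speed, in TH/s
power_kw = 3.25                       # power draw, in kW
electricity_usd_per_kwh = 0.06        # energy price
btc_per_ths_per_day = 0.00000045      # expected reward (moves with difficulty)
btc_price_usd = 20_000

daily_revenue = hash_rate_ths * btc_per_ths_per_day * btc_price_usd
daily_energy = power_kw * 24 * electricity_usd_per_kwh

print(f"revenue ${daily_revenue:.2f}/day vs. energy ${daily_energy:.2f}/day")
if daily_revenue < daily_energy:
    print("next coin costs more than it's worth -- shut down and liquidate")

With these made-up numbers, revenue is about $0.90/day against $4.68/day of electricity - exactly the shut-down-and-liquidate scenario.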

If all the mining farms shut down except those in China and a few other rare locations where energy is cheap, then there's a very real concern that China - or at least one of a handful of agents in China - could control the Bitcoin network, which could lead to forks, fake transactions, or the shutdown of the network entirely.

The key point of the article is that mining coins for profit no longer appears to be sustainable. If the dollar value of Bitcoin doesn't go back up, people will start turning off their miners - the same machines that process transactions for the network. If the value falls much further, we could wake up one day and find there's no network left to transact on.

Comment Re:Transistors and AI (Score 1) 170

Bingo.

Today's CPUs and GPUs are essentially silicon, copper, and various dopant ions in a thin sheet that we pump electricity through. They're glorified heating elements that happen to do "work" in a useful way - a brilliant design for what's essentially lightning in a rock, but a very limited one.

The human brain is made of many kinds of cells in various arrangements, each cell a small three-dimensional world full of molecular machines, and nearly every connection between those cells exponentially grows the power of what they can do together. Its analog computations can be less precise and more error-prone, but depending on the scenario they can be vastly faster and extremely power-efficient.

Only a fool would think we won't somehow, someday create our own synthetic cellular machinery and build synthetic living computers with amazing capabilities - to say nothing of the possibilities of using quantum processors to attack computationally intense algorithms.

Comment Re:The real reason (Score 1) 346

The US intelligence community doesn't have backdoors into all phones. They certainly have backdoors into the phone carriers, though. AT&T and the rest have fiber-optic taps running to spy closets where audio is recorded and speech-to-text tools help search for keywords. Snowden wasn't even the first to reveal it. I remember Shia LaBeouf talking about it in an interview: he'd worked with the feds to prepare for a movie, mentioned government spying, and they played him a recording of a cell phone call he'd made years before, just to show off what they could do. This is why everyone in a high government position uses encrypted apps for calls and texts (while simultaneously fighting for encryption backdoors) - those apps bypass the carriers' recording technology.

The phones themselves are pretty rock solid - especially Apple iPhones. I've heard from local law enforcement that there's a huge backlog of iPhones and other strongly encrypted devices that the feds can't break into yet. For simple codes on Android, they use a USB device that emulates a keyboard and tries every combination until the phone unlocks - that doesn't take long for short numeric codes. iPhones make you wait between tries, and the wait gets longer with each failed attempt. Biometric locks are easy - use a lifted print, or a photo if it's a face unlock... but the point stands: they still have to unlock the phone, because there's no actual backdoor.
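For a sense of scale, here's some hedged arithmetic (the attempt rate and lockout schedule below are assumptions for illustration, not measured figures):

# Brute-force time for numeric passcodes (all rates/delays are assumptions).
rate_per_sec = 5                      # hypothetical USB keyboard-emulator speed
print(f"4-digit, no delays: {10**4 / rate_per_sec / 60:.0f} min worst case")
print(f"6-digit, no delays: {10**6 / rate_per_sec / 3600:.1f} h worst case")

# iPhone-style escalating lockouts (minutes of delay per failed attempt).
delays_min = [0, 0, 0, 0, 1, 5, 15, 60]   # then ~60 min per further attempt
total_min = sum(delays_min) + (10**4 - len(delays_min)) * 60
print(f"4-digit with escalating delays: ~{total_min / (60 * 24 * 365):.1f} years")

Even a 4-digit code jumps from roughly half an hour of hammering to on the order of a year once escalating delays kick in - which is why those backlogged iPhones just sit there.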

Verizon and some other carriers ship their own Android modifications, so who knows what they put on their phones when they flash the ROMs. I assume carrier-unlocked, factory-default phones are free of such spyware, but simply making a call through a carrier means the carrier can listen in, since they make the connection.

Any funny-looking hardware would get scrutinized and would kill a phone maker's business if found, but software can be tricky. Apple is the only company I know of for certain that loads its own unmodified OS on its own hardware. Verizon, Sprint, and others tend to tweak the OS and flash the ROM... they could be doing anything, really. If Huawei were allowed to play in the US market, it would be subject to Verizon, Sprint, AT&T, etc. - the carriers would mandate what software went on the devices and flash their own spyware, if they have any to flash. Huawei could have a super-secret firmware backdoor, kill switch, or the like, but a physical rogue chip would be detected, and any malware would have to operate through the flashed telecom firmware. The minute such malware was discovered, their business would be over. It would be suicide for them to do it.

The US is just upset that China isn't following US sanctions on other countries, on top of the current trade war. Huawei, being Chinese-government controlled, will always be a threat to US security - at any time, Huawei could flip a switch, flash an update, and own your device... yet it wasn't until recently that the US told its government contractors to ban Huawei devices. It's all politics at this point.

Comment Re:Thus, perfectly good hardware goes to scrap (Score 1) 111

I sympathize, but there's always a cost/benefit analysis for supporting older hardware and the software that runs within its limits. Windows XP SP3, from 2008, was roughly the final version of XP, so we're talking about a roughly 10-year-old netbook that was designed with a 2-to-3-year lifespan to begin with (Atom processors were barely capable of running XP - I used to manage a few netbooks on an organization's network). I can't even get Google to support Android security updates on its own products more than 3 years after release. The last 32-bit Intel consumer CPUs were Atoms in 2011; the last 32-bit-only desktop/laptop CPUs were Intel Core Duos in 2006.

Someone could continue supporting 32-bit Linux on the device - maybe a different distribution. Lubuntu will still support 32-bit on the LTS release until 2021, which isn't bad for an Ubuntu flavor supporting what will then be 10-to-15-year-old hardware. I'm sure someone will keep 32-bit Linux alive, since there are still plenty of embedded 32-bit processors - one might just have to seek out a more niche distro.

I'm always a little sad to see working hardware tossed in the trash, and I've always tried to find uses for older machines - as donations to students or families that need one, or as simple single-purpose devices. Sometimes, though, it's simply not worth the cost of electricity when you can buy a small, low-powered replacement. For instance, I had an old laptop with a cracked, useless screen that was still good for connecting to a TV to watch Hulu, Netflix, and other streaming services (though it couldn't handle 1080p), but a simple Roku Stick replaced it for very little money and a big power savings.
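The electricity math is easy to run yourself; the wattages, hours, and rate below are assumptions for illustration:

# Annual electricity cost: old laptop vs. a streaming stick (numbers assumed).
def annual_cost_usd(watts, hours_per_day=4, usd_per_kwh=0.13):
    return watts / 1000 * hours_per_day * 365 * usd_per_kwh

print(f"old laptop (~60 W): ${annual_cost_usd(60):.2f}/year")
print(f"streaming stick (~3 W): ${annual_cost_usd(3):.2f}/year")

At those assumed numbers it's roughly $11/year versus under a dollar - modest in absolute terms, but the stick pays for itself over its lifetime and handles 1080p besides.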

Comment Re:How many people use Lubuntu? (Score 4, Informative) 111

Think of Lubuntu as "lightweight Ubuntu." Debian provides most of the base; Canonical polishes what's left, adds some goodies, and releases Ubuntu as its main distro with the GNOME desktop environment. Canonical and/or its partners also support other "flavors" of Ubuntu that use other desktop environments. Think of the DE as just another program - the user interface is just a shell over the rest of the OS.

Some examples:

Kubuntu -- KDE Plasma desktop
Lubuntu -- LXQt desktop
Xubuntu -- Xfce desktop
Ubuntu MATE -- MATE desktop
Ubuntu Budgie -- Budgie desktop

There are often a few other changes to preferred programs - text editors, terminal emulators, file managers, and so on - that come with the desktop environments. Lubuntu, being a lightweight distro meant for older machines with fewer resources, makes more changes than most to the default installation - mostly swapping Ubuntu's default programs for smaller, less resource-hungry ones so you can get the most out of a system with a small hard drive and little RAM.

It's still a lot of work to maintain the packaging differences and the separate desktop environments, but the differences between Ubuntu flavors largely come down to a few swappable programs. You can even install a different desktop environment, uninstall your original one, and effectively change flavors - since they're all built on the same Ubuntu base, which is built on the same Debian base.

You could think of Lubuntu as a partnership between Debian, Canonical, the LXQt team, and everyone else who contributes to the GNU/Linux operating system. I don't know the funding breakdown, but since Canonical supports it, I suspect it's funded much like the regular Ubuntu release, with the LXQt team mostly supporting their desktop environment.

Comment Re:With spinning disks, you do not know either (Score 1) 358

I always go with Samsung EVO or PRO. Things may be different now, but when I was first in the market for SSDs, Samsung was the only maker that designed every part of the device - not a franken-device cobbled together from various vendors' components and software that might work OK most of the time. Now I just buy Samsungs out of habit, and because I've never had one fail on me. Samsung DID have a huge blunder with one or two specific SSD lines, but that was widespread across those specific models, not random deaths on random models.

I've never had to use the included Samsung software other than for a firmware update once, but it has lots of diagnostic and recovery tools. I can't vouch for how well they work, since I haven't needed them.

No drive lasts forever, but since I generally put my apps and OS on the Samsung as my C: drive and keep my media and Windows profile on separate drives, my write/overwrite rate on the SSD should let it last until sometime after our Sun turns into a red giant.
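That's hyperbole, of course, but the endurance arithmetic really is lopsided. A quick sketch (the TBW rating and write volume are assumptions, not specs for any particular model):

# Rough SSD life estimate from a TBW (terabytes written) endurance rating.
tbw_rating_tb = 600       # assumed rating for a 1 TB-class consumer drive
daily_writes_gb = 10      # assumed light use: updates, installs, caches

years = tbw_rating_tb * 1000 / daily_writes_gb / 365
print(f"estimated life at that write rate: ~{years:.0f} years")  # ~164 years

Not quite red-giant territory, but the drive's electronics will almost certainly die of old age long before the flash wears out.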

Comment Re:Live Bookmarks (Score 1) 199

Man, I thought they killed that feature way back when they axed RSS.

I loved that feature - it let me see all the FARK, Slashdot, CNET, and other techie headlines with a click and a hover, and it auto-updated, so I never had to visit the actual pages; I just picked the story I wanted right from the bookmarks.

I switched to Chrome way back, after FF made it nearly impossible for some of my workflows, extensions, etc. to work properly. I also got tired of tracking down an extension or two that quit working due to either changes or lack of support.

Wow, that reminds me. Remember when FF removed the ability to see a link's URL on hover - the thing that helped you figure out whether a link was a phishing attack or some malware site disguised as a credible link? It took me an extension or two just to bring that functionality back. Those guys are crazy. I'll stick with Chrome, Chromium, Opera, and/or Vivaldi. FF is dead to me.

Comment Re:Huh? (Score 3, Insightful) 222

I tend to agree, and apparently so do IBM, Google, et al. Still, the larger the system, the more error-prone it becomes. Obviously we have quantum computers (or at least functioning parts of them) working today and can entangle 50 or more qubits with relative stability... but the question is whether we can do it at the scale needed to be "useful" (by this individual's standard) without losing the signal in all the noise.

This person's perspective is that what we naively see as an engineering problem awaiting future refinements is actually an issue that can't be resolved: nature, at the level of fundamental particle physics, can't be controlled or tuned to the degree necessary to get one working, nor can a machine reasonably be checked for accuracy, because the number of states to check is beyond astronomical.

Comment Re:Huh? (Score 3, Interesting) 222

I think the key problem is theory versus physical reality. In theory, if you have a set of qubits entangled with zero noise at near absolute zero, you can send a quantum program to the qubits, have them process your data without worrying about their individual states, and then capture the completed output.

In reality, how do you entangle enough qubits to be useful? How do you prevent noise, or correct for the errors it causes? How do you ensure your qubits are properly entangled? How do you accurately send your quantum program to the qubits for processing? How do you aid the processing without generating more noise? How do you extract the output without generating more noise? And ultimately, how do you ensure you're entangling 10^300 attributes of your qubits perfectly in the first place, much less correcting for errors while processing them?
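That 10^300 figure lines up with the state space of a register of roughly a thousand qubits: describing n entangled qubits takes 2^n complex amplitudes. A quick sketch of the scaling:

# Amplitudes needed to describe n entangled qubits: 2**n.
for n in (50, 300, 1000):
    digits = len(str(2 ** n))
    print(f"{n:>5} qubits -> ~10^{digits - 1} amplitudes")

Fifty qubits is around 10^15 amplitudes - already past what classical simulation handles comfortably - and a thousand qubits is around 10^301, the "beyond astronomical" territory being pointed at here.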

I think the TL;DR is that this quantum physicist sees all the places errors can creep in and how difficult they are to correct. The answers he sees coming from the community seem to boil down to adding more qubits for error correction - or even processing the same data on multiple quantum computers, or along multiple paths through the same qubits.

I understand the frustration. It seems a difficult task to precisely manipulate qubits with modern technology, and an impossible one to know and/or set the states of everything well enough to be certain whether an error has occurred.

Comment Re:Most suck. (Score 5, Interesting) 195

As a fellow Nexus 7 (2013) owner, I share your pain in trying to find an improved model after all these years. I like the specs of the M5, but I hear it has 2.4 GHz Wi-Fi/Bluetooth interference issues and no 3.5 mm headphone jack, so the search goes on for me.

I get what you're saying about the marketing, but really, tablets just have a much longer product life cycle and razor-thin margins, so there aren't many models. Phones, still often replaced every 2 years (thanks to "new every 2" plans), have a much shorter cycle and can be mass-produced at a far larger scale.

The tablet market saturated quickly. Then e-readers and smartphones cannibalized most of it, and Amazon's Fire HD tablets and other low-end devices ate the rest of the Android tablet market. Most adults have large smartphones and give their kids the cheap, even larger tablets. (You can get a refurbished 10" Fire HD for only $120... or an 8" Fire HD Kids Edition for $130.)

Me, I want something like the M5 but with better Wi-Fi/Bluetooth... and I'll use a USB-C-to-3.5 mm adapter if I have to, though I'd rather have a native jack. The M5 has double the cores and RAM of my Nexus, a higher-resolution screen, and 4x the internal storage, plus a card slot for more. Nice! But it doesn't ship with Android 9... and there's no indication of when it'll get it, if ever. With Nexus devices, Google pushes OTA updates almost instantly on release, but even Google only supports devices for a couple of years; then you're on your own with an unsupported device full of gaping security holes.

The Android ecosystem is enough to make me want to pull my hair out over the security issues and the lack of support and updates from hardware manufacturers. I'd like to switch to LineageOS, but that's in eternal beta as well.
