Comment Frames & pixels obsolete. The future is a poly (Score 1) 31

It seems inevitable that a movie will eventually be stored as a giant cube built from myriad 3D polygons ("polycube" as a working term), where the axes of the cube are X, Y, and time. There would be no need for frames or pixels; those are things the end-user's display device would create based on its particular technology.

Converting it for display would be like rapidly "slicing the cheese": a given second could be sliced into 10 frames or 1,000, with no limit other than the processing power of the display device.

Frame interpolation for smoothing wouldn't be needed, because there are no frames. Older movies could be converted to a polycube using conversion and interpolation algorithms, and the polycube could carry a "favored frame rate" to reduce interpolation anomalies, which would keep nostalgic purists happy.

It should also make producers happier because it gives display devices less reason to have to guess.
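As a rough illustration of the slicing idea, here is a minimal Python sketch. The "polycube" is stood in for by a toy function of (x, y, t); everything here (the function, resolutions, frame rates) is hypothetical. The point is only that the same stored movie can be sampled into however many frames a particular display asks for.

```python
import numpy as np

def scene(x, y, t):
    """Toy stand-in for a 'polycube': a continuous function of space and time.
    A real store would hold 3D geometry spanning X, Y, and time."""
    return 0.5 + 0.5 * np.sin(2 * np.pi * (x + y + t))

def render(scene_fn, width, height, t):
    """Slice the volume at time t into a raster for one particular display."""
    ys, xs = np.mgrid[0:height, 0:width]
    return scene_fn(xs / width, ys / height, t)

def slice_frames(scene_fn, duration_s, fps, width=640, height=360):
    """'Slice the cheese': the same second can yield 24 frames or 120,
    depending only on what the display device asks for."""
    times = np.linspace(0.0, duration_s, int(duration_s * fps), endpoint=False)
    return [render(scene_fn, width, height, t) for t in times]

# The same stored movie, sliced for two different displays.
frames_film = slice_frames(scene, duration_s=1.0, fps=24)    # film-like cadence
frames_fast = slice_frames(scene, duration_s=1.0, fps=120)   # high-refresh panel
print(len(frames_film), len(frames_fast))
```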

Privacy

Manufacturer Remotely Bricks Smart Vacuum After Its Owner Blocked It From Collecting Data (tomshardware.com) 120

"An engineer got curious about how his iLife A11 smart vacuum worked and monitored the network traffic coming from the device," writes Tom's Hardware.

"That's when he noticed it was constantly sending logs and telemetry data to the manufacturer — something he hadn't consented to." The user, Harishankar, decided to block the telemetry servers' IP addresses on his network, while keeping the firmware and OTA servers open. While his smart gadget worked for a while, it just refused to turn on soon after... He sent it to the service center multiple times, wherein the technicians would turn it on and see nothing wrong with the vacuum. When they returned it to him, it would work for a few days and then fail to boot again... [H]e decided to disassemble the thing to determine what killed it and to see if he could get it working again...

[He discovered] a GD32F103 microcontroller to manage its plethora of sensors, including Lidar, gyroscopes, and encoders. He created PCB connectors and wrote Python scripts to control them with a computer, presumably to test each piece individually and identify what went wrong. From there, he built a Raspberry Pi joystick to manually drive the vacuum, proving that there was nothing wrong with the hardware. From this, he looked at its software and operating system, and that's where he discovered the dark truth: his smart vacuum was a security nightmare and a black hole for his personal data.
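The write-up doesn't publish the vacuum's serial protocol, so the following pyserial sketch is purely illustrative of that bench-testing approach: the port, baud rate, and command bytes are assumptions, but the shape (send a motor command, read back whatever the microcontroller answers) matches the general idea of proving the hardware works on its own.

```python
import time
import serial  # pyserial

# Port, baud rate, and command bytes are assumptions -- the write-up doesn't
# publish the vacuum's actual serial protocol.
PORT = "/dev/ttyUSB0"
BAUD = 115200

CMD_LEFT_WHEEL = b"\x01"
CMD_RIGHT_WHEEL = b"\x02"

def drive(port, left, right, duration_s=0.5):
    """Send hypothetical wheel-speed commands and read back any reply,
    confirming the motor driver responds with no cloud involved at all."""
    port.write(CMD_LEFT_WHEEL + bytes([left & 0xFF]))
    port.write(CMD_RIGHT_WHEEL + bytes([right & 0xFF]))
    time.sleep(duration_s)
    return port.read(port.in_waiting or 1)

with serial.Serial(PORT, BAUD, timeout=1) as port:
    reply = drive(port, left=60, right=60)   # roll forward briefly
    print("controller replied:", reply.hex())
```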

First of all, its Android Debug Bridge, which gave him full root access to the vacuum, wasn't protected by any kind of password or encryption. The manufacturer had added a makeshift security measure by omitting a crucial file, which caused the bridge to disconnect soon after booting, but Harishankar easily bypassed it. He then discovered that the vacuum used Google Cartographer to build a live 3D map of his home. That isn't unusual in itself: it's a smart vacuum, and it needs that data to navigate. The concerning part is that it was sending all of this data to the manufacturer's servers. That makes some sense, since the onboard SoC is nowhere near powerful enough to process the map data itself, but it seems iLife never cleared any of this with its customers.
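To see why an unprotected debug bridge is such a problem, here is a hedged sketch of the check anyone on the same LAN could run. The device address is a placeholder; `adb connect` and `adb shell` are standard Android tooling (the `adb` binary must be installed), and on this vacuum they reportedly drop you straight into a root shell.

```python
import subprocess

# Placeholder address -- substitute the vacuum's LAN IP. Networked ADB
# conventionally listens on port 5555.
DEVICE = "192.168.1.50:5555"

def adb(*args):
    """Run an adb command and return its stdout as text."""
    result = subprocess.run(["adb", *args], capture_output=True, text=True, timeout=15)
    return result.stdout.strip()

print(adb("connect", DEVICE))
# If the debug bridge is wide open, this runs a command as whatever user
# ADB provides; "uid=0(root)" means anyone on the LAN has full control.
print(adb("-s", DEVICE, "shell", "id"))
```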

Furthermore, the engineer made one disturbing discovery — deep in the logs of his non-functioning smart vacuum, he found a command with a timestamp that matched exactly the time the gadget stopped working. This was clearly a kill command, and after he reversed it and rebooted the appliance, it roared back to life.
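Matching a log entry to the moment of failure is simple to reproduce in principle. The sketch below assumes a hypothetical log path, timestamp format, and failure time, and just prints every line stamped within a couple of minutes of the failure.

```python
import re
from datetime import datetime, timedelta

# The log path, timestamp format, and failure time are assumptions for illustration.
LOG_PATH = "vacuum.log"
FAILED_AT = datetime(2025, 1, 10, 14, 32, 0)   # when the vacuum stopped booting
WINDOW = timedelta(minutes=2)

TIMESTAMP_RE = re.compile(r"^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})")

def suspicious_lines(path, failed_at, window):
    """Yield log lines whose timestamp falls within +/- window of the failure,
    which is how a remote kill command would stand out."""
    with open(path, errors="replace") as log:
        for line in log:
            match = TIMESTAMP_RE.match(line)
            if not match:
                continue
            stamp = datetime.strptime(match.group(1), "%Y-%m-%d %H:%M:%S")
            if abs(stamp - failed_at) <= window:
                yield line.rstrip()

for line in suspicious_lines(LOG_PATH, FAILED_AT, WINDOW):
    print(line)
```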

Thanks to long-time Slashdot reader registrations_suck for sharing the article.

Comment My takes on this presentation (Score 1) 6

1. There are a lot of empty seats; a lot.

2. The demo wasn't live, likely because of how badly Meta's live-demo event went.

3. They noted that you do all of this 'hands-free', likely an intentional knock at Meta's offering.

4. The examples were...odd. Who the fuck is going to be using this to shop for a fucking rug? Come on; give some real-life examples that are IMPORTANT. None of these were.

5. The entire presentation's style, across multiple different presenters, was...exhausting...halting...jarring...and...really undergraduate level. It was almost as if they were being fed what to say in their earpieces, not from memory and not in a fluid and practiced way.

---

Personally? I love the idea of AR glasses that work well. I want live subtitles for the people talking to me, as I'm hard of hearing and hearing aids do not work well for me, particularly in public spaces.

I want it to give me important information and respond to my environment in ways that are useful. (Telling me where I am really isn't that; I know where the fuck I am--tell me what I should be doing or where I should be going next, perhaps?)

I know these are early adopter level devices, but they're just fucking ugly due to their bulk.

I strongly prefer this option to Meta's simply because I don't have to do stupid fucking mime-style hand gestures, but I want this technology to be useful, now, not in 5 years. We're going to see this largely flop just like so many other AR/VR toys out there unless they make this something more than a gimmicky piece of shit.
