


Comment Re:JavaScript ... and maybe Python (Score 1) 374

Actually I grew up with C, Pascal, Assembly (for many different processors), C++, then learned and taught Java at University. I enjoyed Matlab, Mathematica, Python and Fortran while studying Physics. Now I've landed on JavaScript and if you avoid the ugly bits, it's fun.

Comment Re:JavaScript ... and maybe Python (Score 3, Informative) 374

This is a discussion about platforms that would buckle under the bulk of a micro-OS plus a JS interpreter/VM stack. And that's not even addressing the fact that most of these devices use embedded hardware platforms you need to access with specific assembler calls - how would you do that in JS or Python!?

There are a few JavaScript interpreters that use very minimal resources and have access to all the necessary hardware (WiFi, BLE, SPI, UART, I2C, etc.): Duktape http://duktape.org/, Espruino https://github.com/espruino/Es..., JerryScript https://github.com/jerryscript..., and more. These are all designed for IoT devices. For performance, this is an interesting read: https://www.espruino.com/Perfo...

Comment JavaScript ... and maybe Python (Score 1, Insightful) 374

Most mods will probably flag this as trolling, but I believe JavaScript is a great language for IoT. Using JavaScript has a few advantages: it's actually very easy to get networking working well and reliably, and a programmer can write the front-end, the server-side (or serverless) back-end and the IoT back-end all in one language. The code will be portable across all of these (not always needed, but some functions will be universal). There is now a proliferation of embedded devices that support JavaScript "natively" (ESP32, RedBear Duo, and many more). It might not be as fast as C, but it's fast enough. Here is a project I created using JavaScript at every level: https://www.hackster.io/anemoi...

Submission + - New AI Is Capable of Beating Humans At Doom (denofgeek.com)

An anonymous reader writes: Two students at Carnegie Mellon University have designed an artificial intelligence program that is capable of beating human players in a deathmatch game of 1993's Doom. Guillaume Lample and Devendra Singh Chaplot spent four months developing a program capable of playing first-person shooter games. The program made its debut at VizDoom (an AI competition centered around the classic shooter), where it took second place despite the fact that their creation managed to beat human participants. That's not the impressive part about this program, however. No, what's really impressive is how the AI learns to play. The creators' full write-up on the program (which is available here) notes that their AI "allows developing bots that play the game using the screen buffer." What that means is that the program learns by interpreting what is happening on the screen, as opposed to following a pre-set series of command instructions alone. In other words, this AI learns to play in exactly the same way a human player learns to play. This theory has been explored practically before, but Doom is arguably the most complicated game a program fueled by that concept has been able to succeed at. The AI's creators have already confirmed that they will be moving on to Quake, which will be a much more interesting test of this technology's capabilities, given that Quake presents a much more complex 3D environment.

Submission + - A U.S. election-system vendor who uses developers in Serbia (computerworld.com)

dcblogs writes: Voting machines are privately manufactured and developed and, as with many other IT systems, the code is typically proprietary. The use of proprietary systems in elections has its critics. One Silicon Valley group, the Open Source Election Technology Foundation, is pushing for an election system that shifts from proprietary, vendor-owned systems to one that is owned "by the people of the United States." One major election technology company, Dominion Voting Systems (DVS), develops its systems in the U.S. and Canada but also has an office in Belgrade, Serbia. It was recently advertising openings for four senior software developers in Belgrade. "Like many of America's largest technology companies — which develop some of the software for their products in places like Asia, India, Ireland and the Mideast — some of our software development is undertaken outside the U.S. and Canada, specifically, in Serbia, where we have conducted operations for 10 years," said firm spokesman Chris Riggall.

Submission + - Remember Second Life? Its Fans Say Education Could be Killer App (chronicle.com)

jyosim writes: Ten years ago Second Life was HUGE, and many colleges erected virtual campuses. Now some of those sit empty, but a group of die-hard fans continue to use it for teaching. Some are already saying that new VR headsets like the Oculus will usher in the next generation of online education — that students and profs will soon be entering virtual worlds for a new kind of classroom experience. But how can this new attempt at VR education succeed where previous efforts failed?

Submission + - World's Longest and Deepest Rail Tunnel Opens Under Swiss Alps

HughPickens.com writes: Sewell Chan writes at the NYT that the world's longest and deepest rail tunnel has opened in Switzerland, nearly seven decades after it was first proposed. The 35-mile twin-bore Gotthard Base Tunnel is the first flat route through the Alps or any other major mountain range, with a maximum height of 1,801 ft above sea level. It is therefore the deepest railway tunnel in the world, with a maximum depth of approximately 7,500 ft, comparable to that of the deepest mines on Earth. Engineers had to dig and blast through 73 different kinds of rock, some as hard as granite and others as soft as sugar. More than 28 million tonnes of rock were excavated, which were then broken down to help make the concrete used to build the tunnel. Because the tunnel fully crosses the Alps, a range that strongly influences the European climate and that of Switzerland in particular, its two ends can see drastically different weather conditions, with differences of well over 10°C on some days. The new tunnel clears the way for a high-speed rail link under the Swiss Alps that will revolutionize freight and passenger transportation, cutting the current four-hour trip between the economic hubs of Zurich and Milan by about an hour. After testing ends this year, around 260 freight trains and 65 passenger trains are expected to travel through the two-tube tunnel each day, reaching speeds approaching 100 miles an hour for freight and 125 miles an hour for passengers. Passenger trains are expected to eventually reach 155 miles an hour. Goods currently carried by a million trucks a year will eventually be moved by trains instead. But the new world record might not last long: China has announced plans for a 76-mile tunnel between the northern port cities of Dalian and Yantai, under the Bohai Strait.

Submission + - Digital Cable Q2Q Enterprise Decoding?

racerx509 writes: I work for a school district in Georgia and we are trying to improve our IPTV distribution system. Currently, we house 30 RG6 cable taps going to 30 cable boxes on a server rack in our data center, feeding 30 encoders via component cables, along with a few IR blasters to change channels. Each encoder has a multicast stream address, which runs through a big switch that sends the encoded video data across our network. Clients pick up the stream through an internal, AD-managed web interface.

The whole setup is slated to move later this year, so I was hoping to improve things by simplifying the cable boxes, taps and encoders. We convinced the cable company to rent us a hospitality-style Q2Q installation at our data center, which outputs Pro:Idiom DRM-encrypted QAM signals that will need to be decrypted and re-encoded as MPEG-4 before I can stream....

I'd like to leave the consumer cable boxes behind, as they require constant babysitting and a hard power reset after inclement weather, and there is no way to monitor any of them from our NOC. Also, the IR blasters lack security, which has led to random sporting events showing up on all channels during *insert sport season*.

The cable company isn't being overly helpful, as they have no way of monetizing a Q2Q system the way we are using it, so I'm turning to Slashdot.

I've found a few solutions that may work, but all are rather expensive. Price isn't the issue, but I want to know whether they actually *work* as advertised. I have some experience, but any help or suggestions would be greatly appreciated. Has anyone here had experience in the hospitality or enterprise IPTV sector?

Comment Re:YAM2C (Yet Another My Two Cents) (Score 1) 341

Good point. The reason I like to use JS everywhere is that I only need to learn one language, and that means I can learn it really well. Until I did full-stack JavaScript I had only done front-end JS, and my JS was pretty wonky. Using Node.js means you need to learn some of the great parts of JS well (closures, async, etc.), which drastically improved my front-end JS. If you used the LAMP stack for full-stack development, you would need to learn Apache config (although once it's running it's okay), PHP, JavaScript and SQL, and I would not have the time to learn the subtleties of each language.

I agree that the object model and patterns for MongoDB are different, but the object model and patterns for front-end and back-end are very similar in many cases; there is a big overlap. Things you use on all three are:
  - Callbacks
  - Closures
  - Prototypical inheritance (front-end and back-end)
  - Event emitters (front-end and back-end)
  - And many more like described here: https://www.smashingmagazine.c...

Code re-usability is useful at times. I was able to write a library (https://github.com/psiphi75/web-remote-control) that is used on the server, on the front-end and on an embedded device, and I guess around 80% of the code is shared across all three. Imagine writing and debugging that code in three different languages.

Comment YAM2C (Yet Another My Two Cents) (Score 1) 341

I dove straight into Node.js to develop a platform around 3 years ago. I don't regret using Node.js; in fact, I am glad that I used it. I essentially used the MEAN stack (MongoDB, Express, Angular and Node.js). It was great to:
1) use JS everywhere: back-end (including the DB) and front-end.
2) use JS: it's fun (for me) - if you use the right parts. And it performs fast enough, on par with PHP, if not faster.
3) have an experienced community - JS and Node.js have already gone through their teething issues.
4) do async programming - if you do it right, you tend to keep your code more modular.

What was painful:
1) Learning to write JS the "right way" and how to avoid the bad and ugly parts.
2) At the time there was no great CMS. I believe Keystone is the best at the moment, but it looks very light when compared to WordPress.

What you need to do if you go with Node.js:
1) Learn JS well, learn "The Good, The Bad and The Ugly", such that you can avoid the Bad and Ugly. The good is actually awesome.
2) Understand prototypical inheritance. It's not your classical classes, but it's a powerful and memory-efficient way to create objects.
3) Use a linter to write your code, like eslint, it will help you avoid the bad and ugly parts.
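On point 2 above, a minimal sketch of why prototype chains are memory-efficient: many instances can delegate to one shared object holding the methods, instead of each instance carrying its own copies. The sensor names are invented for illustration.

```javascript
// One shared prototype object holds the methods...
const sensorProto = {
  describe: function () {
    return this.name + ' reads ' + this.read();
  }
};

// ...and Object.create() makes lightweight instances that delegate to it.
function makeSensor(name, read) {
  const sensor = Object.create(sensorProto);
  sensor.name = name;
  sensor.read = read;
  return sensor;
}

const thermometer = makeSensor('thermometer', function () { return 21; });
const hygrometer = makeSensor('hygrometer', function () { return 40; });

console.log(thermometer.describe()); // → "thermometer reads 21"
// Both instances delegate to the very same function object:
console.log(thermometer.describe === hygrometer.describe); // → true
```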

I still use Node.js today, but now for the Internet of Things and my embedded device runs Node.js. JS is everywhere and it's going to remain everywhere.
