Comment Switch competitor makes sense (Score 2) 59

Nintendo Switch was the best-selling console in both units and dollar sales for December 2020 and for the full year. The Switch's 2020 dollar sales were the second highest for any platform in US history, trailing only the Wii's in 2008. https://www.gamespot.com/artic... So it makes complete sense for Steam to try to capture a fraction of that market. The Switch's hardware specs are rather potato, topping out at 720p in handheld mode. Even a five-year-old, low-end GTX 1060 can play just about every game on the market at 1080p, so a portable PC console with low-end specs at a low price could sell very well.

Comment Waste (Score 3, Insightful) 150

Article answers nothing. My biggest question is how fast this is really charging. If driving on the road for an hour only gives me an extra mile, that's not worth it. If it gives me an extra 60 miles, that might be. But running copper under all these highways seems prohibitively expensive; better batteries, motors, and charging technology would be a better investment.
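Back-of-envelope math makes that threshold concrete. A minimal sketch, assuming an illustrative in-road charger power and a typical EV efficiency of roughly 3.5 mi/kWh (both are assumptions, not figures from the article):

```python
# Rough estimate of range added per hour of driving over a charging lane.
# charger_kw and miles_per_kwh are illustrative assumptions.
def miles_gained_per_hour(charger_kw: float, miles_per_kwh: float = 3.5) -> float:
    """Range added per hour at a given continuous charging power."""
    return charger_kw * miles_per_kwh

# A 20 kW in-road charger would add ~70 miles per hour of driving;
# a 1 kW trickle adds under 4 miles -- the "not worth it" case.
print(miles_gained_per_hour(20))  # 70.0
print(miles_gained_per_hour(1))   # 3.5
```

So the economics hinge almost entirely on the power level the road can actually deliver to a moving car, which the article never states.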

Comment Re:Same as Apple then (Score 1) 168

I doubt you can get HD video on 80Kbps with a cap of 500MB per month.

You would be surprised. Wyze WiFi cameras show the transmission rate at all times, and I'm constantly surprised by how little bandwidth good compression needs. You're not going to get clear 1080p video, but if you're spying on the neighbor, SD or even 360p is more than enough.
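The arithmetic on the quoted cap is easy to check. A quick sketch (the 80 Kbps figure and 500 MB cap come from the parent post; the rest is just unit conversion):

```python
# How long a monthly data cap lasts at a given streaming bitrate.
def hours_until_cap(bitrate_kbps: float, cap_mb: float = 500.0) -> float:
    """Hours of continuous streaming before exhausting a data cap."""
    mb_per_hour = bitrate_kbps * 3600 / 8 / 1000  # kilobits/s -> MB/hour
    return cap_mb / mb_per_hour

# At 80 Kbps a camera uses 36 MB/hour, so 500 MB lasts about 14 hours
# of continuous streaming -- fine for motion-triggered clips, not 24/7.
print(round(hours_until_cap(80), 1))  # 13.9
```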

Comment Apple could take more (Score 0) 101

Does no one remember the absolute shit app stores we had before the iPhone? Very few free games, and in-app purchases didn't exist. And Apple is right: they could take a percentage of Amazon and eBay sales made through those apps if they wanted, but they choose not to. Apple isn't the bad guy here.

Comment Re: Game changer for EV's (Score 1) 298

1. EV parking will increase as EV ownership increases, but for now EV ownership is a sign you have a house with a garage.

2. Yes, EVs capable of charging as fast as refueling a gas car are plausible within the next 10-20 years. Finding an available fast charger is the real problem, even today.

3. Yes, we will have some EVs with over 500 miles of range within the next several years.

4. Toll roads

Submission + - SPAM: TSMC Is Considering a 3nm Foundry In Arizona

An anonymous reader writes: Reuters reports that TSMC—Taiwan Semiconductor Manufacturing Company, the chip foundry making advanced processors for Apple, AMD, and Qualcomm—is beefing up its plans to build factories in Arizona while turning away from an advanced plant in Europe. Last year, TSMC announced that it would invest $10-$12 billion to build a new 5 nm capable foundry near Phoenix, Arizona. According to Reuters' sources, TSMC officials are considering trebling the company's investment by building a $25 billion second factory capable of building 3 nm chips. More tentative plans are in the works for 2 nm foundries as the Phoenix campus grows over the next 10-15 years as well. TSMC's focus on the US rather than Europe may have a lot to do with the company's market—in Q1 2021, 67 percent of its sales were in North America, 17 percent were in Asia Pacific, and only 6 percent came from Europe and the Middle East. The majority of TSMC's European clients are auto manufacturers who buy cheaper and less-advanced chips.

Submission + - SPAM: Language Models Like GPT-3 Could Herald a New Type of Search Engine

An anonymous reader writes: In 1998 a couple of Stanford graduate students published a paper describing a new kind of search engine: “In this paper, we present Google, a prototype of a large-scale search engine which makes heavy use of the structure present in hypertext. Google is designed to crawl and index the Web efficiently and produce much more satisfying search results than existing systems.” The key innovation was an algorithm called PageRank, which ranked search results by calculating how relevant they were to a user’s query on the basis of their links to other pages on the web. On the back of PageRank, Google became the gateway to the internet, and Sergey Brin and Larry Page built one of the biggest companies in the world. Now a team of Google researchers has published a proposal for a radical redesign that throws out the ranking approach and replaces it with a single large AI language model, such as BERT or GPT-3—or a future version of them. The idea is that instead of searching for information in a vast list of web pages, users would ask questions and have a language model trained on those pages answer them directly. The approach could change not only how search engines work, but what they do—and how we interact with them.

[Donald Metzler and his colleagues at Google Research] are interested in a search engine that behaves like a human expert. It should produce answers in natural language, synthesized from more than one document, and back up its answers with references to supporting evidence, as Wikipedia articles aim to do. Large language models get us part of the way there. Trained on most of the web and hundreds of books, GPT-3 draws information from multiple sources to answer questions in natural language. The problem is that it does not keep track of those sources and cannot provide evidence for its answers. There’s no way to tell if GPT-3 is parroting trustworthy information or disinformation—or simply spewing nonsense of its own making.

Metzler and his colleagues call language models dilettantes—“They are perceived to know a lot but their knowledge is skin deep.” The solution, they claim, is to build and train future BERTs and GPT-3s to retain records of where their words come from. No such models are yet able to do this, but it is possible in principle, and there is early work in that direction. There have been decades of progress on different areas of search, from answering queries to summarizing documents to structuring information, says Ziqi Zhang at the University of Sheffield, UK, who studies information retrieval on the web. But none of these technologies overhauled search because they each address specific problems and are not generalizable. The exciting premise of this paper is that large language models are able to do all these things at the same time, he says.
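The retrieve-then-cite idea can be illustrated with a toy sketch. This is purely hypothetical: a crude word-overlap retriever stands in for a real language model, and none of it reflects the architecture Metzler et al. actually propose.

```python
# Toy illustration of "answers backed by evidence": rank documents by
# word overlap with the query, then return the best passage with its
# source ID attached. A real system would synthesize the answer with a
# language model; the overlap retriever here is only a stand-in.
CORPUS = {
    "doc1": "PageRank ranks pages by the links pointing at them.",
    "doc2": "GPT-3 is trained on most of the web and hundreds of books.",
}

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k document IDs with the most query-word overlap."""
    q = set(query.lower().split())
    scores = {d: len(q & set(t.lower().split())) for d, t in CORPUS.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]

def answer_with_sources(query: str) -> tuple[str, list[str]]:
    """Answer a query while keeping track of which documents back it."""
    sources = retrieve(query)
    return CORPUS[sources[0]], sources

answer, refs = answer_with_sources("pagerank links")
print(refs)  # ['doc1']
```

The point of the sketch is the return shape: an answer paired with the sources that produced it, which is exactly the provenance that GPT-3-style models currently throw away.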

