
'Breakthrough' LI-RAM Material Can Store Data With Light

A Vancouver researcher has patented a new material that uses light instead of electricity to store data. An anonymous reader writes: LI-RAM -- that's light-induced magnetoresistive random-access memory -- promises supercomputer speeds for your cellphones and laptops, according to Natia Frank, the materials scientist at the University of Victoria who developed the new material as part of an international effort to reduce the heat and power consumption of modern processors. She envisions a world of LI-RAM mobile devices that are faster, thinner, and able to hold much more data -- all while consuming less power and producing less heat.

And best of all, they'd last twice as long on a single charge (while producing almost no heat), according to a report on CTV News, which describes this as "a breakthrough material" that will not only make smartphones faster and more durable, but also more energy-efficient. The University of Victoria calculates that 10% of the world's electricity is consumed by "information communications technology," so LI-RAM phones could conceivably cut that figure in half.

They also report that the researcher is "working with international electronics manufacturers to optimize and commercialize the technology, and says it could be available on the market in the next 10 years."

Japan Unveils Next-Generation, Pascal-Based AI Supercomputer

The Tokyo Institute of Technology has announced plans to launch Japan's "fastest AI supercomputer" this summer. The supercomputer is called Tsubame 3.0 and will use Nvidia's latest Pascal-based Tesla P100 GPU accelerators to double its performance over its predecessor, the Tsubame 2.5. Slashdot reader kipperstem77 shares an excerpt from a report via The Next Platform: With all of those CPUs and GPUs, Tsubame 3.0 will have 12.15 petaflops of peak double precision performance, and is rated at 24.3 petaflops single precision and, importantly, is rated at 47.2 petaflops at the half precision that is important for neural networks employed in deep learning applications. When added to the existing Tsubame 2.5 machine and the experimental immersion-cooled Tsubame-KFC system, TiTech will have a total of 6,720 GPUs to bring to bear on workloads, adding up to a total of 64.3 aggregate petaflops at half precision. (This is interesting to us because that means Nvidia has worked with TiTech to get half precision working on Kepler GPUs, which did not formally support half precision.)
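The precision figures quoted above follow the familiar Pascal pattern in which each halving of arithmetic precision roughly doubles peak throughput. A minimal sketch of that scaling (the clean 2x factor is an idealization; note the quoted 47.2 half-precision figure sits slightly below an exact doubling of single precision):

```python
def scaled_peak(double_pflops, factor=2.0):
    """Return (double, single, half) peak petaflops, assuming an
    idealized 2x throughput gain per precision step."""
    return (double_pflops,
            double_pflops * factor,
            double_pflops * factor ** 2)

# Tsubame 3.0's quoted double-precision peak.
dp, sp, hp = scaled_peak(12.15)
print(dp, sp, hp)  # 12.15 24.3 48.6 (vs. the quoted 47.2 for half)
```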

World's Largest Hedge Fund To Replace Managers With Artificial Intelligence

An anonymous reader quotes a report from The Guardian: The world's largest hedge fund is building a piece of software to automate the day-to-day management of the firm, including hiring, firing and other strategic decision-making. Bridgewater Associates has a team of software engineers working on the project at the request of billionaire founder Ray Dalio, who wants to ensure the company can run according to his vision even when he's not there, the Wall Street Journal reported. The firm, which manages $160 billion, created the team of programmers specializing in analytics and artificial intelligence, dubbed the Systematized Intelligence Lab, in early 2015. The unit is headed up by David Ferrucci, who previously led IBM's development of Watson, the supercomputer that beat humans at Jeopardy! in 2011. The company is already highly data-driven, with meetings recorded and staff asked to grade each other throughout the day using a ratings system called "dots." The Systematized Intelligence Lab has built a tool that incorporates these ratings into "Baseball Cards" that show employees' strengths and weaknesses. Another app, dubbed The Contract, gets staff to set goals they want to achieve and then tracks how effectively they follow through. These tools are early applications of PriOS, the over-arching management software that Dalio wants to make three-quarters of all management decisions within five years. The kinds of decisions PriOS could make include finding the right staff for particular job openings and ranking opposing perspectives from multiple team members when there's a disagreement about how to proceed. The machine will make the decisions, according to a set of principles laid out by Dalio about the company vision.

IBM On Track To Get More Than 7,000 US Patents In 2016

IBM wants to put the patent war in perspective. Big Blue said that it is poised to get the most U.S. patents of any tech company for the 24th year in a row. From a report on VentureBeat: In 2015, IBM received 7,355 patents, down slightly from 7,534 in 2014. A spokesperson for IBM said the company is on track to receive well over 7,000 patents in 2016. In 2016, IBM is also hitting another interesting milestone, with more than 1,000 patents for artificial intelligence and cognitive computing. IBM has been at it for more than a century, and it is seeking patents in key strategic areas -- such as AI and cognitive computing. In fact, one-third of IBM's researchers are dedicated to cognitive computing. IBM CEO Ginni Rometty said during the World of Watson conference in October that the company expects to reach more than 1 billion consumers via Watson by the end of 2017. (Watson is the supercomputer that beat the world's best Jeopardy player in 2011.)

Erich Bloch, Who Helped Develop IBM Mainframe, Dies At 91

shadowknot writes: The New York Times is reporting (Warning: may be paywalled; alternate source) that Erich Bloch who helped to develop the IBM Mainframe has died at the age of 91 as a result of complications from Alzheimer's disease. From the article: "In the 1950s, he developed the first ferrite-core memory storage units to be used in computers commercially and worked on the IBM 7030, known as Stretch, the first transistorized supercomputer. 'Asked what job each of us had, my answer was very simple and very direct,' Mr. Bloch said in 2002. 'Getting that sucker working.' Mr. Bloch's role was to oversee the development of Solid Logic Technology -- half-inch ceramic modules for the microelectronic circuitry that provided the System/360 with superior power, speed and memory, all of which would become fundamental to computing."

Japan Eyes World's Fastest-Known Supercomputer, To Spend Over $150M On It

Japan plans to build the world's fastest-known supercomputer in a bid to arm the country's manufacturers with a platform for research that could help them develop and improve driverless cars, robotics and medical diagnostics. From a Reuters report: The Ministry of Economy, Trade and Industry will spend 19.5 billion yen ($173 million) on the previously unreported project, a budget breakdown shows, as part of a government policy to get back Japan's mojo in the world of technology. The country has lost its edge in many electronic fields amid intensifying competition from South Korea and China, home to the world's current best-performing machine. In a move that is expected to vault Japan to the top of the supercomputing heap, its engineers will be tasked with building a machine that can make 130 quadrillion calculations per second -- or 130 petaflops in scientific parlance -- as early as next year, sources involved in the project told Reuters. At that speed, Japan's computer would be ahead of China's Sunway Taihulight that is capable of 93 petaflops. "As far as we know, there is nothing out there that is as fast," said Satoshi Sekiguchi, a director general at Japan's National Institute of Advanced Industrial Science and Technology, where the computer will be built.

China's New Policing Computer Is Frontend Cattle Prod, Backend Supercomputer

Earlier this year, we learned about China's first "intelligent security robot," which was said to include an "electrically charged riot control tool." We now know what this robot is up to, and what its deployed unit looks like. Reader dcblogs writes: China recently deployed what it calls a "security robot" in a Shenzhen airport. It's named AnBot and patrols around the clock. It is a cone-shaped robot that includes a cattle prod. The U.S.-China Economic and Security Review Commission, which looked at autonomous system deployments in a report last week, said AnBot, which has facial recognition capability, is designed to be linked with China's latest supercomputers. AnBot may seem like a 'Saturday Night Live' prop, but it's far from it. The back end of this "intelligent security robot" is linked to China's Tianhe-2 supercomputer, where it has access to cloud services. AnBot conducts patrols, recognizes threats and has multiple cameras that use facial recognition. These cloud services give the robots petascale processing power, well beyond the onboard processing capabilities of the robot. The supercomputer connection is there "to enhance the intelligent learning capabilities and human-machine interface of these devices," said the U.S.-China Economic and Security Review Commission.

A British Supercomputer Can Predict Winter Weather a Year In Advance

The national weather service of the U.K. claims it can now predict the weather up to a year in advance. An anonymous reader quotes The Stack: The development has been made possible thanks to supercomputer technology funded by the UK Government in 2014. The £97 million high-performance computing facility has allowed researchers to increase the resolution of climate models and to test the retrospective skill of forecasts over a 35-year period starting from 1980... The forecasters claim that new supercomputer-powered techniques have helped them develop a system to accurately predict the North Atlantic Oscillation -- the climatic phenomenon which heavily impacts winters in the U.K.
The researchers apparently tested their supercomputer on 36 years worth of data, and reported proudly that they could predict winter weather a year in advance -- with 62% accuracy.
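The 62% figure is a hit rate over those retrospective hindcasts. As a toy illustration of that kind of scoring (the data below are invented placeholders, not Met Office values), a year-ahead winter forecast can be graded by how often the predicted sign of the NAO index matches the observed sign:

```python
# Made-up NAO signs per winter: +1 = positive phase, -1 = negative phase.
predicted = [+1, -1, +1, +1, -1, -1, +1, -1]
observed  = [+1, -1, -1, +1, -1, +1, +1, -1]

# Fraction of winters where the forecast sign matched observation.
hits = sum(p == o for p, o in zip(predicted, observed))
accuracy = hits / len(observed)
print(f"{accuracy:.0%}")  # 6 of 8 winters correct -> 75%
```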

Nvidia Calls Out Intel For Cheating In Xeon Phi vs GPU Benchmarks

An anonymous reader writes: Nvidia has called out Intel for juicing its chip performance in specific benchmarks -- accusing Intel of publishing some incorrect "facts" about the performance of its long-overdue Knights Landing Xeon Phi cards. Nvidia's primary beef is with the following Intel slide, which was presented at a high performance computing conference (ISC 2016). Nvidia disputes Intel's claims that Xeon Phi provides "2.3x faster training" for neural networks and that it has "38 percent better scaling" across nodes. It looks like Intel opted for the classic using-an-old-version-of-some-benchmarking-software manoeuvre. Intel claimed that a Xeon Phi system is 2.3 times faster at training a neural network than a comparable Maxwell GPU system; Nvidia says that if Intel used an up-to-date version of the benchmark (Caffe AlexNet), the Maxwell system is actually 30 percent faster. And of course, Maxwell is Nvidia's last-gen part; the company says a comparable Pascal-based system would be 90 percent faster. On the 38-percent-better-scaling point, Nvidia says that Intel compared 32 of its new Xeon Phi servers against four-year-old Nvidia Kepler K20 servers being used in ORNL's Titan supercomputer. Nvidia states that modern GPUs, paired with a newer interconnect, scale "almost linearly up to 128 GPUs."
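A back-of-the-envelope sketch of the disputed numbers: normalizing the Xeon Phi's training throughput to 1.0, Intel's 2.3x claim and Nvidia's 30%/90% counterclaims imply the following (the normalized values are illustrative, not vendor measurements):

```python
phi = 1.0                      # Xeon Phi throughput, normalized

# Intel's comparison: the old Caffe AlexNet build makes Maxwell look
# slow enough that Phi appears to train 2.3x faster.
maxwell_old_caffe = phi / 2.3

# Nvidia's counter: with the current build, Maxwell is 30% faster than
# Phi, and a comparable Pascal system 90% faster.
maxwell_new_caffe = phi * 1.3
pascal_new_caffe = phi * 1.9

# The benchmark version alone accounts for roughly a 3x swing in the
# measured Maxwell throughput.
print(round(maxwell_new_caffe / maxwell_old_caffe, 2))
```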

DARPA Will Stage an AI Fight in Las Vegas For DEF CON

An anonymous Slashdot reader writes: "A bunch of computers will try to hack each other in Vegas for a $2 million prize," reports Tech Insider, calling it a "historic battle" that will coincide with "two of the biggest hacking conferences, Blackhat USA and DEFCON". DARPA will supply seven teams with a supercomputer. Their challenge? Create an autonomous A.I. system that can "hunt for security vulnerabilities that hackers can exploit to attack a computer, create a fix that patches that vulnerability and distribute that patch -- all without any human interference."

"The idea here is to start a technology revolution," said Mike Walker, DARPA's manager for the Cyber Grand Challenge contest. Yahoo Tech notes that it takes an average of 312 days before security vulnerabilities are discovered -- and 24 days to patch it. "if all goes well, the CGC could mean a future where you don't have to worry about viruses or hackers attacking your computer, smartphone or your other connected devices. At a national level, this technology could help prevent large-scale attacks against things like power plants, water supplies and air-traffic infrastructure.

It's being billed as "the world's first all-machine hacking tournament," with a prize of $2 million for the winner, while the second- and third-place teams will win $1 million and $750,000.

How Richard Feynman's Diagrams Almost Saved Space

An anonymous Slashdot reader shares a fond remembrance of Richard Feynman written by Nobel prize-winner Frank Wilczek, describing not only the history of dark energy and field theory, but how Feynman's influential diagrams "embody a deep shift in thinking about how the universe is put together... a beautiful new way to think about fundamental processes". Richard Feynman looked tired when he wandered into my office. It was the end of a long, exhausting day in Santa Barbara, sometime around 1982... I described to Feynman what I thought were exciting if speculative new ideas such as fractional spin and anyons. Feynman was unimpressed, saying: "Wilczek, you should work on something real..."

Looking to break the awkward silence that followed, I asked Feynman the most disturbing question in physics, then as now: "There's something else I've been thinking a lot about: Why doesn't empty space weigh anything?"

Feynman replied "I once thought I had that one figured out. It was beautiful..." then launched into a "surreal" monologue about how "there's nothing there!" But Wilczek remembers that "The calculations that eventually got me a Nobel Prize in 2004 would have been literally unthinkable without Feynman diagrams, as would my calculations that established a route to production and observation of the Higgs particle." His article culminates with a truly beautiful supercomputer-generated picture showing gluon field fluctuations as we understand them today, and demonstrating the kind of computer-assisted calculations which in coming years "will revolutionize our quantitative understanding of nuclear physics over a broad front."

Fujitsu Picks 64-Bit ARM For Post-K Supercomputer

An anonymous reader writes: At the International Supercomputing Conference 2016 in Frankfurt, Germany, Fujitsu revealed its Post-K machine will run on the ARMv8 architecture. The Post-K machine is supposed to have 100 times more application performance than the K Supercomputer -- which would make it a 1,000 PFLOPS beast -- and is due to go live in 2020. The K machine is the fifth-fastest known super in the world: it crunches 10.5 PFLOPS, needs 12 MW of power, and is built out of 705,000 Sparc64 VIIIfx cores. InfoWorld has more details.

China Builds World's Fastest Supercomputer Without U.S. Chips

Reader dcblogs writes: China on Monday revealed its latest supercomputer, a monolithic system with 10.65 million compute cores built entirely with Chinese microprocessors. This follows a U.S. government decision last year to deny China access to Intel's fastest microprocessors. There is no U.S.-made system that comes close to the performance of China's new system, the Sunway TaihuLight. Its theoretical peak performance is 124.5 petaflops (Linpack is 93 petaflops), according to the latest biannual release of the Top500 list of the world's supercomputers. It has been long known that China was developing a 100-plus petaflop system, and it was believed that China would turn to U.S. chip technology to reach this performance level. But just over a year ago, in a surprising move, the U.S. banned Intel from supplying Xeon chips to four of China's top supercomputing research centers. The U.S. initiated this ban because China, it claimed, was using its Tianhe-2 system for nuclear explosive testing activities. The U.S. stopped live nuclear testing in 1992 and now relies on computer simulations. Critics in China suspected the U.S. was acting to slow that nation's supercomputing development efforts. There has been nothing secretive about China's intentions. Researchers and analysts have been warning all along that U.S. exascale (an exascale is 1,000 petaflops) development, supercomputing's next big milestone, was lagging.

Olli is a 3D Printed, IBM Watson-Powered, Self-Driving Minibus

An anonymous reader shares a report via Phys.Org: Arizona-based startup Local Motors unveiled Olli -- a 3D-printed minibus capable of carrying 12 people. It's powered by IBM's supercomputer platform Watson and is designed as an on-demand transportation solution that passengers can summon with a mobile app. The company claims it can be "printed" to specification in "micro factories" in a matter of hours. They say it is ready to go as soon as regulations allow it to hit the streets. While Local Motors has developed the system to control the driving, IBM's Watson system is used to provide the user interface so passengers can have "conversations" with Olli. "Watson is bringing an understanding to the vehicle," said IBM's Bret Greenstein. "If you have someplace you need to be you can say that in your own words. A vehicle that understands human language, where you can walk in and say, 'I'd like to get to work,' that lets you as a passenger relax and enjoy your journey," he said. The vehicle relies on more than 30 sensors and streams of data from IBM's cloud. Olli will be demonstrated in National Harbor, Maryland, over the next few months with additional trials expected in Las Vegas and Miami.

$30M Stampede 2 Supercomputer To Provide 18 Petaflops of Power To Researchers Nationwide

An anonymous reader writes: Funded by grants from the National Science Foundation and built at the University of Texas at Austin, the Stampede 2 supercomputer looks to contend with the global supercomputer Top 5. With 18 petaflops of processing power, it aims to help any researcher with a problem requiring intense number crunching. For example, atomic and atmospheric science simulations would take years to work out on a desktop PC but only days on a supercomputer. Texas Advanced Computing Center director Dan Stanzione said in a UT press release, "Stampede has been used for everything from determining earthquake risks to help set building codes for homes and commercial buildings, to computing the largest mathematical proof ever constructed." The Stampede 2 is about twice as powerful as the original Stampede, which was activated in March of 2013. Instead of the 22nm fabrication tech in the original Stampede, the Stampede 2 will feature 14nm Xeon Phi chips codenamed "Knights Landing" with 72 cores, compared to the original system's 61 cores. With double the RAM, storage and data bandwidth, the Stampede 2 can shift up to 100 gigabits per second, and its DDR4 RAM can perform fast enough to work as a third-level cache as well as fulfill ordinary memory roles. In addition, it will feature 3D XPoint non-volatile memory. It will be at least a year before the Stampede 2 is powered up since it just received funding.

Tech CEOs Declare This the Era of Artificial Intelligence

You will be hearing a lot about AI and machine learning in the coming years. At Recode's iconic conference this week, a number of top executives revealed -- and reiterated -- their growing efforts to capture the nascent technology category. From a Reuters report (condensed): Sundar Pichai, chief executive of Alphabet's Google, said he sees a "huge opportunity" in AI. Google first started applying the technology through "deep neural networks" to voice recognition software about three to four years ago and is ahead of rivals such as Apple and Microsoft in machine learning, Pichai said.
Amazon CEO Jeff Bezos predicted a profound impact on society over the next 20 years. "It's really early but I think we're on the edge of a golden era. It's going to be so exciting to see what happens," he said.
IBM CEO Ginni Rometty said the company has been working on artificial technology, which she calls a cognitive system, since 2005 when it started developing its Watson supercomputer.
Artificial intelligence and machine learning will create computers so sophisticated and godlike that humans will need to implant "neural laces" in their brains to keep up, Tesla Motors and SpaceX CEO Elon Musk told a crowd of tech leaders this week.
Microsoft, which was absent from the event, is also working on bots and AI technologies. One company that is seemingly out of the picture is Apple.

Computer Generates Largest Math Proof Ever At 200TB of Data

An anonymous reader quotes a report from Phys.Org: A trio of researchers has solved a single math problem by using a supercomputer to grind through over a trillion color combination possibilities, and in the process has generated the largest math proof ever -- the text of it is 200 terabytes in size. The math problem has been named the boolean Pythagorean Triples problem and was first proposed back in the 1980s by mathematician Ronald Graham. In looking at the Pythagorean formula a^2 + b^2 = c^2, he asked whether it was possible to color each integer either blue or red such that no set of integers a, b and c satisfying the formula were all the same color. To solve this problem the researchers applied the Cube-and-Conquer paradigm, which is a hybrid of the SAT method for hard problems. It uses both look-ahead techniques and CDCL solvers. They also did some of the math on their own ahead of giving it over to the computer, by using several techniques to pare down the number of choices the supercomputer would have to check, down to just one trillion (from 10^2,300). Still, the 800-processor supercomputer ran for two days to crunch its way through to a solution. After all its work, and spitting out the huge data file, the computer proof showed that yes, it was possible to color the integers in multiple allowable ways -- but only up to 7,824 -- after that point, the answer became no. Is the proof really a proof if it does not answer why there is a cut-off point at 7,825, or even why the first stretch is possible? Does it really exist?
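For intuition, the problem can be brute-forced at toy scale. This is a sketch under the assumption of naive enumeration -- not the Cube-and-Conquer/SAT pipeline the researchers used -- and it is only feasible for small n:

```python
from itertools import product

def pythagorean_triples(n):
    """All triples (a, b, c) with a < b < c <= n and a^2 + b^2 = c^2."""
    return [(a, b, c)
            for a in range(1, n + 1)
            for b in range(a + 1, n + 1)
            for c in range(b + 1, n + 1)
            if a * a + b * b == c * c]

def two_colorable(n):
    """Can 1..n be colored red/blue with no monochromatic Pythagorean
    triple?  Only integers that appear in some triple matter, which keeps
    the search tiny here (the real proof needed 800 cores for n = 7,825)."""
    triples = pythagorean_triples(n)
    nums = sorted({x for t in triples for x in t})
    for bits in product((0, 1), repeat=len(nums)):
        color = dict(zip(nums, bits))
        if not any(color[a] == color[b] == color[c] for a, b, c in triples):
            return True
    return False

# Up to n = 7,824 a valid coloring always exists, so small cases succeed.
print(two_colorable(20))  # True
```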
Open Source

Infographic: Ubuntu Linux Is Everywhere

prisoninmate writes: To celebrate the launch of Ubuntu 16.04 LTS, due for release later this month, on April 21, Canonical put together an interesting infographic, showing the world how popular Ubuntu is. From the infographic, it looks like there are over 60 million Ubuntu images launched by Docker users, 14 million Vagrant images of Ubuntu 14.04 LTS from HashiCorp, 20 million launches of Ubuntu instances during 2015 in public and private clouds, as well as bare metal, and 2 million new Ubuntu Cloud instances launched in November 2015. Ubuntu is used on the International Space Station, on the servers of popular online services like Netflix, Snapchat, Pinterest, Reddit, Dropbox, PayPal, Wikipedia, and Instagram, and in self-driving cars from Google, Tesla, George Hotz, and Uber. It is also employed at Bloomberg, Weta Digital and Walmart, at Brigham Young University to control the Mars Rover, and it is even behind the largest supercomputer in the world.

Supercomputers Help Researchers Improve Severe Hail Storm Forecasts

aarondubrow writes: Researchers working on the Severe Hail Analysis, Representation and Prediction (SHARP) project at the University of Oklahoma used the Stampede supercomputer to gain a better understanding of the conditions that cause severe hail to form, and to produce hail forecasts with far greater accuracy than those currently used operationally. The model the team used is six times more highly resolved than the National Weather Service's highest-resolution forecasts and applies machine learning algorithms to improve its predictions. The researchers will publish their results in an upcoming issue of the American Meteorological Society journal Weather and Forecasting.
Hardware Hacking

Using Kexec Allows Starting Linux On PlayStation 4

jones_supa writes: Team fail0verflow, the hacker group known for hacking the Sony PlayStation 4, has introduced another method to start Linux on the game console. Instead of the previous exploit, which was based on a security hole in an old PS4 firmware version, the new trick uses a kexec call to start Linux from Orbis OS (the FreeBSD-based system software of the PS4). The code can be found on GitHub. Maybe this will lead to more and better PlayStation clusters.
