Intel

Nvidia Calls Out Intel For Cheating In Xeon Phi vs GPU Benchmarks (arstechnica.com) 58

An anonymous reader writes: Nvidia has called out Intel for juicing its chip performance in specific benchmarks -- accusing Intel of publishing some incorrect "facts" about the performance of its long-overdue Knights Landing Xeon Phi cards. Nvidia's primary beef is with an Intel slide presented at a high performance computing conference (ISC 2016). Nvidia disputes Intel's claims that Xeon Phi provides "2.3x faster training" for neural networks and that it has "38 percent better scaling" across nodes. It looks like Intel opted for the classic using-an-old-version-of-some-benchmarking-software manoeuvre. Intel claimed that a Xeon Phi system is 2.3 times faster at training a neural network than a comparable Maxwell GPU system; Nvidia says that if Intel had used an up-to-date version of the benchmark (Caffe AlexNet), the Maxwell system would actually be 30 percent faster. And of course, Maxwell is Nvidia's last-gen part; the company says a comparable Pascal-based system would be 90 percent faster. On the 38-percent-better-scaling point, Nvidia says that Intel compared 32 of its new Xeon Phi servers against four-year-old Nvidia Kepler K20 servers being used in ORNL's Titan supercomputer. Nvidia states that modern GPUs, paired with a newer interconnect, scale "almost linearly up to 128 GPUs."
Security

DARPA Will Stage an AI Fight in Las Vegas For DEF CON (yahoo.com) 89

An anonymous Slashdot reader writes: "A bunch of computers will try to hack each other in Vegas for a $2 million prize," reports Tech Insider, calling it a "historic battle" that will coincide with "two of the biggest hacking conferences, Black Hat USA and DEF CON". DARPA will supply seven teams with a supercomputer. Their challenge? Create an autonomous A.I. system that can "hunt for security vulnerabilities that hackers can exploit to attack a computer, create a fix that patches that vulnerability and distribute that patch -- all without any human interference."

"The idea here is to start a technology revolution," said Mike Walker, DARPA's manager for the Cyber Grand Challenge contest. Yahoo Tech notes that it takes an average of 312 days before security vulnerabilities are discovered -- and 24 days to patch it. "if all goes well, the CGC could mean a future where you don't have to worry about viruses or hackers attacking your computer, smartphone or your other connected devices. At a national level, this technology could help prevent large-scale attacks against things like power plants, water supplies and air-traffic infrastructure.

It's being billed as "the world's first all-machine hacking tournament," with a prize of $2 million for the winner, while the second- and third-place teams will win $1 million and $750,000, respectively.
Space

How Richard Feynman's Diagrams Almost Saved Space (quantamagazine.org) 42

An anonymous Slashdot reader shares a fond remembrance of Richard Feynman written by Nobel prize-winner Frank Wilczek, describing not only the history of dark energy and field theory, but how Feynman's influential diagrams "embody a deep shift in thinking about how the universe is put together... a beautiful new way to think about fundamental processes". From the article: Richard Feynman looked tired when he wandered into my office. It was the end of a long, exhausting day in Santa Barbara, sometime around 1982... I described to Feynman what I thought were exciting if speculative new ideas such as fractional spin and anyons. Feynman was unimpressed, saying: "Wilczek, you should work on something real..."

Looking to break the awkward silence that followed, I asked Feynman the most disturbing question in physics, then as now: "There's something else I've been thinking a lot about: Why doesn't empty space weigh anything?"

Feynman replied, "I once thought I had that one figured out. It was beautiful..." then launched into a "surreal" monologue about how "there's nothing there!" But Wilczek remembers that "The calculations that eventually got me a Nobel Prize in 2004 would have been literally unthinkable without Feynman diagrams, as would my calculations that established a route to production and observation of the Higgs particle." His article culminates with a truly beautiful supercomputer-generated picture showing gluon field fluctuations as we understand them today, and demonstrating the kind of computer-assisted calculations which in coming years "will revolutionize our quantitative understanding of nuclear physics over a broad front."
Hardware

Fujitsu Picks 64-Bit ARM For Post-K Supercomputer (theregister.co.uk) 30

An anonymous reader writes: At the International Supercomputing Conference 2016 in Frankfurt, Germany, Fujitsu revealed that its Post-K machine will run on the ARMv8 architecture. The Post-K machine is supposed to have 100 times more application performance than the K supercomputer -- which would make it a 1,000 PFLOPS beast -- and is due to go live in 2020. The K machine is the fifth-fastest known supercomputer in the world; it crunches 10.5 PFLOPS, needs 12 MW of power, and is built out of 705,000 Sparc64 VIIIfx cores. InfoWorld has more details.
China

China Builds World's Fastest Supercomputer Without U.S. Chips (computerworld.com) 247

Reader dcblogs writes: China on Monday revealed its latest supercomputer, a monolithic system with 10.65 million compute cores built entirely with Chinese microprocessors. This follows a U.S. government decision last year to deny China access to Intel's fastest microprocessors. There is no U.S.-made system that comes close to the performance of China's new system, the Sunway TaihuLight. Its theoretical peak performance is 124.5 petaflops (its Linpack score is 93 petaflops), according to the latest biannual Top500 list of the world's supercomputers, released today. It has long been known that China was developing a 100-plus petaflop system, and it was believed that China would turn to U.S. chip technology to reach this performance level. But just over a year ago, in a surprising move, the U.S. banned Intel from supplying Xeon chips to four of China's top supercomputing research centers. The U.S. initiated this ban because China, it claimed, was using its Tianhe-2 system for nuclear explosive testing activities. The U.S. stopped live nuclear testing in 1992 and now relies on computer simulations. Critics in China suspected the U.S. was acting to slow that nation's supercomputing development efforts. There has been nothing secretive about China's intentions. Researchers and analysts have been warning all along that U.S. development of exascale systems (an exaflop is 1,000 petaflops), supercomputing's next big milestone, was lagging.
AI

Olli is a 3D Printed, IBM Watson-Powered, Self-Driving Minibus (phys.org) 50

An anonymous reader shares a report via Phys.Org: Arizona-based startup Local Motors unveiled Olli -- a 3D-printed minibus capable of carrying 12 people. It's powered by IBM's supercomputer platform Watson and is designed as an on-demand transportation solution that passengers can summon with a mobile app. The company claims it can be "printed" to specification in "micro factories" in a matter of hours, and says it is ready to go as soon as regulations allow it to hit the streets. While Local Motors has developed the system to control the driving, IBM's Watson system is used to provide the user interface so passengers can have "conversations" with Olli. "Watson is bringing an understanding to the vehicle," said IBM's Bret Greenstein. "If you have someplace you need to be you can say that in your own words. A vehicle that understands human language, where you can walk in and say, 'I'd like to get to work,' that lets you as a passenger relax and enjoy your journey," he said. The vehicle relies on more than 30 sensors and streams of data from IBM's cloud. Olli will be demonstrated in National Harbor, Maryland, over the next few months, with additional trials expected in Las Vegas and Miami.
Power

$30M Stampede 2 Supercomputer To Provide 18 Petaflops of Power To Researchers Nationwide (techcrunch.com) 44

An anonymous reader writes: Funded by grants from the National Science Foundation and built at the University of Texas at Austin, the Stampede 2 supercomputer looks to contend with the global supercomputer Top 5. With 18 petaflops of processing power, it aims to help any researcher with a problem requiring intense number crunching. For example, atomic and atmospheric science simulations would take years to work out on a desktop PC but only days on a supercomputer. Texas Advanced Computing Center director Dan Stanzione said in a UT press release, "Stampede has been used for everything from determining earthquake risks to help set building codes for homes and commercial buildings, to computing the largest mathematical proof ever constructed." The Stampede 2 is about twice as powerful as the original Stampede, which was activated in March of 2013. Instead of the 22nm fabrication tech in the original Stampede, the Stampede 2 will feature 14nm Xeon Phi chips codenamed "Knights Landing" with 72 cores, compared to the original system's 61 cores. With double the RAM, storage and data bandwidth, the Stampede 2 can shift up to 100 gigabits per second, and its DDR4 RAM can perform fast enough to work as a third-level cache as well as fulfill ordinary memory roles. In addition, it will feature 3D XPoint non-volatile memory. Since the project has only just received funding, it will be at least a year before the Stampede 2 is powered up.
AI

Tech CEOs Declare This the Era of Artificial Intelligence (fortune.com) 178

You will be hearing a lot about AI and machine learning in the coming years. At Recode's iconic conference this week, a number of top executives revealed -- and reiterated -- their growing efforts to capture the nascent technology category. From a Reuters report (condensed): Sundar Pichai, chief executive of Alphabet's Google, said he sees a "huge opportunity" in AI. Google first started applying the technology through "deep neural networks" to voice recognition software about three to four years ago and is ahead of rivals such as Amazon.com, Apple, and Microsoft in machine learning, Pichai said.
Amazon CEO Jeff Bezos predicted a profound impact on society over the next 20 years. "It's really early but I think we're on the edge of a golden era. It's going to be so exciting to see what happens," he said.
IBM CEO Ginni Rometty said the company has been working on artificial intelligence technology, which she calls a cognitive system, since 2005, when it started developing its Watson supercomputer.
Artificial intelligence and machine learning will create computers so sophisticated and godlike that humans will need to implant "neural laces" in their brains to keep up, Tesla Motors and SpaceX CEO Elon Musk told a crowd of tech leaders this week.
Microsoft, which was absent from the event, is also working on bots and AI technologies. One company that is seemingly out of the picture is Apple.
Education

Computer Generates Largest Math Proof Ever At 200TB of Data (phys.org) 143

An anonymous reader quotes a report from Phys.Org: A trio of researchers has solved a single math problem by using a supercomputer to grind through over a trillion color combination possibilities, and in the process has generated the largest math proof ever -- the text of it is 200 terabytes in size. The math problem has been named the boolean Pythagorean Triples problem and was first proposed back in the 1980s by mathematician Ronald Graham. Looking at the Pythagorean formula a^2 + b^2 = c^2, he asked whether it was possible to color each positive integer either blue or red such that no set of integers a, b and c satisfying the formula were all the same color. To solve this problem the researchers applied the Cube-and-Conquer paradigm, a hybrid SAT-solving approach for hard problems that combines look-ahead techniques with CDCL solvers. They also did some of the math on their own ahead of giving it over to the computer, using several techniques to pare down the number of choices the supercomputer would have to check to just one trillion (from 10^2,300). Still, the 800-processor supercomputer ran for two days to crunch its way through to a solution. After all its work, and spitting out the huge data file, the computer proof showed that yes, it was possible to color the integers in multiple allowable ways -- but only up to 7,824; after that point, the answer became no. Is the proof really a proof if it does not answer why there is a cut-off point at 7,825, or even why the first stretch is possible? Does it really exist?
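For readers curious what such a SAT formulation looks like, here is a minimal sketch in Python -- not the researchers' actual encoding or their Cube-and-Conquer pipeline, just an illustration of the idea: one Boolean variable per integer (true = red, false = blue), and two clauses per Pythagorean triple forbidding a monochromatic coloring, written out in the standard DIMACS CNF format that SAT solvers accept.

def pythagorean_triples(n_max):
    # Yield every triple (a, b, c) with a*a + b*b == c*c and c <= n_max.
    squares = {k * k: k for k in range(1, n_max + 1)}
    for a in range(1, n_max + 1):
        for b in range(a, n_max + 1):
            c = squares.get(a * a + b * b)
            if c is not None:
                yield a, b, c

def to_dimacs(n_max):
    # Variable n is true if integer n is colored red, false if blue.
    # Each triple contributes two clauses: "not all blue" and "not all red".
    clauses = []
    for a, b, c in pythagorean_triples(n_max):
        clauses.append("%d %d %d 0" % (a, b, c))      # at least one of a, b, c is red
        clauses.append("%d %d %d 0" % (-a, -b, -c))   # at least one of a, b, c is blue
    return "p cnf %d %d\n%s" % (n_max, len(clauses), "\n".join(clauses))

if __name__ == "__main__":
    # Tiny instances like this one are easily satisfiable; the formula only
    # becomes unsatisfiable once the integers reach 7,825.
    print(to_dimacs(25))

Feeding the output to any off-the-shelf SAT solver reproduces the small-scale behavior described above; the full 7,825 instance is what required the pruning techniques and the two-day run on 800 processors mentioned in the summary.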
Open Source

Infographic: Ubuntu Linux Is Everywhere 185

prisoninmate writes: To celebrate the launch of Ubuntu 16.04 LTS, due for release later this month, on April 21, Canonical put together an interesting infographic, showing the world how popular Ubuntu is. From the infographic, it looks like there are over 60 million Ubuntu images launched by Docker users, 14 million Vagrant images of Ubuntu 14.04 LTS from HashiCorp, 20 million launches of Ubuntu instances during 2015 in public and private clouds, as well as bare metal, and 2 million new Ubuntu Cloud instances launched in November 2015. Ubuntu is used on the International Space Station, on the servers of popular online services like Netflix, Snapchat, Pinterest, Reddit, Dropbox, PayPal, Wikipedia, and Instagram, and in self-driving car projects from Google, Tesla, George Hotz, and Uber. It is also employed at Bloomberg, Weta Digital and Walmart, at Brigham Young University to control its Mars Rover, and it is even behind the largest supercomputer in the world.
Education

Supercomputers Help Researchers Improve Severe Hail Storm Forecasts (nsf.gov) 23

aarondubrow writes: Researchers working on the Severe Hail Analysis, Representation and Prediction (SHARP) project at the University of Oklahoma used the Stampede supercomputer to gain a better understanding of the conditions that cause severe hail to form, and to produce hail forecasts with far greater accuracy than those currently used operationally. The model the team used has six times the resolution of the National Weather Service's highest-resolution forecasts and applies machine learning algorithms to improve its predictions. The researchers will publish their results in an upcoming issue of the American Meteorological Society journal Weather and Forecasting.
Hardware Hacking

Using Kexec To Start Linux On The PlayStation 4 70

jones_supa writes: Team fail0verflow, the hacker group that previously got Linux running on the Sony PlayStation 4, has introduced another method to start Linux on the game console. Instead of the previous exploit, which was based on a security hole in an old PS4 firmware version, the new trick uses a kexec call to start Linux from within Orbis OS (the FreeBSD-based system software of the PS4). The code can be found on GitHub. Maybe this will lead to more and better PlayStation clusters.
News

Biological Supercomputers Powered By ATP Could Be A Reality Some Day (dispatchtribunal.com) 66

hypnosec writes: Our cells are powered by adenosine triphosphate (ATP), and according to a new study, it could also be the power source for the next generation of biological supercomputers capable of processing information very quickly and accurately using parallel networks, in the same way that massive electronic supercomputers do. Published in the Proceedings of the National Academy of Sciences (PNAS), the paper describes a model of a biological computer that is effectively a very complex network in a very small area, based on a combination of geometrical modeling and engineering know-how (on the nano scale). Researchers involved with the study claim that it is the first step in showing that this kind of biological supercomputer can actually work.
AI

Marvin Minsky, Pioneer In Artificial Intelligence, Dies at 88 (nytimes.com) 76

An anonymous reader sends word that Marvin Lee Minsky, co-founder of the Massachusetts Institute of Technology's AI laboratory, has died. The Times reports: "Marvin Minsky, who combined a scientist’s thirst for knowledge with a philosopher’s quest for truth as a pioneering explorer of artificial intelligence, work that helped inspire the creation of the personal computer and the Internet, died on Sunday night in Boston. He was 88. Well before the advent of the microprocessor and the supercomputer, Professor Minsky, a revered computer science educator at M.I.T., laid the foundation for the field of artificial intelligence by demonstrating the possibilities of imparting common-sense reasoning to computers."
Supercomputing

Seymour Cray and the Development of Supercomputers (linuxvoice.com) 54

An anonymous reader writes: Linux Voice has a nice retrospective on the development of the Cray supercomputer. Quoting: "Firstly, within the CPU, there were multiple functional units (execution units forming discrete parts of the CPU) which could operate in parallel; so it could begin the next instruction while still computing the current one, as long as the current one wasn't required by the next. It also had an instruction cache of sorts to reduce the time the CPU spent waiting for the next instruction fetch result. Secondly, the CPU itself contained 10 parallel functional units (parallel processors, or PPs), so it could operate on ten different instructions simultaneously. This was unique for the time." They also discuss modern efforts to emulate the old Crays: "...what Chris wanted was real Cray-1 software: specifically, COS. Turns out, no one has it. He managed to track down a couple of disk packs (vast 10lb ones), but then had to get something to read them. In the end he used an impressive home-brew robot solution to map the information, but that still left deciphering it. A Norwegian coder, Yngve Ådlandsvik, managed to play with the data set enough to figure out the data format and other bits and pieces, and wrote a data recovery script."
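The overlap described in the first quote is easy to picture with a toy scheduling simulation. The Python sketch below is not a model of any real Cray or CDC machine -- the unit names, registers, and latencies are invented for illustration -- but it shows the basic rule: an independent instruction can start in another functional unit while a long operation is still in flight, whereas an instruction that reads the pending result has to wait.

def schedule(program):
    # program: list of (unit, dest_reg, source_regs, latency) tuples.
    # In-order issue: an instruction starts once its functional unit is free
    # and every source register it reads has been produced.
    reg_ready = {}   # register -> cycle its value becomes available
    unit_free = {}   # functional unit -> cycle it becomes free
    issue = 0        # earliest cycle the next instruction may issue
    timeline = []
    for unit, dest, sources, latency in program:
        start = max([issue, unit_free.get(unit, 0)] +
                    [reg_ready.get(r, 0) for r in sources])
        finish = start + latency
        reg_ready[dest] = finish
        unit_free[unit] = finish
        issue = start + 1    # the following instruction can issue a cycle later
        timeline.append((unit, dest, start, finish))
    return timeline

example = [
    ("multiply", "R1", ["R2", "R3"], 7),  # long-latency multiply
    ("add",      "R4", ["R5", "R6"], 3),  # independent add: overlaps with the multiply
    ("add",      "R7", ["R1", "R4"], 3),  # reads R1, so it must wait for the multiply
]

for unit, dest, start, finish in schedule(example):
    print("%-8s -> %s  cycles %d..%d" % (unit, dest, start, finish))

Running it shows the second instruction executing during cycles 1-4 while the multiply occupies cycles 0-7, and the dependent add stalling until cycle 7 -- the same "begin the next instruction while still computing the current one" behavior the article describes.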
Security

Quantum Computer Security? NASA Doesn't Want To Talk About It (csoonline.com) 86

itwbennett writes: At a press event at NASA's Advanced Supercomputer Facility in Silicon Valley on Tuesday, the agency was keen to talk about the capabilities of its D-Wave 2X quantum computer. 'Engineers from NASA and Google are using it to research a whole new area of computing — one that's years from commercialization but could revolutionize the way computers solve complex problems,' writes Martyn Williams. But when questions turned to the system's security, a NASA moderator quickly shut things down [VIDEO], saying the topic was 'for later discussion at another time.'
Intel

Intel Launches 72-Core Knights Landing Xeon Phi Supercomputer Chip (hothardware.com) 179

MojoKid writes: Intel announced a new version of their Xeon Phi line-up today, otherwise known as Knights Landing. Whatever you want to call it, the pre-production chip is a 72-core coprocessor solution manufactured on a 14nm process with 3D Tri-Gate transistors. The family of coprocessors is built around Intel's MIC (Many Integrated Core) architecture, which itself is part of a larger PCI-E add-in card solution for supercomputing applications. Knights Landing succeeds the current version of Xeon Phi, codenamed Knights Corner, which has up to 61 cores. The new Knights Landing chip ups the ante with double-precision performance exceeding 3 teraflops and over 8 teraflops of single-precision performance. It also has 16GB of on-package MCDRAM memory, which Intel says is five times more power efficient than GDDR5 and three times as dense.
Bitcoin

Immersion Cooling Drives Server Power Densities To Insane New Heights (datacenterfrontier.com) 80

1sockchuck writes: By immersing IT equipment in liquid coolant, a new data center is reaching extreme power densities of 250 kW per enclosure. At 40 megawatts, the data center is also taking immersion cooling to an entirely new scale, building on a much smaller proof-of-concept from a Hong Kong skyscraper. The facility is being built by Bitcoin specialist BitFury and reflects how the harsh economics of industrial mining have prompted cryptocurrency firms to focus on data center design to cut costs and boost power density. This type of radical energy efficiency may soon be key both to America's effort to build an exascale computer and to the increasingly extreme data-crunching requirements of cloud and analytics.
Biotech

Complex Living Brain Simulation Replicates Sensory Rat Behaviour (cell.com) 63

New submitter physick writes: The Blue Brain project at EPFL in Switzerland today published the results of more than 10 years of work in reconstructing a cellular model of a piece of the somatosensory cortex of a juvenile rat. The paper in Cell describes the process of painstakingly assembling tens of thousands of digital neurons, establishing the location of their synapses, and simulating the resulting neocortical microcircuit on an IBM Blue Gene supercomputer. “This is a first draft reconstruction of a piece of neocortex and it’s beautiful,” said Henry Markram, director of the Blue Brain Project at the Swiss Federal Institute of Technology in Lausanne. “It’s like a fundamental building block of the brain.”
IBM

IBM's Watson Is Now Analyzing Your Vacation Photos 117

jfruh writes: IBM's Jeopardy-winning supercomputer Watson is now a suite of cloud-based services that developers can use to add cognitive capabilities to applications, and one of its powers is visual analysis. Visual Insights analyzes images and videos posted to services like Twitter, Facebook and Instagram, then looks for patterns and trends in what people have been posting. Watson turns what it gleans into structured data, making it easier to load into a database and act upon — which is clearly appealing to marketers and just as clearly carries disturbing privacy implications.
