Graphics

Players Seek 'No Man's Sky' Refunds, Sony's Content Director Calls Them Thieves (tweaktown.com) 467

thegarbz writes: As covered previously on Slashdot, the heavily hyped No Man's Sky launched to a wave of negative reviews citing game-crashing bugs and poor interface choices. Now that players have had more time with the game, it has become clear that many of the features touted by the developers simply aren't present, and users quickly began describing the game as "boring".

Now, likely because of the misleading advertising, Steam has begun allowing refunds for No Man's Sky regardless of playtime, and there are reports of players getting refunds on the PlayStation Network as well, despite Sony's strict no-refund policy.
Besides Sony, Amazon is also issuing refunds, according to gaming sites. In response, Sony's former Strategic Content Director, Shahid Kamal Ahmad, wrote on Twitter, "If you're getting a refund after playing a game for 50 hours you're a thief." He later added, "Here's the good news: Most players are not thieves. Most players are decent, honest people without whose support there could be no industry."

In a follow-up, he acknowledged it was fair to consider a few hours lost to game-breaking crashes, adding, "Each case should be considered on its own merits and perhaps I shouldn't be so unequivocal."
Businesses

Canon Unveils EOS 5D Mark IV DSLR (canonrumors.com) 160

It's been a little more than four years since Canon unveiled the EOS 5D Mark III. Today, Canon took the wraps off its successor -- the EOS 5D Mark IV. The Mark IV features a 30.4-megapixel, full-frame CMOS sensor and a DIGIC 6+ processor with support for capturing 4K video at 23.98, 24, 25 and 30 fps. In addition, it features a 61-point autofocus system, a built-in digital lens optimizer, NFC, Wi-Fi and an ISO range of 100-32,000. Continuous shooting is rated at 7 fps, up from 6 fps on the 5D Mark III. It also accepts both CompactFlash and SD cards, and GPS is built into the body for geotagging images. Canon will begin selling the Mark IV in early September for $3,499, body only. The company is also releasing two new L-series EF lenses -- the Canon EF 16-35mm f/2.8L III USM Ultra-Wide Zoom Lens and EF 24-105mm f/4L IS II USM Standard Zoom Lens. Yuichi Ishizuka, president and COO of Canon U.S.A., Inc., said in a statement: "Canon's EOS 5D series of DSLR cameras has a history of being at the forefront of still and video innovation. And today, we add to this family of cameras the EOS 5D Mark IV -- the first in our 5D series to offer 4K video and built-in Wi-Fi and NFC connectivity. In developing this new DSLR camera, we listened to the requests of current EOS users to create for them a modern, versatile camera designed to help them create and share beautiful still and video imagery." Here's a blast from the past: Canon's EOS 1Ds Mark II. Slashdot reader LoudMusic submitted this story back in 2004, highlighting the camera's "802.11a/g and wired networking capabilities."
Microsoft

Microsoft Details Its 24-Core 'Holographic Processor' Used In HoloLens (pcworld.com) 113

The processor powering Microsoft's HoloLens augmented reality headset has been a mystery -- until now. During the annual Hot Chips conference in Cupertino, California, Microsoft revealed some juicy details about the secretive chip. PCWorld reports: "The HoloLens' HPU is a custom 28nm coprocessor fabricated by TSMC, The Register reports. The chip packs 24 Tensilica digital signal processor (DSP) cores. As opposed to more general-purpose CPU cores, DSPs are a specialized technology designed for rapidly processing data flowing in from the world -- a no doubt invaluable asset while rendering augmented reality environments in real time. Microsoft's HPU also contains roughly 65 million logic gates, 8MB of SRAM, and 1GB of traditional DDR3 RAM. It draws less than 10W of power, and features PCIe and standard serial interfaces. The HPU's dedicated hardware is up to 200 times faster than performing the same calculations in software on the less-specialized 14nm Intel Cherry Trail CPU. Microsoft added custom instructions to the DSP cores that allow the HPU to churn through HoloLens-specific tasks even faster, The Register reports. The HPU can perform roughly 1 trillion calculations per second, and the data it passes to the CPU requires little additional processing."
Intel

Intel Demos Kaby Lake 7th Gen Core Series Running Overwatch At IDF (hothardware.com) 56

Reader MojoKid writes: Intel unveiled a number of new product innovations at IDF last week, but the company also stuck to its core product march by teasing its next-gen Core series processor. Kaby Lake is the follow-up to the current, 6th-generation Skylake-based Core processors. With Kaby Lake, Intel is adding native support for USB 3.1 Gen 2, along with a more powerful graphics architecture for improved 3D performance and 4K video processing. Kaby Lake will also bring native HDCP 2.2 support and hardware acceleration for HEVC Main10/10-bit and VP9 10-bit video decoding. To drive some of those points home, Intel showed off Overwatch running on a next-gen Dell XPS 13 built around a 7th Gen ULV Core i5 processor, in addition to an HP notebook smoothly playing back 4K HDR video. Kaby Lake 7th Generation Core-based products should start arriving on the market in the fall.
Sony

Sony To Debut Two New PlayStation 4 Consoles Next Month, Says WSJ (cnet.com) 86

An anonymous reader writes: Sony could be about to announce two new PlayStation 4 consoles, according to the Wall Street Journal. Both units are set to be introduced next month, people familiar with the matter told the newspaper. The upgraded console outlined by the company earlier this year, capable of outputting 4K-resolution graphics, could be joined by a slimmed-down, cheaper version of the console. That would give gamers options to suit their varying budgets and demands. Sony is set to hold a press conference at the PlayStation Theater in New York on September 7 where the company is expected to outline the consoles in further detail.
Displays

MIT Announces VR and AR Hackathon (uploadvr.com) 12

Calling it "A weekend that transforms the future of immersive technologies," MIT's Media Lab is hosting a big Augmented Reality/Virtual Reality hackathon. An anonymous Slashdot reader quotes this report from UploadVR: Game jams, hackathons, and meetups are more popular than ever in the budding VR and AR communities...to focus on creativity and functionality, rather than getting bogged down by polishing and prepping something for launch.

The MIT Media Lab is officially announcing its backing of the appropriately titled Reality, Virtually Hackathon. The hackathon is organized by a multitude of VR/AR experts, developers, industry executives, and MIT students, alumni, and Ph.D. candidates and will take place at the MIT campus.

Sponsors include Microsoft and the AT&T Developer Program, and applications for the hackathon are due by Wednesday, September 7, 2016. I'm wondering if any Slashdot readers have tried writing (or using) VR apps.
GUI

Fedora 25 To Run Wayland By Default Instead Of X.Org Server (phoronix.com) 151

An anonymous reader writes: Fedora 25 will be the first release of this Linux distribution -- and the first tier-one desktop Linux OS at large -- to use Wayland by default. Wayland has been talked about for years as a replacement for the X.Org Server, and with the upcoming Fedora 25 release that is finally expected to become a reality. The X.Org Server will still be present on Fedora systems for those running into driver problems or other common issues.
Fedora's steering committee agreed to the change provided the release notes "are clear about how to switch back to X11 if needed." In addition, according to the Fedora Project's wiki, "The code will automatically fall back to Xorg in cases where Wayland is unavailable (like NVIDIA)."
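For context on what the automatic fallback means for applications, here is a minimal sketch -- not Fedora's or GNOME's actual fallback logic, just an illustration -- of how a program can guess whether it is running under Wayland or X11 by inspecting the environment variables a typical Linux desktop session exposes:

```python
import os

def detect_display_server() -> str:
    """Best-effort guess at the running display server.

    Checks the environment variables a typical Linux desktop session sets:
    WAYLAND_DISPLAY for Wayland, DISPLAY for X11, with XDG_SESSION_TYPE
    used as a hint when those are ambiguous.
    """
    session_type = os.environ.get("XDG_SESSION_TYPE", "").lower()
    if os.environ.get("WAYLAND_DISPLAY") or session_type == "wayland":
        return "wayland"
    if os.environ.get("DISPLAY") or session_type == "x11":
        return "x11"
    return "unknown"

if __name__ == "__main__":
    print("Running under:", detect_display_server())
```

Run from a terminal inside a graphical session, it should print "wayland" on a default Fedora 25 desktop and "x11" after switching back to the X.Org session.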
Graphics

NVIDIA Drops Pascal Desktop GPUs Into Laptops With Mobile GeForce GTX 10-Series (hothardware.com) 42

MojoKid writes: NVIDIA's new Pascal graphics architecture is being rolled out across the company's entire product portfolio, as is typically the case. Today, NVIDIA brings Pascal to notebooks with the introduction of the NVIDIA Mobile GeForce GTX 10-Series. What's interesting is that the first laptop-targeted GPUs are actually quite similar to their desktop counterparts. In fact, all three of the Mobile GeForce GTX 10-Series graphics processors NVIDIA is announcing today come without the traditional "M" tacked onto the end of their model numbers. As it turns out, the migration to a 16nm manufacturing process with Pascal has been kind to NVIDIA, and the Mobile GeForce GTX 1080 and Mobile GeForce GTX 1060 have nearly identical specs to their desktop counterparts, from CUDA core counts to boost and memory clock speeds. The Mobile GeForce GTX 1070, meanwhile, actually has a few more CUDA cores -- 2048, versus 1920 for the desktop GTX 1070 -- with slightly lower clocks. By tweaking boost clock peaks and MXM module power requirements, NVIDIA was able to get these new Pascal mobile GPUs into desktop-replacement-class machines and even 5-pound, 15-inch-class standard notebook designs (for the 1060). In benchmarks, the new Mobile GeForce GTX 10-Series blows pretty much any previous discrete notebook graphics chip out of the water, and smooth 4K or 120Hz gaming is now possible on notebook platforms.
Intel

Intel Unveils Project Alloy 'Merged Reality' Wireless Headset (hothardware.com) 43

MojoKid writes: Intel CEO Brian Krzanich took to the stage at the Moscone Center in San Francisco today to kick off this year's Intel Developer Forum. Krzanich unveiled a number of new projects and products, including one code-named "Project Alloy." The device is an untethered, "merged reality" head-mounted device (HMD) that combines compute, graphics, multiple RealSense modules, various sensors, and batteries into a self-contained headset offering a full six degrees of freedom. Unlike the Oculus Rift and HTC Vive, Project Alloy does not need to be wired to a PC or other device, and it does not require externally mounted sensors to define a virtual space. Instead, it uses RealSense cameras to map the physical world around you while you wear the HMD. The RealSense cameras also allow the device to bring real-world objects into the virtual world, or vice versa. The cameras and sensors used in Project Alloy offer full depth sensing, so obstacles can be mapped, and people and objects within camera range -- like your hand, for example -- can be brought into the virtual world and accurately tracked. During a live, on-stage demo performed by Intel's Craig Raymond, Craig's hand was tracked and all five digits, complete with accurate bone and joint locations, were brought into the VR/AR experience. Project Alloy will be supported by Microsoft's Windows Holographic shell framework.
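As a rough, hypothetical illustration of the depth-based obstacle mapping described above -- this is not Intel's RealSense code, and the synthetic depth frame and 1.5-meter cutoff are invented for the example -- a few lines of NumPy are enough to show the basic idea of flagging anything the cameras see within arm's reach:

```python
import numpy as np

def obstacle_mask(depth_m: np.ndarray, max_range_m: float = 1.5) -> np.ndarray:
    """Flag pixels whose measured depth is closer than max_range_m.

    depth_m: 2D array of per-pixel distances in meters (0 = no reading).
    Returns a boolean mask marking nearby objects, e.g. a hand in front
    of the headset.
    """
    valid = depth_m > 0                      # ignore pixels with no depth reading
    return valid & (depth_m < max_range_m)   # anything closer than the cutoff

if __name__ == "__main__":
    # Synthetic 4x4 depth frame in meters; a real headset would stream
    # frames from its depth cameras instead.
    frame = np.array([[0.4, 0.5, 2.0, 2.1],
                      [0.6, 0.7, 2.2, 0.0],
                      [1.8, 1.9, 2.3, 2.4],
                      [1.7, 1.6, 2.5, 2.6]])
    print(obstacle_mask(frame).astype(int))
```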
Intel

Intel's Joule is Its Most Powerful Dev Kit Yet (engadget.com) 55

Devindra Hardawar, writing for Engadget: We've seen plenty of unique dev kits from Intel, including the SD-card-sized Edison, but not one as powerful as this. Intel announced Joule today, a tiny maker board that will allow developers to test RealSense-powered concepts and, hopefully, bring them to market faster than before. The company says the tiny, low-powered Joule would be ideal for testing concepts in robotics, AR, VR, industrial IoT and a slew of other industries. It also looks like it could be an interesting way for students to dabble in RealSense's depth-sensing technology in schools. There will be two Joule kits to choose from: the 550x, which includes a 1.5GHz quad-core Atom T5500 processor, 3GB of RAM and 8GB of storage; and the 570x, which packs a 1.7GHz quad-core Atom T5700 CPU (with burst speeds up to 2.4GHz), 4GB of RAM and 16GB of storage. Both models include "laptop-class" 802.11ac wireless, Intel graphics with 4K capture and display support, and a Linux-based OS.
Classic Games (Games)

Hacked Hobbit Pinball Machine Joins IoT, Broadcasts Itself Over Twitch (lachniet.com) 45

Random web surfers could send a text message or even upload an image to be displayed on the backglass of Mark Lachniet's pinball machine, according to Mael517, while the machine itself webcast footage of both its playing field and backglass over Twitch. Interestingly, all the extra functionality was coded directly into the machine, according to Lachniet, who added only the webcam and an Ethernet cord. As Lachniet puts it: "The Hobbit [machine] has a whole bunch of hardware that I don't really understand and can barely fix... However, it has a computer in its guts, and this I can mostly understand."
After identifying the pinball machine's motherboard, CPU, operating system (Ubuntu) and SQL database, Lachniet was able to back up its software and then create his own modifications. He envisions more possibilities -- for example, announcing high scores on social media accounts or allowing remote servicing of the machine. Lachniet even sees the possibility of a worldwide registry of pinball scores with each player's location overlaid on Google Maps, "so you could view pinball hot spots and where the high scores were coming from," and maybe even networking machines together to allow real-time global competition.
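To make the score-registry idea concrete, here is a purely hypothetical sketch of what pulling scores out of the machine's database and pushing them to a central web service might look like. The SQLite driver, table name, columns, and webhook URL are all assumptions for illustration, not details from Lachniet's write-up:

```python
import sqlite3
import requests  # third-party; pip install requests

DB_PATH = "/path/to/pinball.db"                     # hypothetical database file
WEBHOOK_URL = "https://example.com/pinball/scores"  # hypothetical registry endpoint

def fetch_top_scores(limit: int = 5):
    """Read the top entries from a hypothetical high_scores table."""
    with sqlite3.connect(DB_PATH) as conn:
        rows = conn.execute(
            "SELECT initials, score FROM high_scores ORDER BY score DESC LIMIT ?",
            (limit,),
        ).fetchall()
    return [{"initials": initials, "score": score} for initials, score in rows]

def post_scores(scores):
    """Send the scores to a remote registry (or a social-media webhook)."""
    resp = requests.post(WEBHOOK_URL, json={"machine": "The Hobbit", "scores": scores})
    resp.raise_for_status()

if __name__ == "__main__":
    post_scores(fetch_top_scores())
```

A cron job running something like this on the machine's onboard computer would be enough to feed a shared leaderboard, which is essentially the registry Lachniet describes.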
Windows

No Man's Sky Launches On Steam and GOG and It's Off To A Rocky Start (arstechnica.com) 157

An anonymous reader shares a report via Ars Technica: No Man's Sky, an indie "video game that promises 18 quintillion planets" from a "small development team," launched today for Windows PC gamers via Steam or GOG. Unfortunately, the "worldwide simultaneous launch on all kinds of PCs" is off to a rocky start -- as evidenced by the "mostly negative" Steam reviews. Many gamers have complained about frame rate hitches and total system crashes. Ars Technica reports: "Even users with high-end solutions like the GTX 1080 or two GTX 980Ti cards in SLI mode are reporting major stutters -- on a game that runs on a comparatively so-so PS4 console with a mostly consistent 30 FPS refresh. The game's PC version defaults to a 30 FPS cap, which can be disabled in the normal options menus. But with this setting turned on, the game can't help but hitch down to an apparent 20 FPS on a regular basis, not to mention throw up frequent display hitches of half a second at a time. Removing that frame rate cap can get play up to a smooth 60 frames per second, and we enjoyed more consistent frame rates without the cap. But even those frame rates can bounce down to 30 or less at random intervals. The game also suffers from freezing hitches, even without apparent spikes in visible geometry like creatures or spaceships." Ars also mentions that the on-screen prompts don't update to reflect remapped buttons. There's been some frustration among PC gamers who have had to learn the hard way that the game's floating-menu interface was built with joysticks in mind: mouse scroll wheels don't seem to work for scrolling through text and between menus, and players are required to hold-to-confirm every menu interaction in the game. What's more, alt-tabbing out of the game is a "guaranteed crash." For those looking for more information, The Atlantic has a captivating report describing the game as if it were a book.
Earth

Perseid Meteor Shower Peaks Tonight With Up To 200 Meteors Per Hour (latimes.com) 62

The Perseid meteor shower happens every year in August, but this year it will be especially spectacular, with twice as many shooting stars streaking across the night sky. The Los Angeles Times reports: "In past years, stargazers would have seen up to one meteor each minute, on average, in a very dark sky. But this year, there's even more reason to stay up late or crawl out of bed in the middle of the night. 'We're expecting 160 to 200 meteors per hour,' said Bill Cooke, head of NASA's Meteoroid Environment Office. This year's 'outburst' of shooting stars was set into motion more than a year ago, when Jupiter passed closer than usual to the stream of dusty debris left in the wake of the comet Swift-Tuttle. Jupiter's gravity field tugged a large clump of the tiny particles closer to Earth's eventual path. These intense displays happen once a decade or so, Cooke said. The next one won't be until 2027 or 2028." The best viewing will be away from city lights. Since it takes roughly 30-45 minutes for your eyes to adjust to the darkness, it's recommended that you avoid pulling out your smartphone or shining your flashlight around. The Los Angeles Times also has a neat infographic of the Perseid meteor shower.
Businesses

HPE Acquires SGI For $275 Million (venturebeat.com) 100

An anonymous reader writes: Hewlett Packard Enterprise announced today that it is acquiring SGI for $275 million in cash and debt. VentureBeat provides some backstory on the company that makes servers, storage, and software for high-end computing: "SGI (originally known as Silicon Graphics) was cofounded in 1981 by Jim Clark, who later cofounded Netscape with Marc Andreessen. It filed for Chapter 11 bankruptcy in 2009 after being de-listed from the New York Stock Exchange. In 2009 it was acquired by Rackable Systems, which later adopted the SGI branding. SGI's former campus in Mountain View, California, is now the site of the Googleplex. SGI, which is now based in Milpitas, California, brought in $533 million in revenue in its 2016 fiscal year and has 1,100 employees, according to the statement. HPE thinks buying SGI will be neutral in terms of its financial impact in the year after the deal is closed, which should happen in the first quarter of HPE's 2017 fiscal year, and later a catalyst for growth." HP split into two separate companies last year, betting that the smaller parts would be nimbler and more able to reverse four years of declining sales.
Operating Systems

Arch Linux Is Now Officially Powered by Linux Kernel 4.7, Update Your Systems 54

Marius Nestor, writing for Softpedia: A few weeks after its official release, it has finally happened: Linux kernel 4.7 has landed in the stable software repositories of the popular, lightweight and highly customizable Arch Linux operating system. Linux kernel 4.7 is the most advanced kernel branch, and only a few GNU/Linux distributions have adopted it since its launch on July 24, 2016. It is still marked as "mainline," not "stable" or "longterm," on the kernel.org website, which means it had not yet received a maintenance update at the time of writing. As for new features, Linux kernel 4.7 comes with an updated AMDGPU graphics driver supporting AMD Radeon RX 480 GPUs; LoadPin, a brand-new security module that ensures all modules loaded by the kernel originate from the same filesystem; and support for upgrading firmware via the EFI "Capsule" mechanism. Linux kernel 4.7 also marks the sync_file fencing mechanism used in the Android mobile operating system as stable and ready for production, implements support for generating virtual USB Device Controllers in USB/IP, supports parallel directory lookups, and introduces the "schedutil" frequency governor, which is faster and more accurate than existing governors.
Desktops (Apple)

Apple Said To Plan First Pro Laptop Overhaul in Four Years (bloomberg.com) 304

It's been a while since Apple upgraded most of its computer lineup. It has gotten to the point where observers are advising the Cupertino-based company to stop selling such dated inventory. But the wait will be over later this year, says Mark Gurman, the reporter with the best track record in Apple's ecosystem. Reporting for Bloomberg, Gurman says the company will overhaul its MacBook Pro laptop line for the first time in more than four years, packing it with a range of interesting features. From the report: The updated notebooks will be thinner, include a touch screen strip for function keys, and will be offered with more powerful and efficient graphics processors for expert users such as video gamers, said the people, who asked not to be named. The most significant addition to the new MacBook Pro is a secondary display above the keyboard that replaces the standard function key row. Instead of physical keys, a strip-like screen will present functions on an as-needed basis that fit the current task or application. The smaller display will use Organic Light-Emitting Diodes, a thinner, lighter and sharper screen technology, KGI Securities analyst Ming-Chi Kuo said earlier this year. Apple's goal with the dedicated function display is to simplify keyboard shortcuts traditionally used by experienced users. The panel will theoretically display media playback controls when iTunes is open, while it could display editing commands like cut and paste during word processing tasks, the people said. The display also allows Apple to add new buttons via software updates rather than through more expensive, slower hardware refreshes. [...] Apple is using one of AMD's "Polaris" graphics chips because the design offers the power efficiency and thinness necessary to fit inside the slimmer Apple notebook, the person said.
The Almighty Buck

They Quite Literally Don't Make Games the Way They Used To (theguardian.com) 158

The days of two developers making games in a shed are over, says an article in The Guardian. Spend any time with your grandparents and at some stage the age-old phrase "they don't make them like they used to" will pop up as nostalgia gets the better of them. Usually it's just the rose-tinted glasses talking, but for video games it's a fact: they quite literally don't make them like they used to. Back in the 1980s, when the industry was in its infancy, games were often created by two-person teams consisting of one programmer and one artist. In the 1990s, sprites gave way to 3D modelling, and development teams mushroomed in size, hoovering up specialists in disciplines across animation, level design, character modelling and artificial intelligence. Today, creating the most advanced, triple-A games has become too big a task for a single developer, leading to the rise of what is best described as a modular approach, where different developers work on different parts of a single game. The article adds: One developer pioneering the modern modular approach is no spring chicken. Set up in 1984, Newcastle-based Reflections swiftly established a reputation for bringing cutting-edge graphics to side-scrollers such as Shadow of the Beast and the gloriously named Brian the Lion. It then morphed into a driving-game specialist, thanks primarily to the Destruction Derby and Driver franchises. French publisher Ubisoft acquired the studio in 2006, expanding its remit way beyond its previous practice of churning out a new Driver game every three years or so. Reflections is crafting the vehicle components of the upcoming Watch Dogs 2 and Ghost Recon Wildlands and has just finished the Underground downloadable content (DLC) pack for The Division. It's also finishing Grow Up, the sequel to 2015's Grow Home -- ironically, a small, innovative download game made by a '90s-style 10-person team.
Media

NASA: Revolutionary Camera Recording Propulsion Data Completes Test (theverge.com) 81

An anonymous reader quotes a report from The Verge: NASA has created a camera that can film slow-motion footage of booming rocket engines with higher dynamic range than ever before. It's called the High Dynamic Range Stereo X camera, or HiDyRS-X (PDF), and late last week the agency released some of its footage to the public for the first time. The three-minute clip shows the most recent test of one of the boosters for NASA's upcoming Space Launch System rocket in unprecedented detail. SLS will use two of these 17-story-tall solid rocket boosters, each of which is capable of burning 5.5 tons of propellant per second to create 3.6 million pounds of thrust. The problem when it comes to filming tests like these (and eventually, launches) is that the plumes of fire they produce are extremely bright. This usually leaves camera operators with two choices: they can either expose for the bright plume, which leaves everything else in the shot looking dark and underexposed, or they can expose for everything else in the shot, which leaves the plume looking bright white and void of detail. The HiDyRS-X camera solves this problem by capturing all of this detail in one shot, and it does so in a fairly clever way: where regular high-speed cameras usually capture video one exposure at a time, HiDyRS-X can capture multiple exposures at a time. NASA did, however, report some problems with the test: the camera's automatic timer failed to go off, so it missed the booster's ignition, and the pressure generated by the booster knocked the camera's power source loose.
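To show what merging multiple exposures buys you, here is a generic exposure-fusion sketch in NumPy -- a simplified illustration assuming a linear sensor response, not NASA's actual HiDyRS-X processing:

```python
import numpy as np

def merge_exposures(frames, exposure_times):
    """Merge differently exposed frames of the same scene into one
    high-dynamic-range radiance estimate.

    frames: list of 2D arrays with pixel values in [0, 1] (linear response assumed).
    exposure_times: matching list of exposure times in seconds.
    Pixels near 0 (underexposed) or 1 (saturated) get low weight, so short
    exposures recover detail in the bright plume while long exposures
    recover the dark surroundings.
    """
    num = np.zeros_like(frames[0], dtype=np.float64)
    den = np.zeros_like(frames[0], dtype=np.float64)
    for img, t in zip(frames, exposure_times):
        w = 1.0 - np.abs(2.0 * img - 1.0)   # hat-shaped weight, peaks at mid-gray
        num += w * img / t                   # radiance estimate from this exposure
        den += w
    return num / np.maximum(den, 1e-8)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scene = rng.uniform(0.0, 1000.0, size=(4, 4))          # "true" radiance
    times = [1e-4, 1e-3, 1e-2]
    shots = [np.clip(scene * t, 0.0, 1.0) for t in times]  # simulated exposures
    print(merge_exposures(shots, times))
```

In the toy example, pixels that saturate in the long exposure are still recovered from the short one, which is the same property that lets the camera keep both the plume and the test stand visible.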
Displays

One Billion Monitors Vulnerable to Hijacking and Spying (vice.com) 157

"We can now hack the monitor and you shouldn't have blind trust in those pixels coming out of your monitor..." a security researcher tells Motherboard. "If you have a monitor, chances are your monitor is affected." An anonymous Slashdot reader quotes Motherboard's article: if a hacker can get you to visit a malicious website or click on a phishing link, they can then target the monitor's embedded computer, specifically its firmware...the computer that controls the menu to change brightness and other simple settings on the monitor. The hacker can then put an implant there programmed to wait...for commands sent over by a blinking pixel, which could be included in any video or a website. Essentially, that pixel is uploading code to the monitor. At that point, the hacker can mess with your monitor...

[T]his could be used both to spy on you and to show you stuff that's actually not there. A scenario where that could be dangerous is if hackers mess with the monitor displaying controls for a power plant, perhaps faking an emergency. The researchers warn that this is an issue that could potentially affect one billion monitors, given that the most common brands all have processors that are vulnerable...

"We now live in a world where you can't trust your monitor," one researcher told Motherboard, which added "we shouldn't consider monitors as untouchable, unhackable things."
Portables (Apple)

Apple Should Stop Selling Four-Year-Old Computers (theverge.com) 472

It's been a while since Apple upgraded its MacBook Air, MacBook Pro, and Mac Pro models -- four years, one month, and twenty-four days, to be exact, in the case of the MacBook Pro. Apple is inexplicably still selling the exact same Mac models it introduced in 2012. Pretty much every Windows OEM has been shipping Intel Skylake-powered laptops for more than a year now, but Apple's computing lineup still ships with three-to-four-year-old processors and graphics cards. Things have gotten so bad that MacRumors' Buying Guide, considered an "online institution" among Apple nerds, has flagged all of Apple's laptops as "Don't Buy." In a column, The Verge's Sam Byford says that Apple should stop selling the old laptops. He writes: Apple iterates quickly and consistently in mobile because the rate of technological progress is so much more dramatic in that arena. The company does amazing work to keep its iPhones and iPads ahead of competitors, performance-wise. Simple Intel processor upgrades are less important to laptops these days, however, and I'm finding this 2012 MacBook Pro fine to work from right now -- faster than my 2015 MacBook, at least, which is enough for my needs. But that doesn't mean it isn't unconscionable for Apple to continue to sell outdated products to people who may not know any better. Is the company really saving that much money by using 2012 processors and 4GB of RAM as standard? Even an update to Intel's Haswell chips from 2013 would have brought huge battery life improvements. Apple is bound by the whims of its suppliers to a certain extent, and it may not always make sense for the company to upgrade its products with every single new chip or GPU that comes out. But there's a certain point at which it just starts to look like absent-mindedness, and many Mac computers are well past that point now. [...] If Apple doesn't want to keep its products reasonably current, that's its prerogative. But if that truly is the case, maybe it shouldn't sell them at all. It's also ironic coming from a company whose executive not long ago made fun of people using five-year-old computers. The folks at the Accidental Tech Podcast also discussed this recently.
