Intel

Why Stacking Chips Like Pancakes Could Mean a Huge Leap for Laptops (cnet.com) 46

For decades, you could test a computer chip's mettle by how small and tightly packed its electronic circuitry was. Now Intel believes another dimension is as big a deal: how artfully a group of such chips can be packaged into a single, more powerful processor. From a report: At the Hot Chips conference Monday, Intel Chief Executive Pat Gelsinger will shine a spotlight on the company's packaging prowess. It's a crucial element to two new processors: Meteor Lake, a next-generation Core processor family member that'll power PCs in 2023, and Ponte Vecchio, the brains of what's expected to be the world's fastest supercomputer, Aurora.

"Meteor Lake will be a huge technical innovation," thanks to how it packages, said Real World Tech analyst David Kanter. For decades, staying on the cutting edge of chip progress meant miniaturizing chip circuitry. Chipmakers make that circuitry with a process called photolithography, using patterns of light to etch tiny on-off switches called transistors onto silicon wafers. The smaller the transistors, the more designers can add for new features like accelerators for graphics or artificial intelligence chores. Now Intel believes building these chiplets into a package will bring the same processing power boost as the traditional photolithography technique.

Facebook

After Mockery, Mark Zuckerberg Promises Better Metaverse Graphics, Posts New Avatar (cnn.com) 63

What do you do when people hate your $10 billion selfie? "Mark Zuckerberg, in response to a torrent of critical memes mocking the graphics of Meta's newest project, has heard his critics — and changed his selfie," reports CNN: Zuckerberg debuted Horizon Worlds, a virtual reality social app, in France and Spain earlier this week, sharing a somewhat flat, goofy digital avatar in front of an animated Eiffel Tower and la Sagrada Família.

The internet immediately jumped in, mocking what many users viewed as (hopefully) preliminary graphics for a venture on which Meta has spent at least $10 billion in the last year.

New York Times tech columnist Kevin Roose described the graphics on Twitter as "worse than a 2008 Wii game." Slate used the term "buttcheeks." Twitter was less kind, with "eye-gougingly ugly" and "an international laughing stock" popping up. Many compared it to early-'90s graphics and pointed out how lifeless and childish the Zuckerberg selfie looked. It quickly won the designation "dead eyes."

Well, Zuckerberg has apparently seen the memes, because on Friday he announced there are major updates coming — along with new avatar graphics.

A CNBC report on how Zuckerberg "is getting dragged on the internet for how ugly the graphics of this game are" also quoted a Forbes headline asking, "Does Mark Zuckerberg not understand how bad his metaverse is?"
Intel

Intel Drops DirectX 9 Support On Xe, Arc GPUs, Switches To DirectX 12 Emulation (tomshardware.com) 45

An anonymous reader quotes a report from Tom's Hardware: Native DX9 hardware support is officially gone from Intel's Xe integrated graphics solutions on 12th Gen CPUs and A-Series Arc Alchemist discrete GPUs. To replace it, all DirectX 9 support will be transferred to DirectX 12 in the form of emulation. Emulation will run on an open-source conversion layer known as "D3D9On12" from Microsoft. Conversion works by sending 3D DirectX 9 graphics commands to the D3D9On12 layer instead of the D3D9 graphics driver directly. Once the D3D9On12 layer receives commands from the D3D9 API, it will convert all commands into D3D12 API calls. So basically, D3D9On12 will act as a GPU driver all on its own, in place of Intel's actual GPU driver. Microsoft says this emulation process has become a relatively performant implementation of DirectX 9. As a result, performance should be nearly as good, if not just as good, as native DirectX 9 hardware support.
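
For a sense of what that kind of translation involves, here is a toy C sketch of the pattern: accept a call shaped like the old API and re-emit it as commands for the new one. All type and function names below are hypothetical illustrations of the idea, not Microsoft's actual D3D9On12 interfaces.

    /* Toy sketch of an API translation layer in the spirit of D3D9On12.
     * All names are hypothetical; this only illustrates the pattern of
     * accepting old-API calls and recording new-API commands. */
    #include <stdio.h>

    /* "Old API" style call: a single draw against implicit current state. */
    typedef struct {
        int vertex_count;
        int start_vertex;
    } OldDrawCall;

    /* "New API" style: explicit commands recorded into a command list. */
    typedef enum { CMD_SET_PIPELINE, CMD_DRAW } NewCmdKind;
    typedef struct {
        NewCmdKind kind;
        int arg0, arg1;
    } NewCommand;

    /* The translation layer: looks like an old-API driver entry point,
     * but records new-API commands instead of talking to hardware. */
    static int translate_draw(const OldDrawCall *call,
                              NewCommand *out, int capacity) {
        if (capacity < 2) return 0;
        /* Old fixed state is mapped onto an explicit pipeline object. */
        out[0] = (NewCommand){ CMD_SET_PIPELINE, 0, 0 };
        out[1] = (NewCommand){ CMD_DRAW, call->vertex_count, call->start_vertex };
        return 2;
    }

    int main(void) {
        OldDrawCall call = { .vertex_count = 36, .start_vertex = 0 };
        NewCommand cmds[4];
        int n = translate_draw(&call, cmds, 4);
        for (int i = 0; i < n; i++)
            printf("new-API command %d: kind=%d args=(%d,%d)\n",
                   i, cmds[i].kind, cmds[i].arg0, cmds[i].arg1);
        return 0;
    }
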
Open Source

NVIDIA Publishes 73k Lines Worth Of 3D Header Files For Fermi Through Ampere GPUs (phoronix.com) 6

In addition to NVIDIA being busy working on transitioning to an open-source GPU kernel driver, yesterday they made a rare public open-source documentation contribution... NVIDIA quietly published 73k lines worth of header files to document the 3D classes for their Fermi through current-generation Ampere GPUs. Phoronix's Michael Larabel reports: To NVIDIA's Open-GPU-Docs portal they have posted the 73k lines worth of 3D class header files covering RTX 30 "Ampere" GPUs back through the decade-old GeForce 400/500 "Fermi" graphics processors. These header files define the classes used to program the GPU's 3D engine, document the texture header and texture sampler layouts, and cover other 3D-related programming bits. Having all of these header files will be useful to the open-source Nouveau driver developers, saving them reverse-engineering work and uncertainty over certain bits.

NVIDIA's Open GPU Kernel Driver is only for the GeForce RTX 20 "Turing" series and newer, so it's great seeing NVIDIA now posting this documentation going back to Fermi, which is squarely aimed at helping the open-source community / Nouveau. [...] The timing of NVIDIA opening these 3D classes back to Fermi is interesting and potentially tied to SIGGRAPH 2022 happening this week. Those wanting to grab NVIDIA's latest open-source GPU documentation can find it via this GitHub repository.

Businesses

Crypto-Driven GPU Crash Makes Nvidia Miss Q2 Projections By $1.4 Billion (arstechnica.com) 46

In preliminary second-quarter financial results announced today, Nvidia's revenue is "down from a previously forecasted $8.1 billion, a miss of $1.4 billion," reports Ars Technica. "Nvidia blamed this shortfall on weaker-than-expected demand for its gaming products, including its GeForce graphics processors." The full results won't arrive until the end of the month. From the report: Nvidia pointed to "a reduction in channel partner sales," meaning that partners like Evga, MSI, Asus, Zotac, Gigabyte, and others were selling fewer new GPUs than anticipated. This drop can be attributed partly to a crash in the value of mining-based cryptocurrencies like Bitcoin and Ethereum -- fewer miners are buying these cards, and miners looking to unload their GPUs on the secondhand market are also giving gamers a cheaper source for graphics cards. "As we expect the macroeconomic conditions affecting sell-through to continue, we took actions with our Gaming partners to adjust channel prices and inventory," said Nvidia CEO Jensen Huang. That means we may see further price drops for existing GeForce GPUs, which have already been dropping in price throughout the year. Some cards still haven't reverted to their originally advertised prices, but they're getting closer all the time.

In better news for Nvidia, the small overall increase in revenue [$6.7 billion] is driven almost exclusively by the company's data center business, including GPU-accelerated AI and machine learning applications and GPU acceleration for cloud-hosted virtual machines. Nvidia's data center revenue is projected to be up 61 percent from last year, from $2.37 billion to $3.81 billion. Nvidia will supposedly launch its next-generation RTX 4000 series GPUs later this year. Based on the new Lovelace architecture, these GPUs may appeal to some gamers who originally sat out the RTX 3000 series due to shortages and inflated prices and are now avoiding the GPUs because they know a replacement is around the corner.

Intel

Intel Unveils Arc Pro GPUs (tomshardware.com) 23

Intel's Arc graphics cards aren't just for gamers, it seems, as the previously CPU-exclusive company has taken the lid off a new line of professional GPUs to complement the existing Arc line -- well, existing in China, maybe. From a report: The new cards are called Arc Pro, and target those who use their graphics cards for more than shooting bad guys. Maybe they won't be among the best graphics cards for gaming, but the AV1 encoding at least might get some takers. Intel today unveiled one mobile professional GPU, the A30M, and two desktop models: the single-slot A40 and double-slot A50. Both desktop cards are described as being for small form-factor machines, which makes us suspect Intel may have some much larger cards up its sleeve.

All the newly announced GPUs feature built-in ray tracing hardware, machine learning capabilities and industry-first AV1 hardware encoding acceleration. AV1, the royalty-free, open source alternative to HEVC backed by Google, hasn't gained a lot of traction on the web so far despite promises from Netflix and YouTube; its main use has been in Google's Duo video calling, even though it beats HEVC on compression quality. It's always been very slow to encode, however, so a good hardware accelerator and Intel's backing could see it take off.

GNU is Not Unix

There Were 19 New GNU Releases Last Month (fsf.org) 30

"Nineteen new GNU releases in the last month," reads a "July GNU Spotlight" announcement from the Free Software Foundation.

Here are (edited and condensed) descriptions of some of the highlights:
  • GNU Datamash (version 1.8) — a command-line program performing basic numeric, textual, and statistical operations on input textual data files (designed to work within standard pipelines).
  • GNUnet (version 0.17.2) — a framework for secure peer-to-peer networking. "The high-level goal is to provide a strong foundation of free software for a global, distributed network that provides security and privacy. GNUnet in that sense aims to replace the current internet protocol stack. Along with an application for secure publication of files, it has grown to include all kinds of basic applications for the foundation of a GNU internet."
  • GnuTLS (version 3.7.7) — A secure communications library implementing the SSL, TLS and DTLS protocols, provided in the form of a C library.
  • Jami (version 20220726.1515.da8d1da) — a GNU package for universal communication that respects the freedom and privacy of its users, using distributed hash tables for establishing communication. ("This avoids keeping centralized registries of users and storing personal data.")
  • GNU Nettle (version 3.8.1) — a low-level cryptographic library designed to fit easily into almost any context; it can be included in cryptographic toolkits for object-oriented languages or in applications themselves (a minimal usage sketch follows this list).
  • GNU Octave (version 7.2.0) — a high-level interpreted language specialized for numerical computations, for both linear and non-linear applications and with great support for visualizing results.
  • R (version 4.2.1) — a language and environment for statistical computing and graphics, along with robust support for producing publication-quality data plots. "A large amount of 3rd-party packages are available, greatly increasing its breadth and scope."
  • TRAMP (version 2.5.3) — a GNU Emacs package allowing you to access files on remote machines as though they were local files. "This includes editing files, performing version control tasks and modifying directory contents with dired. Access is performed via ssh, rsh, rlogin, telnet or other similar methods."
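
As a concrete taste of one of the items above, here is a minimal sketch of hashing a short message with GNU Nettle's SHA-256 interface from <nettle/sha2.h>. The message is an arbitrary example; build against libnettle (e.g. cc demo.c -lnettle).

    /* Minimal GNU Nettle usage sketch: SHA-256 of a short message. */
    #include <stdio.h>
    #include <string.h>
    #include <stdint.h>
    #include <nettle/sha2.h>

    int main(void) {
        const char *msg = "hello, nettle";      /* arbitrary example input */
        uint8_t digest[SHA256_DIGEST_SIZE];
        struct sha256_ctx ctx;

        sha256_init(&ctx);
        sha256_update(&ctx, strlen(msg), (const uint8_t *)msg);
        sha256_digest(&ctx, sizeof digest, digest);

        /* Print the digest as lowercase hex. */
        for (size_t i = 0; i < sizeof digest; i++)
            printf("%02x", digest[i]);
        printf("\n");
        return 0;
    }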

Click here to see the other new releases and download information.

The FSF announcement adds that "A number of GNU packages, as well as the GNU operating system as a whole, are looking for maintainers and other assistance."


Graphics

Raspberry Pi 4 Expands 3D Potential With Vulkan Update (arstechnica.com) 53

The Raspberry Pi 4 has hit a major graphics milestone, adding support for a more modern Vulkan 3D API. Ars Technica reports: Raspberry Pi CEO Eben Upton announced the Pi 4's Vulkan 1.2 conformance on Monday. Support isn't available yet in downloadable Pi-friendly operating systems but should be coming soon. For most people using their Pi as a server, a DIY controller, or a light desktop, Vulkan 1.2 conformance won't be noticeable. Desktop graphics on the standard Raspberry Pi OS are powered by OpenGL, the older graphics API that Vulkan is meant to replace. There is one group that benefits, says Upton: games and other 3D Android applications. Android uses Vulkan as its low-overhead graphics API.

As with most Raspberry Pi advancements, there could be unforeseen opportunities unleashed by this seemingly tiny change. Vulkan 1.2 support gives developers the same 3D-graphics interface (if not anywhere near the same power) as 2019 NVIDIA graphics cards, 2020 Intel chips with integrated graphics, and dozens of other devices. With a Vulkan 1.0 driver installed, developer Iago Toral was able in 2020 to get the original Quake trilogy mostly running on a Pi 4, with not-too-shabby frame rates.
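
In practice, "Vulkan 1.2 conformance" means the installed loader and driver report, and correctly implement, API version 1.2. A minimal, platform-agnostic check using the standard Vulkan C API might look like the sketch below; it assumes the Vulkan headers and a 1.1-or-newer loader are installed, and nothing in it is Raspberry Pi-specific.

    /* Query the Vulkan instance version exposed by the installed loader/driver.
     * Build with: cc vkver.c -lvulkan  (requires a Vulkan 1.1+ loader) */
    #include <stdio.h>
    #include <vulkan/vulkan.h>

    int main(void) {
        uint32_t version = VK_API_VERSION_1_0;
        if (vkEnumerateInstanceVersion(&version) != VK_SUCCESS) {
            printf("Could not query the Vulkan instance version\n");
            return 1;
        }
        printf("Vulkan instance version: %u.%u\n",
               VK_VERSION_MAJOR(version), VK_VERSION_MINOR(version));
        if (VK_VERSION_MAJOR(version) > 1 ||
            (VK_VERSION_MAJOR(version) == 1 && VK_VERSION_MINOR(version) >= 2))
            printf("Vulkan 1.2 (or newer) is available\n");
        return 0;
    }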

Cloud

Will the US Army, Not Meta, Build an 'Open' Metaverse? (venturebeat.com) 35

Just five weeks before his death in 2001, Douglas Adams made a mind-boggling pronouncement. "We are participating in a 3.5 billion-year program to turn dumb matter into smart matter..." He gave the keynote address for an embedded systems conference at San Francisco's Moscone Center... Adams dazzled the audience with a vision of a world where information devices are ultimately "as plentiful as chairs...." When the devices of the world were networked together, they could create a "soft earth" — a shared software model of the world assembled from all the bits of data. Communicating in real time, the soft earth would be alive and developing — and with the right instruments, humankind could just as easily tap into a soft solar system.
It's 21 years later, in a world where the long-time global software company Bohemia Interactive Simulations claims to be "at the forefront of simulation training solutions for defense and civilian organizations." Writing in VentureBeat, its chief commercial officer argues that "We do not yet have a shared imagination for the metaverse and the technology required to build it," complaining that big-tech companies "want to keep users reliant on their tech within a closed, commercialized ecosystem" and envisioning "an open virtual world that supports thousands of simultaneous players and offers valuable, immersive use cases." From the article:

The scope of this vision requires an open cloud architecture with native support for cloud scalability. By prioritizing cloud development and clear goal-setting, military organizations have taken significant leaps toward building an actual realization of this metaverse. In terms of industry progress towards the cloud-supported, scalable metaverse, no organization has come further than the U.S. Army.

Their Synthetic Training Environment (STE) has been in development since 2017. The STE aims to replace all legacy simulation programs and integrate different systems into a single, connected system for combined arms and joint training. The STE fundamentally differs from traditional, server-based approaches. For example, it will host a 1:1 digital twin of the Earth on a cloud architecture that will stream high fidelity (photo-realistic) terrain data to connected simulations. New terrain management platforms such as Mantle ETM will ensure that all connected systems operate on exactly the same terrain data. For example, trainees in a tank simulator will see the same trees, bushes and buildings as the pilot in a connected flight simulator, facilitating combined arms operations.

Cloud scalability (that is, scaling with available computational power) will allow for a better real-world representation of essential details such as population density and terrain complexity that traditional servers could not support. The ambition of STE is to automatically pull from available data resources to render millions of simulated entities, such as AI-based vehicles or pedestrians, all at once.... [D]evelopers are creating a high-fidelity, digital twin of the entire planet.

Commercial metaverses created for entertainment or commercial uses may not require an accurate representation of the earth.... Still, the military metaverse could be a microcosm of what may soon be a large-scale, open-source digital world that is not controlled or dominated by a few commercial entities....

STE success will pave the way for any cloud-based, open-source worlds that come after it, and will help prove that the metaverse's value extends far beyond that of a marketing gimmick.

Graphics

As Intel Gets Into Discrete GPUs, It Scales Back Support For Many Integrated GPUs (arstechnica.com) 47

An anonymous reader quotes a report from Ars Technica: Intel is slowly moving into the dedicated graphics market, and its graphics driver releases are looking a lot more like Nvidia's and AMD's than they used to. For its dedicated Arc GPUs and the architecturally similar integrated GPUs that ship with 11th- and 12th-generation Intel CPUs, the company promises monthly driver releases, along with "Day 0" drivers with specific fixes and performance enhancements for just-released games. At the same time, Intel's GPU driver updates are beginning to de-emphasize what used to be the company's bread and butter: low-end integrated GPUs. The company announced yesterday that it would be moving most of its integrated GPUs to a "legacy support model," which will provide quarterly updates to fix security issues and "critical" bugs but won't include the game-specific fixes that newer GPUs are getting.

The change affects a wide swath of GPUs, not all of them ancient history. Among others, it covers all integrated GPUs in the following processor generations, from the low-end unnumbered "HD/UHD Graphics" parts to the faster Intel Iris-branded versions:
  • 6th-generation Core (introduced 2015, codenamed Skylake)
  • 7th-generation Core (introduced 2016, codenamed Kaby Lake)
  • 8th-generation Core (introduced 2017-2018, codenamed Kaby Lake-R, Whiskey Lake, and Coffee Lake)
  • 9th-generation Core (introduced 2018, codenamed Coffee Lake)
  • 10th-generation Core (introduced 2019-2020, codenamed Comet Lake and Ice Lake)
  • various N4000, N5000, and N6000-series Celeron and Pentium CPUs (introduced 2017-2021, codenamed Gemini Lake, Elkhart Lake, and Jasper Lake)

Intel is still offering a single 1.1GB driver package that supports everything from its newest Iris Xe GPUs to Skylake-era integrated graphics. However, the install package now contains one driver for newer GPUs that are still getting new features and a second driver for older GPUs on the legacy support model. The company uses a similar approach for driver updates for its Wi-Fi adapters, including multiple driver versions in the same download package to support multiple generations of hardware.
"The upshot is that these GPUs' drivers are about as fast and well-optimized as they're going to get, and the hardware isn't powerful enough to play many of the newer games that Intel provides fixes for in new GPU drivers anyway," writes Ars Technica's Andrew Cunningham. "Practically speaking, losing out on a consistent stream of new gaming-centric driver updates is unlikely to impact the users of these GPUs much, especially since Intel will continue to fix problems as they occur."
Graphics

Coding Mistake Made Intel GPUs 100X Slower in Ray Tracing (tomshardware.com) 59

Intel Linux GPU driver developers have released an update that results in a massive 100X boost in ray tracing performance. This is something to be celebrated, of course. However, on the flip side, the driver was 100X slower than it should have been because of a memory allocation oversight. Tom's Hardware reports: Linux-centric news site Phoronix reports that a fix merged into the open-source Intel Mesa Vulkan driver was implemented by Intel Linux graphics driver engineering stalwart Lionel Landwerlin on Thursday. The developer wryly commented that the merge request, which already landed in Mesa 22.2, would deliver "Like a 100x (not joking) improvement." Intel has been working on Vulkan raytracing support since late 2020, but this fix is better late than never.

Normally, the Vulkan driver would ensure that temporary memory used for Vulkan ray tracing work sits in local memory, i.e., the very fast graphics memory onboard the discrete GPU. A line of code was missing, so this memory placement was never requested. As a result, the Vulkan driver would shuttle ray tracing data to slower offboard system memory and back, and those continual, convoluted transfers dragged ray tracing performance down significantly. It turns out, as per our headline, that setting the "ANV_BO_ALLOC_LOCAL_MEM" flag ensured that VRAM would be used instead, and a 100X performance boost was the result.
"Mesa 22.2, which includes the new code, is due to be branched in the coming days and will be included in a bundle of other driver refinements, which should reach end-users by the end of August," adds the report.
AMD

AMD Just Leaked Its Nvidia RTX Voice Competitor in a (Now Deleted) Video (theverge.com) 8

AMD looks to be on the cusp of releasing a competitor to RTX Voice, a feature for Nvidia graphics cards that cancels out background noise when you're on a call or otherwise using your mic. From a report: That's according to a trailer that AMD posted to its YouTube channel (apparently in error), Tom's Hardware reports. Thankfully, Reddit user u/zenobian downloaded a copy of the trailer before it was deleted and uploaded it to the AMD subreddit. The leaked trailer suggests that AMD's Noise Suppression feature will work very similarly to Nvidia's RTX Voice (which has subsequently been rolled into Nvidia's Broadcast app). It uses "a real-time deep learning algorithm" to offer "two-way noise-reduction" that filters background noise out of both outgoing and incoming microphone audio, and is apparently built into AMD's existing Adrenalin software.
Cloud

GeForce Now Rolling Out 120FPS Cloud Gaming To All Compatible Android Smartphones (9to5google.com) 15

Nvidia has just announced that GeForce Now is picking up support for 120fps gameplay on all Android smartphones, after previously limiting the functionality to only a few select models. 9to5Google reports: GeForce Now is a cloud gaming service that allows players to stream PC games from marketplaces such as Steam and the Epic Games Store, among others, to virtually any device. It's a great way to expand the gaming experience on your PC over to a mobile phone or your TV, or just to play games that your PC isn't powerful enough to run on its own. The service is free, but you can pay to get longer sessions and better quality.

Last year, the service picked up its RTX 3080 tier, which offers the power of the still-hard-to-find graphics card, but through the cloud. While it's a pricey option, it was quickly found to be the gold standard of cloud gaming thanks to minimal input latency, higher resolution, and faster refresh rate. It's that faster refresh rate that's boosting GeForce Now for Android players this week, with 120fps support expanding to all Android phones with high-refresh-rate displays. If your phone has a 120Hz display, you can now stream games at 120fps.
The official list of supported devices can be found here.

Nvidia says that the expanded support will arrive "over the coming weeks" and that the experience could vary from device to device.
Ubuntu

The Dell XPS Developer Edition Will Soon Arrive With Ubuntu Linux 22.04 (zdnet.com) 31

The Dell XPS 13 Plus Developer Edition with Ubuntu 22.04 Long Term Support (LTS) will arrive on August 23rd. "This means, of course, Canonical and Dell have officially certified it for Ubuntu 22.04 LTS," writes ZDNet's Steven Vaughan-Nichols. "So if you already have a current XPS 13 Plus, you can install Ubuntu 22.04 and automatically receive the same hardware-optimized experience that will ship with the new Developer Edition." From the report: What this certification means is that all of the XPS's components have been tested to deliver the best possible experience out of the box. Ubuntu-certified devices are based on Long Term Support (LTS) releases and therefore receive updates for up to 10 years. So if you actually still have an XPS 13 that came with Ubuntu back in the day, it's still supported today. [...] Dell and Canonical have been at this for years. Today's Dell Developer Editions are the official continuation of Project Sputnik. This initiative began 10 years ago to create high-end Dell systems with Ubuntu preinstalled. These were, and are, designed with programmer input and built for developers.

As Jaewook Woo, Dell's product manager, Linux, explained: "XPS is an innovation portal for Dell -- from its application of cutting-edge technology to experimentation of new user interfaces and experiential design. By bringing the enhanced performance and power management features of Ubuntu 22.04 LTS to our most advanced premium laptop, Dell and Canonical reinforce our joint commitment to continue delivering the best computing experience for developers using Ubuntu."

The forthcoming Dell XPS 13 Plus Developer Edition's specifications are impressive. The base configuration is powered by a 12th-generation Intel i5 1240P processor that runs up to 4.4GHz. For graphics, it uses Intel Iris Xe Graphics, which drives the 13.4-inch 1920x1200 60Hz display. For storage, it uses a 512GB SSD. The list price is $1,389.

Desktops (Apple)

Linux Distro For Apple Silicon Macs Is Already Up and Running On the Brand-New M2 (arstechnica.com) 129

An anonymous reader quotes a report from Ars Technica: Unlike Intel Macs, Apple silicon Macs were designed to run only Apple's software. But the developers on the Asahi Linux team have been working to change that, painstakingly reverse-engineering support for Apple's processors and other Mac hardware and releasing it as a work-in-progress distro that can actually boot up and run on bare metal, no virtualization required. The Asahi Linux team put out a new release today with plenty of additions and improvements. Most notably, the distro now supports the M1 Ultra and the Mac Studio and has added preliminary support for the M2 MacBook Pro (which has been tested firsthand by the team) and the M2 MacBook Air (which hasn't been tested but ought to work). Preliminary Bluetooth support for all Apple silicon Macs has also been added, though the team notes that it works poorly when connected to a 2.4GHz Wi-Fi network because "Wi-Fi/Bluetooth coexistence isn't properly configured yet."

There are still many other things that aren't working properly, including the USB-A ports on the Studio, faster-than-USB-2.0 speeds from any Type-C/Thunderbolt ports, and GPU acceleration, but progress is being made on all of those fronts. GPU work in particular is coming along, with a "prototype driver" that is "good enough to run real graphics applications and benchmarks" already up and running, though it's not included in this release. The Asahi team has said in the past that it expects support for new chips to be relatively easy to add to Asahi since Apple's chip designers frequently reuse things and don't make extensive hardware changes unless there's a good reason for it. Adding basic support for the M2 to Asahi happened over the course of a single 12-hour development session, and just "a few days" of additional effort were needed to get the rest of the hardware working as well as it does with M1-based Macs.

Graphics

SF Writer/Digital Art/NFT Pioneer Herbert W. Franke Dies at Age 95 (artnews.com) 20

On July 7th Art News explained how 95-year-old Austrian artist Herbert W. Franke "has recently become a sensation within the art world and the crypto space," describing the digital pioneer as a computer artist using algorithms and computer programs to visualize math as art. Last month, the physicist and science fiction writer was behind one of the most talked-about digital artworks at a booth by the blockchain company Tezos at Art Basel. Titled MONDRIAN (1979), the work paid tribute to artist Piet Mondrian's iconic geometric visuals using a program written on one of the first home computers.

Days before this, Franke, who studied physics in Vienna following World War II and started working at Siemens in 1953, where he conducted photographic experiments after office hours, launched 100 images from his famed series "Math Art" (1980-95) as NFTs on the Quantum platform. The drop was meant to commemorate his birthday on May 14 and to raise funds for his foundation. The NFTs sold out in 30 seconds, with the likes of pioneering blockchain artist Kevin Abosch purchasing a few.

In one of his last interviews, Franke told the site that blockchain "is a totally new environment, and this technology is still in its early stages, like at the beginning of computer art. But I am convinced that it has opened a new door for digital art and introduced the next generation to this new technology." It echoed something he'd said in his first book, published in 1957, which he later quoted in the interview (a full 65 years later). "Technology is usually dismissed as an element hostile to art. I want to try to prove that it is not..."

This morning, long-time Slashdot reader Qbertino wrote: The German IT news site heise reports (article in German) that digital art pioneer, SF author ("The Mind Net") and cyberspace avantgardist Herbert W. Franke has died at age 95. His wife recounted on his Twitter account: "Herbert loved to call himself the dinosaur of computer art. I am [...] devastated to announce that our beloved dinosaur has left the earth.

"He passed away knowing there is a community of artists and art enthusiasts deeply caring about his art and legacy."
Among much other pioneering work, he founded one of the world's first digital art festivals, "Ars Electronica," in Austria in 1979.

Franke's wife is still running the Art Meets Science web site dedicated to Franke's work. Some highlights from its biography of Franke's life: Herbert W. Franke, born in Vienna on May 14, 1927, studied physics and philosophy at the University of Vienna and received his doctorate in 1951... An Apple II, which he bought in 1980, was his first personal computer. He developed a program as early as 1982 that used a MIDI interface to control moving image sequences through music....

Only in recent years has "art from the machine" begun to interest traditional museums as a branch of modern art. Franke, who from the beginning was firmly convinced of the future importance of this art movement, has also assembled a collection of computer graphics that is unique in the world, documenting 50 years of this development with works by respected international artists, supplemented by his own works....

As a physicist with a talent for writing that became apparent early on, Franke was well suited to bringing science and technology to the general public in popular form. About one-third of his nearly fifty books, as well as uncounted journal articles...

Franke's novels and stories are not about predicting future technologies, nor about forecasting our future way of life, but rather about the intellectual examination of possible models of our future and their philosophical as well as ethical interpretation. In this context, however, Franke attaches great importance to the seriousness of scientific or technological assessments of the future in the sense of a feasibility analysis. In his opinion, a serious and meaningful discussion about future developments can basically only be conducted on this basis. In this respect, Franke is not a typical representative of science fiction, but rather a visionary who, as a novelist, deals with relevant questions of social future and human destiny on a high intellectual level.

Technology

Samsung Develops GDDR6 DRAM With 24Gbps Speed for Graphics Cards (zdnet.com) 20

Samsung said on Thursday that it has developed a new GDDR6 (graphics double data rate) DRAM with a data transfer rate of 24 gigabits per second (Gbps). From a report: A premium graphics card that packs the chips will support a data processing rate of up to 1.1 terabytes per second (TB/s), equivalent to processing 275 movies in Full HD resolution within a second, the South Korean tech giant said. Samsung said the DRAM was comprised of 16Gb chips using its third-generation 10nm process node, which also incorporates extreme ultraviolet (EUV) lithography during their production. The company also applied high-k metal gates, or the use of metals besides silicon dioxide to make the gate hold more charge, on the DRAM. Samsung said this allowed its latest DRAM to operate at a rate over 30% faster than its 18Gbps GDDR6 DRAM predecessor.
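
The roughly 1.1TB/s figure follows from multiplying the per-pin speed by the memory bus width. A quick back-of-the-envelope check, assuming a 384-bit bus (a typical width for a premium graphics card; the bus width is our assumption, not part of Samsung's announcement):

    /* Back-of-the-envelope check of the quoted ~1.1 TB/s figure.
     * The 384-bit bus width is an assumed example, not from Samsung. */
    #include <stdio.h>

    int main(void) {
        double gbps_per_pin = 24.0;   /* 24 Gbps per pin, as announced */
        int bus_width_bits  = 384;    /* assumed premium-card bus width */
        double gbytes_per_s = gbps_per_pin * bus_width_bits / 8.0;  /* bits -> bytes */
        printf("%.0f GB/s (~%.2f TB/s)\n", gbytes_per_s, gbytes_per_s / 1000.0);
        /* Prints: 1152 GB/s (~1.15 TB/s), in line with the ~1.1 TB/s claim. */
        return 0;
    }
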
Open Source

Gtk 5 Might Drop X.11 Support, Says GNOME Dev (theregister.com) 145

One of the GNOME developers has suggested that the next major release of Gtk could drop support for the X window system. The Register reports: Emmanuele Bassi opened a discussion last week on the GNOME project's Gitlab instance that asked whether the developers could drop X.11 support in the next release of Gtk. At this point, it is only a suggestion, but if it gets traction, this could significantly accelerate the move to the Wayland display server and the end of X.11.

Don't panic: Gtk 5 is not imminent. Gtk is a well-established toolkit, originally designed for the GIMP bitmap editing program back in 1998. Gtk 4 arrived relatively recently, shortly before the release of GNOME 40 in 2021. GNOME 40 has new user-interface guidelines, and as a part of this, Gtk 4 builds GNOME's Adwaita theme into the toolkit by means of the new libadwaita library, which is breaking the appearance of some existing apps.

Also, to be fair, as we recently covered, the X window system is very old now and isn't seeing major changes, although new releases of parts of it do still happen. This discussion is almost certain to get wildly contentious, and the thread on Gitlab has been closed to further comments for now. If this idea gains traction, one likely outcome might well be a fork of Gtk, just as happened when GNOME 3 came out. [...] A lot of the features of the current version, X.11, are no longer used or relevant to most users. Even so, X.12 is barely even in the planning stages yet.

Medicine

Smart Contact Lens Prototype Puts a Micro LED Display On Top of the Eye (arstechnica.com) 37

An anonymous reader quotes a report from Ars Technica: Since 2015, a California-based company called Mojo Vision has been developing smart contact lenses. Like smart glasses, the idea is to put helpful AR graphics in front of your eyes to help accomplish daily tasks. Now, a functioning prototype brings us closer to seeing a final product. In a blog post this week, Drew Perkins, the CEO of Mojo Vision, said he was the first to have an "on-eye demonstration of a feature-complete augmented reality smart contact lens." In an interview with CNET, he said he's been wearing only one contact at a time for hour-long durations. Eventually, Mojo Vision would like users to be able to wear two Mojo Lenses simultaneously and create 3D visual overlays, the publication said. According to his blog, the CEO could see a compass through the contact and an on-screen teleprompter with a quote written on it. He also told CNET he viewed a green, monochromatic image of Albert Einstein.

At the heart of the lens is an Arm M0 processor and a Micro LED display with 14,000 pixels per inch. It's just 0.02 inches (0.5 mm) in diameter with a 1.8-micron pixel pitch. Perkins claimed it's the "smallest and densest display ever created for dynamic content." Developing the contact overall included a focus on physics and electronics miniaturization, Perkins wrote. Mojo Lens developed its power management system with "medical-grade micro-batteries" and a proprietary power management integrated circuit. The Mojo Lens also uses a custom-configured magnetometer (CNET noted this drives the compass Perkins saw), accelerometer, and gyroscope for tracking. [...]
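
Those display figures are mutually consistent, as a quick back-of-the-envelope check shows: 14,000 pixels per inch works out to about a 1.8-micron pitch, and a 0.5 mm display at that pitch is roughly 275 pixels across.

    /* Sanity-check the published Mojo Lens display specs. */
    #include <stdio.h>

    int main(void) {
        double ppi = 14000.0;               /* pixels per inch, as quoted */
        double pitch_um = 25400.0 / ppi;    /* 1 inch = 25,400 microns */
        double diameter_mm = 0.5;           /* quoted display diameter */
        double pixels_across = diameter_mm * 1000.0 / pitch_um;
        printf("pixel pitch: %.2f um\n", pitch_um);      /* ~1.81 um, matching the 1.8-micron claim */
        printf("pixels across: %.0f\n", pixels_across);  /* ~276 pixels across the 0.5 mm display */
        return 0;
    }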

A contact lens sounds like it has the potential to be even more discreet than AR headgear posing as regular Ray-Bans. But the current prototype uses a "relay accessory," as Mojo Vision's rep put it, worn around the neck. It includes a processor, GPU, and 5 GHz radio for sending and receiving data to and from the lens. According to CNET, the accessory also sends information "back to computers that track the eye movement data for research." Perkins' blog said this tech required custom ASIC designs. [...] The current prototype also uses a hat with an integrated antenna for easier connecting, CNET reported; though, we'd expect this to be omitted from a final product.
"There's no firm release date for the Mojo Lens, which could be the first AR contact lens to reach consumers," adds Ars. "Near-term goals include getting potential partners, investors, and journalists to try the smart lens."
Programming

Are Today's Programmers Leaving Too Much Code Bloat? (positech.co.uk) 296

Long-time Slashdot reader Artem S. Tashkinov shares a blog post from an indie game programmer who complains "The special upload tool I had to use today was a total of 230MB of client files, and involved 2,700 different files to manage this process." Oh and BTW it gives error messages and right now, it doesn't work. sigh.

I've seen coders do this. I know how this happens. It happens because not only are the coders not doing low-level, efficient code to achieve their goal, they have never even SEEN low level, efficient, well written code. How can we expect them to do anything better when they do not even understand that it is possible...? It's what they learned. They have no idea what high performance or constraint-based development is....

Computers are so fast these days that you should be able to consider them absolute magic. Everything that you could possibly imagine should happen between the 60ths of a second of the refresh rate. And yet, when I click the volume icon on my microsoft surface laptop (pretty new), there is a VISIBLE DELAY as the machine gradually builds up a new user interface element, and eventually works out what icons to draw and has them pop-in and they go live. It takes ACTUAL TIME. I suspect a half second, which in CPU time, is like a billion fucking years....

All I'm doing is typing this blog post. Windows has 102 background processes running. My nvidia graphics card currently has 6 of them, and some of those have sub tasks. To do what? I'm not running a game right now, I'm using about the same feature set from a video card driver as I would have done TWENTY years ago, but 6 processes are required. Microsoft edge web view has 6 processes too, as does Microsoft edge too. I don't even use Microsoft edge. I think I opened an SVG file in it yesterday, and here we are, another 12 useless pieces of code wasting memory, and probably polling the cpu as well.

This is utter, utter madness. Its why nothing seems to work, why everything is slow, why you need a new phone every year, and a new TV to load those bloated streaming apps, that also must be running code this bad. I honestly think its only going to get worse, because the big dumb, useless tech companies like facebook, twitter, reddit, etc are the worst possible examples of this trend....

There was a golden age of programming, back when you had actual limitations on memory and CPU. Now we just live in an ultra-wasteful pit of inefficiency. Its just sad.

Long-time Slashdot reader Z00L00K left a comment arguing that "All this is because everyone today programs on huge frameworks that have everything including two full size kitchen sinks, one for right handed people and one for left handed." But in another comment Slashdot reader youn blames code generators, cut-and-paste programming, and the need to support multiple platforms.

But youn adds that even with that said, "In the old days, there was a lot more blue screens of death... Sure it still happens but how often do you restart your computer these days." And they also submitted this list arguing "There's a lot more functionality than before."
  • Some software has been around a long time. Even though the /. crowd likes to bash Windows, you got to admit backward compatibility is outstanding
  • A lot of things like security were not taken in consideration
  • It's a different computing environment.... multi tasking, internet, GPUs
  • In the old days, there was one task running all the time. Today, a lot of error handling, soft failures if the app is put to sleep
  • A lot of code is due to software interacting one with another, compatibility with standards
  • Shiny technology like microservices allow scaling, heterogenous integration

So who's right and who's wrong? Leave your own best answers in the comments.

And are today's programmers leaving too much code bloat?

