Operating Systems

ArcaOS (OS/2 Warp OEM) 5.1.1 Has Been Released (arcanoae.com) 88

"IBM stopped supporting OS/2 at the end of 2006," write the makers of ArcaOS, an OEM distribution of OS/2's discontinued Warp operating system.

And now long-time Slashdot reader martiniturbide tells us that ArcaOS 5.1.1 has been released, and that many of its components have been updated too. From this week's announcement: ArcaOS 5.1.1 continues to support installation on the latest generation of UEFI-based systems, as well as the ability to install to GPT-based disk layouts. This enables ArcaOS 5.1.1 to install on a wide array of modern hardware. Of course, ArcaOS 5.1.1 is just as much at home on traditional BIOS-based systems, offering enhanced stability and performance across both environments....

Need more convincing? How about a commercial operating system which doesn't spy on you, does not report your online activity to anyone, and gives you complete freedom to choose the applications you want to use, however you want to use them? How about an operating system which isn't tied to any specific hardware manufacturer, allowing you to choose the platform which is right for you, and fits perfectly well in systems with less than 4GB of memory or even virtual machines?

Red Hat Software

Red Hat Plans to Add AI to Fedora and GNOME 49

In his post about the future of Fedora Workstation, Christian F.K. Schaller discusses how the Red Hat team plans to integrate AI with IBM's open-source Granite engine to enhance developer tools, such as IDEs, and create an AI-powered Code Assistant. He says the team is also working on streamlining AI acceleration in Toolbx and ensuring Fedora users have access to tools like RamaLama. From the post: One big item on our list for the year is looking at ways Fedora Workstation can make use of artificial intelligence. Thanks to IBM's Granite effort we now have an AI engine that is available under proper open source licensing terms and which can be extended for many different use cases. The IBM Granite team also has an aggressive plan for releasing updated versions of Granite, incorporating new features of special interest to developers, like making Granite a great engine to power IDEs and similar tools. We've been brainstorming various ideas in the team for how we can make use of AI to provide improved or new features to users of GNOME and Fedora Workstation. This includes making sure Fedora Workstation users have access to great tools like RamaLama, that setting up accelerated AI inside Toolbx is simple, that we offer a good Code Assistant based on Granite, and that we come up with other cool integration points. "I'm still not sure how I feel about this approach," writes designer, developer, and blogger Bradley Taunt. "While IBM Granite is an open source model, I still don't enjoy so much artificial 'intelligence' creeping into core OS development. This also isn't something optional on the end-user's side, like a desktop feature or package. This sounds like it's going to be built directly into the core system."

"Red Hat has been pushing hard towards AI and my main concern is having this influence other operating system dev teams. Luckily things seem AI-free in BSD land. For now, at least."
Supercomputing

Quantum Computer Built On Server Racks Paves the Way To Bigger Machines (technologyreview.com) 27

An anonymous reader quotes a report from MIT Technology Review: A Canadian startup called Xanadu has built a new quantum computer it says can be easily scaled up to achieve the computational power needed to tackle scientific challenges ranging from drug discovery to more energy-efficient machine learning. Aurora is a "photonic" quantum computer, which means it crunches numbers using photonic qubits -- information encoded in light. In practice, this means combining and recombining laser beams on multiple chips using lenses, fibers, and other optics according to an algorithm. Xanadu's computer is designed in such a way that the answer to an algorithm it executes corresponds to the final number of photons in each laser beam. This approach differs from one used by Google and IBM, which involves encoding information in properties of superconducting circuits.

Aurora has a modular design that consists of four similar units, each installed in a standard server rack that is slightly taller and wider than the average human. To make a useful quantum computer, "you copy and paste a thousand of these things and network them together," says Christian Weedbrook, the CEO and founder of the company. Ultimately, Xanadu envisions a quantum computer as a specialized data center, consisting of rows upon rows of these servers. This contrasts with the industry's earlier conception of a specialized chip within a supercomputer, much like a GPU. [...]

Xanadu's 12 qubits may seem like a paltry number next to IBM's 1,121, but Tiwari says this doesn't mean that quantum computers based on photonics are running behind. In his opinion, the number of qubits reflects the amount of investment more than it does the technology's promise. [...] Xanadu's next goal is to improve the quality of the photons in the computer, which will ease the error correction requirements. "When you send lasers through a medium, whether it's free space, chips, or fiber optics, not all the information makes it from the start to the finish," he says. "So you're actually losing light and therefore losing information." The company is working to reduce this loss, which means fewer errors in the first place. Xanadu aims to build a quantum data center, with thousands of servers containing a million qubits, in 2029.
The company published its work on chip design optimization and fabrication in the journal Nature.
Oracle

Oracle Faces Java Customer Revolt After 'Predatory' Pricing Changes (theregister.com) 136

Nearly 90% of Oracle Java customers are looking to abandon the software maker's products following controversial licensing changes made in 2023, according to research firm Dimensional Research.

The exodus reflects growing frustration with Oracle's shift to per-employee pricing for its Java platform, a change critics called "predatory" and which, Gartner found, could increase costs up to five times for the same software. The dissatisfaction runs deepest in Europe, where 92% of French and 95% of German users want to switch to alternative providers like BellSoft Liberica, IBM Semeru, or Azul Platform Core.
Games

Complexity Physics Finds Crucial Tipping Points In Chess Games (arstechnica.com) 12

An anonymous reader quotes a report from Ars Technica: The game of chess has long been central to computer science and AI-related research, most notably in IBM's Deep Blue in the 1990s and, more recently, AlphaZero. But the game is about more than algorithms, according to Marc Barthelemy, a physicist at Paris-Saclay University in France, with layers of depth arising from the psychological complexity conferred by player strategies. Now, Barthelemy has taken things one step further by publishing a new paper in the journal Physical Review E that treats chess as a complex system, producing a handy metric that can help predict the proverbial "tipping points" in chess matches. [...]

For his analysis, Barthelemy chose to represent chess as a decision tree in which each "branch" leads to a win, loss, or draw. Players face the challenge of finding the best move amid all this complexity, particularly midgame, in order to steer gameplay into favorable branches. That's where those crucial tipping points come into play. Such positions are inherently unstable, which is why even a small mistake can have a dramatic influence on a match's trajectory. Barthelemy has re-imagined a chess match as a network of forces in which pieces act as the network's nodes, and the ways they interact represent the edges, using an interaction graph to capture how different pieces attack and defend one another. The most important chess pieces are those that interact with many other pieces in a given match, which he calculated by measuring how frequently a node lies on the shortest path between all the node pairs in the network (its "betweenness centrality").
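The "betweenness centrality" measure is easy to sketch. Below is a minimal pure-Python illustration on a toy graph (this is not Barthelemy's code, and the graph and node names are invented): a node's score counts, over every pair of other nodes, the fraction of shortest paths between that pair which pass through it.

```python
from collections import defaultdict, deque

def shortest_paths(graph, s, t):
    """Enumerate all shortest paths from s to t using BFS layers."""
    dist = {s: 0}
    preds = defaultdict(list)   # predecessors along shortest paths
    q = deque([s])
    while q:
        u = q.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
            if dist[v] == dist[u] + 1:
                preds[v].append(u)
    if t not in dist:
        return []
    paths = []
    def walk(node, suffix):
        if node == s:
            paths.append([s] + suffix)
            return
        for p in preds[node]:
            walk(p, [node] + suffix)
    walk(t, [])
    return paths

def betweenness(graph):
    """Count, for each node, the weighted number of shortest paths through it."""
    score = defaultdict(float)
    nodes = list(graph)
    for i, s in enumerate(nodes):
        for t in nodes[i + 1:]:
            paths = shortest_paths(graph, s, t)
            for path in paths:
                for node in path[1:-1]:     # interior nodes only
                    score[node] += 1 / len(paths)
    return dict(score)

# Toy "interaction graph": a hub node b connecting three leaves
graph = {"a": ["b"], "b": ["a", "c", "d"], "c": ["b"], "d": ["b"]}
print(betweenness(graph))  # b lies on every a-c, a-d, and c-d shortest path
```

In the toy graph the hub accumulates all the score, which is the intuition behind calling such pieces "the most important" in a given position.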

He also calculated so-called "fragility scores," which indicate how easy it is to remove those critical chess pieces from the board. And he was able to apply this analysis to more than 20,000 actual chess matches played by the world's top players over the last 200 years. Barthelemy found that his metric could indeed identify tipping points in specific matches. Furthermore, when he averaged his analysis over a large number of games, an unexpected universal pattern emerged. "We observe a surprising universality: the average fragility score is the same for all players and for all openings," Barthelemy writes. And in famous chess matches, "the maximum fragility often coincides with pivotal moments, characterized by brilliant moves that decisively shift the balance of the game." Specifically, fragility scores start to increase about eight moves before the critical tipping point position occurs and stay high for some 15 moves after that.
"These results suggest that positional fragility follows a common trajectory, with tension peaking in the middle game and dissipating toward the endgame," writes Barthelemy. "This analysis highlights the complex dynamics of chess, where the interaction between attack and defense shapes the game's overall structure."
Operating Systems

How the OS/2 Flop Went On To Shape Modern Software (theregister.com) 167

"It's fair to say that by 1995, OS/2 was dead software walking," remembers a new article from the Register (which begins with a 1995 Usenet post from Gordon Letwin, Microsoft's lead architect on the OS/2 project).

But the real question is why this Microsoft-IBM collaboration on a DOS-replacing operating system ultimately lost out to Windows...? If OS/2 1.0 had been an 80386 OS, and had been able to multitask DOS apps, we think it would have been a big hit.... OS/2's initial 1980s versions were 16-bit products, at IBM's insistence. That is when the war was lost. That is when OS/2 flopped. Because its initial versions were even more crippled than the Deskpro 386...

Because OS/2 1.x flopped, Microsoft launched a product that fixed the key weakness of OS/2 1.x. That product was Windows 3, which worked perfectly acceptably on 286 machines, but if you ran the same installed copy on a 32-bit 386 PC, it worked better. Windows 3.0 could use the more sophisticated hardware of a 386 to give better multitasking of the market-dominating DOS apps...

IBM's poor planning shaped the PC industry of the 1990s more than Microsoft's successes. Windows 3.0 wasn't great, but it was good enough. It reversed people's perception of Windows after the failures of Windows 1 and Windows 2. Windows 3 achieved what OS/2 had intended to do. It transformed IBM PC compatibles from single-tasking text-only computers into graphical computers, with poor but just about usable multitasking...

Soon after Windows 3.0 turned out to be a hit, OS/2 NT was rebranded as Windows NT. Even the most ardent Linux enthusiast must concede that Windows NT did quite well over three decades.

Back in 1995, the Register's author says they'd moved from OS/2 to Windows 95 "while it was still in beta."

"The UI was far superior, more hardware worked, and Doom ran much better."
IBM

IBM and GlobalFoundries Settle Multibillion-Dollar Trade Secret and Contract Lawsuits (theregister.com) 3

The Register's Jude Karabus reports: IBM and semiconductor maker GlobalFoundries have settled all of their litigation against each other, including breach of contract, patent, and trade secret suits, the pair say. The details of the settlement are confidential. All that both companies were prepared to say in yesterday's statements was that the deal they'd agreed would resolve "all litigation matters, inclusive of breach of contract, trade secrets, and intellectual property claims between the two companies." They added that the settlement would allow the companies to "explore new opportunities for collaboration in areas of mutual interest." In 2021, IBM sued GlobalFoundries for $2.5 billion, accusing it of failing to deliver on 10nm and 7nm chip production commitments, which disrupted IBM's hardware roadmap. GlobalFoundries countersued in 2023, alleging IBM misused trade secrets and poached engineers to support partnerships with Intel and Rapidus, potentially compromising proprietary technologies.
Printer

Xerox To Buy Printer Maker Lexmark From Chinese Owners in $1.5 Billion Deal (xerox.com) 30

Xerox has agreed to acquire printer maker Lexmark for $1.5 billion, bringing the Kentucky-based company back under U.S. ownership after seven years of Chinese control.

The deal, announced Monday, will be financed through cash and debt, creating a vertically integrated printing equipment manufacturer and service provider. Lexmark, spun off from IBM in 1991, was previously acquired by Chinese investors including Ninestar for $2.54 billion in 2016. The merger comes as Xerox faces declining equipment sales and a 50% year-to-date stock drop, with its market value at just over $1 billion.
Google

Scientific Breakthrough Gives New Hope To Building Quantum Computers (ft.com) 83

Google has achieved a major breakthrough in quantum error correction that could enable practical quantum computers by 2030, the company announced in a paper published Monday in Nature. The research demonstrated significant error reduction when scaling up from 3x3 to 7x7 grids of quantum bits, with errors dropping by half at each step. The advance addresses quantum computing's core challenge of maintaining stable quantum states, which typically last only microseconds.
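The scaling claim above (errors dropping by half with each step up in grid size) can be put into rough numbers. A hedged sketch: the 0.3% starting error rate and the distances beyond 7 are illustrative assumptions; only the factor-of-2 suppression per step comes from the reported result.

```python
def projected_logical_error(p_small, lam, steps):
    """Logical error rate after `steps` increases in code distance,
    assuming each step suppresses the error by the factor lam."""
    return p_small / lam ** steps

# Assumed starting point: 0.3% logical error on the 3x3 (distance-3) grid,
# with a suppression factor of lam = 2 per step, as described above.
p3 = 3e-3
for step, d in enumerate([3, 5, 7, 9, 11]):
    print(f"distance {d:2d}: ~{projected_logical_error(p3, 2, step):.2e}")
```

The point of the exponential form is that once the suppression factor exceeds 1, adding qubits buys reliability faster than it adds opportunities for error, which is what makes scaling toward a million-qubit machine plausible at all.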

Google's new quantum chip, manufactured in-house, maintains quantum states for nearly 100 microseconds -- five times longer than previous versions. The company aims to build a full-scale system with about 1 million qubits, projecting costs around $1 billion by decade's end.

IBM, Google's main rival, questioned the scalability of Google's "surface code" error correction approach, claiming it would require billions of qubits. IBM is pursuing an alternative three-dimensional design requiring new connector technology expected by 2026. The breakthrough parallels the first controlled nuclear chain reaction in 1942, according to MIT physics professor William Oliver, who noted that both achievements required years of engineering to realize theoretical predictions from decades earlier.

Further reading: Google: Meet Willow, our state-of-the-art quantum chip.
Programming

Stanford Research Reveals 9.5% of Software Engineers 'Do Virtually Nothing' (x.com) 237

A Stanford study of over 50,000 software engineers across hundreds of companies has found that approximately 9.5% of engineers perform minimal work while drawing full salaries, potentially costing tech companies billions annually.

The research showed the issue is most prevalent in remote work settings, where 14% of engineers were classified as "ghost engineers" compared to 6% of office-based staff. The study evaluated productivity through analysis of private Git repositories and simulated expert assessments of code commits.

Major tech companies could be significantly impacted, with IBM estimated to have 17,100 underperforming engineers at an annual cost of $2.5 billion. Across the global software industry, the researchers estimate the total cost of underperforming engineers could reach $90 billion, based on a conservative 6.5% rate of "ghost engineers" worldwide.
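A quick sanity check of the quoted IBM figure, using only the numbers in the summary above:

```python
# Inputs taken from the study's estimates as reported above
ibm_ghost_engineers = 17_100
ibm_annual_cost = 2.5e9

# Implied fully loaded cost per underperforming engineer
per_engineer = ibm_annual_cost / ibm_ghost_engineers
print(f"implied cost per engineer: ${per_engineer:,.0f}")
```

That works out to roughly $146,000 per engineer per year, a plausible fully loaded compensation figure, so the headline numbers are at least internally consistent.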
Supercomputing

IBM Boosts the Amount of Computation You Can Get Done On Quantum Hardware (arstechnica.com) 30

An anonymous reader quotes a report from Ars Technica: There's a general consensus that we won't be able to consistently perform sophisticated quantum calculations without the development of error-corrected quantum computing, which is unlikely to arrive until the end of the decade. It's still an open question, however, whether we could perform limited but useful calculations at an earlier point. IBM is one of the companies that's betting the answer is yes, and on Wednesday, it announced a series of developments aimed at making that possible. On their own, none of the changes being announced are revolutionary. But collectively, changes across the hardware and software stacks have produced much more efficient and less error-prone operations. The net result is a system that supports the most complicated calculations yet on IBM's hardware, leaving the company optimistic that its users will find some calculations where quantum hardware provides an advantage. [...]

Wednesday's announcement was based on the introduction of the second version of its Heron processor, which has 133 qubits. That's still beyond the capability of simulations on classical computers, should it be able to operate with sufficiently low errors. IBM VP Jay Gambetta told Ars that Revision 2 of Heron focused on getting rid of what are called TLS (two-level system) errors. "If you see this sort of defect, which can be a dipole or just some electronic structure that is caught on the surface, that is what we believe is limiting the coherence of our devices," Gambetta said. This happens because the defects can resonate at a frequency that interacts with a nearby qubit, causing the qubit to drop out of the quantum state needed to participate in calculations (called a loss of coherence). By making small adjustments to the frequency that the qubits are operating at, it's possible to avoid these problems. This can be done when the Heron chip is being calibrated before it's opened for general use.

Separately, the company has done a rewrite of the software that controls the system during operations. "After learning from the community, seeing how to run larger circuits, [we were able to] almost better define what it should be and rewrite the whole stack towards that," Gambetta said. The result is a dramatic speed-up. "Something that took 122 hours now is down to a couple of hours," he told Ars. Since people are paying for time on this hardware, that's good for customers now. However, it could also pay off in the longer run, as some errors can occur randomly, so less time spent on a calculation can mean fewer errors. Despite all those improvements, errors are still likely during any significant calculations. While it continues to work toward developing error-corrected qubits, IBM is focusing on what it calls error mitigation, which it first detailed last year. [...] The problem here is that using the function is computationally difficult, and the difficulty increases with the qubit count. So, while it's still easier to do error mitigation calculations than simulate the quantum computer's behavior on the same hardware, there's still the risk of it becoming computationally intractable. But IBM has also taken the time to optimize that, too. "They've got algorithmic improvements, and the method that uses tensor methods [now] uses the GPU," Gambetta told Ars. "So I think it's a combination of both."

Red Hat Software

Red Hat is Acquiring AI Optimization Startup Neural Magic (techcrunch.com) 4

Red Hat, the IBM-owned open source software firm, is acquiring Neural Magic, a startup that optimizes AI models to run faster on commodity processors and GPUs. From a report: The terms of the deal weren't disclosed. MIT research scientist Alex Matveev and professor Nir Shavit founded Somerville, Massachusetts-based Neural Magic in 2018, inspired by their work in high-performance execution engines for AI. Neural Magic's software aims to process AI workloads on processors and GPUs at speeds equivalent to specialized AI chips (e.g. TPUs). By running models on off-the-shelf processors, which usually have more available memory, the company's software can realize these performance gains.

Big tech companies like AMD and a host of other startups, including NeuReality, Deci, CoCoPie, OctoML and DeepCube, offer some sort of AI optimization software. But Neural Magic is one of the few with a free platform and a collection of open source tools to complement it. Neural Magic had so far managed to raise $50 million in venture capital from backers like Andreessen Horowitz, New Enterprise Associates, Amdocs, Comcast Ventures, Pillar VC and Ridgeline Ventures.

Security

Amazon Confirms Employee Data Stolen After Hacker Claims MOVEit Breach (techcrunch.com) 5

Amazon has confirmed that employee data was compromised after a "security event" at a third-party vendor. From a report: In a statement given to TechCrunch on Monday, Amazon spokesperson Adam Montgomery confirmed that employee information had been involved in a data breach. "Amazon and AWS systems remain secure, and we have not experienced a security event. We were notified about a security event at one of our property management vendors that impacted several of its customers including Amazon. The only Amazon information involved was employee work contact information, for example work email addresses, desk phone numbers, and building locations," Montgomery said.

Amazon declined to say how many employees were impacted by the breach. It noted that the unnamed third-party vendor doesn't have access to sensitive data such as Social Security numbers or financial information and said the vendor had fixed the security vulnerability responsible for the data breach. The confirmation comes after a threat actor claimed to have published data stolen from Amazon on notorious hacking site BreachForums. The individual claims to have more than 2.8 million lines of data, which they say was stolen during last year's mass-exploitation of MOVEit Transfer.

The Courts

IBM Sued Again In Storm Over Weather Channel Data Sharing (theregister.com) 20

IBM is facing a new lawsuit alleging that its Weather Channel website shared users' personal data with third-party ad partners without consent, violating the Video Privacy Protection Act (VPPA). The Register reports: In the absence of a comprehensive federal privacy law, the complaint [PDF] claims Big Blue violated America's Video Privacy Protection Act (VPPA), enacted in 1988 in response to the disclosure of Supreme Court nominee Robert Bork's videotape rental records. IBM was sued in 2019 (PDF) by then Los Angeles City Attorney Mike Feuer over similar allegations: That its Weather Channel mobile app collected and shared location data without disclosure. The IT titan settled that claim in 2020. A separate civil action against IBM's Weather Channel was filed in 2020 and settled in 2023 (PDF).

This latest legal salvo against alleged Weather Channel-enabled data collection takes issue with the sensitive information made available through the company's website to third-party ad partners mParticle and AppNexus/Xandr (acquired by Microsoft in 2022). The former provides customer analytics, and the latter is an advertising and marketing platform. The complaint, filed on behalf of California plaintiff Ed Penning, contends that when Penning watched videos on the Weather Channel website, those two marketing firms received his full name, gender, email address, precise geolocation, and the names and URLs of the videos he watched, without his permission or knowledge.

It explains that the plaintiff's counsel retained a private research firm last year to analyze browser network traffic during video sessions on the Weather Channel website. The research firm is said to have confirmed that the website provided the third-party ad firms with information that could be used to identify people and the videos that they watched. The VPPA prohibits video providers from sharing "personally identifiable information" about clients without their consent. [...] The lawsuit aspires to be certified as a class action. Under the VPPA, a successful claim allows for actual damages (if any) and statutory damages of $2,500 for each violation of the law, as well as attorney's fees.

The Internet

Ward Christensen, BBS Inventor and Architect of Our Online Age, Dies At Age 78 (arstechnica.com) 41

An anonymous reader quotes a report from Ars Technica: On Friday, Ward Christensen, co-inventor of the computer bulletin board system (BBS), died at age 78 in Rolling Meadows, Illinois. Christensen, along with Randy Suess, created the first BBS in Chicago in 1978, leading to an important cultural era of digital community-building that presaged much of our online world today. Friends and associates remember Christensen as humble and unassuming, a quiet innovator who never sought the spotlight for his groundbreaking work. Despite creating one of the foundational technologies of the digital age, Christensen maintained a low profile throughout his life, content with his long-standing career at IBM and showing no bitterness or sense of missed opportunity as the Internet age dawned.

"Ward was the quietest, pleasantest, gentlest dude," said BBS: The Documentary creator Jason Scott in a conversation with Ars Technica. Scott documented Christensen's work extensively in a 2002 interview for that project. "He was exactly like he looks in his pictures," he said, "like a groundskeeper who quietly tends the yard." Tech veteran Lauren Weinstein initially announced news of Christensen's passing on Sunday, and a close friend of Christensen's confirmed to Ars that Christensen died peacefully in his home. The cause of death has not yet been announced.

Prior to creating the first BBS, Christensen invented XMODEM, a 1977 file transfer protocol that made much of the later BBS world possible by breaking binary files into packets and ensuring that each packet was safely delivered over sometimes unstable and noisy analog telephone lines. It inspired other file transfer protocols that allowed ad-hoc online file sharing to flourish.
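XMODEM's packet format is simple enough to sketch in a few lines. The following is an illustrative Python reconstruction of the classic 128-byte checksum variant (SOH header byte, block number plus its one's complement, padded payload, 8-bit arithmetic checksum); it is a sketch of the framing, not Christensen's original code.

```python
SOH = 0x01  # start-of-header byte for a 128-byte XMODEM packet

def xmodem_packet(block_num, payload):
    """Frame one 128-byte XMODEM data packet (classic checksum variant)."""
    if len(payload) > 128:
        raise ValueError("XMODEM payloads are at most 128 bytes")
    data = payload.ljust(128, b"\x1a")   # pad short payloads with SUB bytes
    checksum = sum(data) & 0xFF          # 8-bit arithmetic checksum
    blk = block_num & 0xFF
    return bytes([SOH, blk, 0xFF - blk]) + data + bytes([checksum])

def verify_packet(packet):
    """Receiver-side check: header sanity, block-number complement, checksum."""
    if len(packet) != 132 or packet[0] != SOH:
        return False
    if packet[1] != 0xFF - packet[2]:
        return False
    return sum(packet[3:131]) & 0xFF == packet[131]

pkt = xmodem_packet(1, b"hello, BBS world")
print(verify_packet(pkt))  # True
```

On a noisy phone line, a receiver that found the checksum wrong would send NAK and the sender would retransmit the block, which is the "safely delivered" guarantee described above.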

Supercomputing

IBM Opens Its Quantum-Computing Stack To Third Parties (arstechnica.com) 7

An anonymous reader quotes a report from Ars Technica, written by John Timmer: [P]art of the software stack that companies are developing to control their quantum hardware includes software that converts abstract representations of quantum algorithms into the series of commands needed to execute them. IBM's version of this software is called Qiskit (although it was made open source and has since been adopted by other companies). Recently, IBM made a couple of announcements regarding Qiskit, both benchmarking it in comparison to other software stacks and opening it up to third-party modules. [...] Right now, the company is supporting six third-party Qiskit functions that break down into two categories.

The first can be used as stand-alone applications and are focused on providing solutions to problems for users who have no expertise programming quantum computers. One calculates the ground-state energy of molecules, and the second performs optimizations. But the remainder are focused on letting users get more out of existing quantum hardware, which tends to be error prone. Some errors occur more often than others, whether due to specific quirks of individual hardware qubits or because certain operations are simply more error prone. These can be handled in two different ways. One is to design the circuit being executed to avoid the situations that are most likely to produce an error. The second is to examine the final state of the algorithm to assess whether errors likely occurred and adjust to compensate for them. Third parties are providing software that can handle both of these.
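The first strategy, steering execution away from error-prone situations, can be illustrated with a toy example (the calibration numbers are invented, and a real transpiler such as Qiskit's also has to respect qubit connectivity, not just error rates):

```python
# Toy per-qubit error rates, as a stand-in for real calibration data
gate_error = {0: 0.015, 1: 0.004, 2: 0.022, 3: 0.006, 4: 0.009}

def pick_qubits(n, errors):
    """Greedy layout choice: map a circuit's n logical qubits onto the
    n physical qubits with the lowest reported error rates."""
    ranked = sorted(errors, key=errors.get)
    return ranked[:n]

print(pick_qubits(3, gate_error))  # qubits 1, 3, and 4 have the lowest error
```

Real layout and routing passes solve a much harder combinatorial problem, but the principle is the same: use what the calibration data says about the hardware to avoid its worst spots.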

One of those third parties is Q-CTRL, and we talked to its CEO, Michael Biercuk. "We build software that is really focused on everything from the lowest level of hardware manipulation, something that we call quantum firmware, up through compilation and strategies that help users map their problem onto what has to be executed on hardware," he told Ars. (Q-CTRL is also providing the optimization tool that's part of this Qiskit update.) "We're focused on suppressing errors everywhere that they can occur inside the processor," he continued. "That means the individual gate or logic operations, but it also means the execution of the circuit. There are some errors that only occur in the whole execution of a circuit as opposed to manipulating an individual quantum device." Biercuk said Q-CTRL's techniques are hardware agnostic and have been demonstrated on machines that use very different types of qubits, like trapped ions. While the sources of error on the different hardware may be distinct, the manifestations of those problems are often quite similar, making it easier for Q-CTRL's approach to work around the problems.

Those work-arounds include things like altering the properties of the microwave pulses that perform operations on IBM's hardware, and replacing the portion of Qiskit that converts an algorithm to a series of gate operations. The software will also perform operations that suppress errors that can occur when qubits are left idle during the circuit execution. As a result of all these differences, he claimed that using Q-CTRL's software allows the execution of more complex algorithms than are possible via Qiskit's default compilation and execution. "We've shown, for instance, optimization with all 156 qubits on [an IBM] system, and importantly -- I want to emphasize this word -- successful optimization," Biercuk told Ars. "What it means is you run it and you get the right answer, as opposed to I ran it and I kind of got close."

Businesses

As IBM Pushes For More Automation, Its AI Simply Not Up To the Job of Replacing Staff (theregister.com) 38

An anonymous reader shares a report: IBM's plan to replace thousands of roles with AI presently looks more like outsourcing jobs to India, at the expense of organizational competency. That view of Big Blue was offered to The Register after our report on the IT giant's latest layoffs, which resonated so strongly with several IBM employees that they contacted The Register with thoughts on the job cuts. Our sources have asked not to be identified to protect their ongoing relationships with Big Blue. Suffice to say they were or are employed as senior technologists in business units that span multiple locations and were privy to company communications: These are not views from the narrow entrance to a single cubicle. We're going to refer to three by the pseudonyms Alex, Blake, and Casey.

"I always make this joke about IBM," said Alex. "It is: 'IBM doesn't want people to work for them.' Every six months or so they are doing rounds of [Resource Actions -- IBM-speak for layoffs] or forcing folks into impossible moves, which result in separation." That's consistent with CEO Arvind Krishna's commitment last year to replace around 7,800 jobs with AI. But our sources say Krishna's plan is on shaky ground: IBM's AI isn't up to the job of replacing people, and some of the people who could fix that have been let go. Alex observed that over the past four years, IBM management has constantly pushed for automation and the use of AI. "With AI tools writing that code for us ... why pay for senior-level staff when you can promote a youngster who doesn't really know any better at a much lower price?" he said. "Plus, once you have a seasoned programmer write code that is by law the company's IP and it is fed into an AI library, it basically learns it and the author is no longer needed." But our sources tell us that scenario has yet to be realized inside IBM.

Privacy

Chinese Spies Spent Months Inside Aerospace Engineering Firm's Network Via Legacy IT (theregister.com) 16

The Register's Jessica Lyons reports: Chinese state-sponsored spies have been spotted inside a global engineering firm's network, having gained initial entry using an admin portal's default credentials on an IBM AIX server. In an exclusive interview with The Register, Binary Defense's Director of Security Research John Dwyer said the cyber snoops first compromised one of the victim's three unmanaged AIX servers in March, and remained inside the US-headquartered manufacturer's IT environment for four months while poking around for more boxes to commandeer. It's a tale that should be a warning to those with long- or almost-forgotten machines connected to their networks; those with shadow IT deployments; and those with unmanaged equipment. While the rest of your environment is protected by whatever threat detection you have in place, these legacy services are perfect starting points for miscreants.

This particular company, which Dwyer declined to name, makes components for public and private aerospace organizations and other critical sectors, including oil and gas. The intrusion has been attributed to an unnamed People's Republic of China team, whose motivation appears to be espionage and blueprint theft. It's worth noting the Feds have issued multiple security alerts this year about Beijing's spy crews including APT40 and Volt Typhoon, which has been accused of burrowing into American networks in preparation for destructive cyberattacks.

After discovering China's agents within its network in August, the manufacturer alerted local and federal law enforcement agencies and worked with government cybersecurity officials on attribution and mitigation, we're told. Binary Defense was also called in to investigate. Before being caught and subsequently booted off the network, the Chinese intruders uploaded a web shell and established persistent access, thus giving them full, remote access to the IT network -- putting the spies in a prime position for potential intellectual property theft and supply-chain manipulation. If a compromised component makes it out of the supply chain and into machinery in production, whoever is using that equipment or vehicle will end up feeling the brunt when that component fails, goes rogue, or goes awry.

"The scary side of it is: With our supply chain, we have an assumed risk chain, where whoever is consuming the final product -- whether it is the government, the US Department of the Defense, school systems â" assumes all of the risks of all the interconnected pieces of the supply chain," Dwyer told The Register. Plus, he added, adversarial nations are well aware of this, "and the attacks continually seem to be shifting left." That is to say, attempts to meddle with products are happening earlier and earlier in the supply-chain pipeline, thus affecting more and more victims and being more deep-rooted in systems. Breaking into a classified network to steal designs or cause trouble is not super easy. "But can I get into a piece of the supply chain at a manufacturing center that isn't beholden to the same standards and accomplish my goals and objectives?" Dwyer asked. The answer, of course, is yes. [...]

IBM

IBM is Quietly Axing Thousands of Jobs (theregister.com) 53

IBM has been laying off a substantial number of employees this week and is trying to keep it quiet, The Register reported Wednesday, citing its sources. From the report: One IBM employee told The Register that IBM Cloud experienced "a massive layoff" in the past few days that affected thousands of people. "Unlike traditional layoffs, this one was done in secret," the insider said. "My manager told me that they were required to sign an NDA not to talk about the specifics."

Multiple posts on layoff-focused message boards and corroborating accounts with other sources familiar with the IT giant's operations suggest the cuts are large. Asked to confirm the layoffs, an IBM spokesperson told The Register, "Early this year, IBM disclosed a workforce rebalancing charge that would represent a very low single digit percentage of IBM's global workforce, and we still expect to exit 2024 at roughly the same level of employment as we entered with."

IBM

IBM Acquires Kubernetes Cost Optimization Startup Kubecost (techcrunch.com) 9

IBM has acquired Kubecost, a FinOps startup that helps teams at companies like Allianz, Audi, Rakuten, and GitLab monitor and optimize their Kubernetes clusters with a focus on efficiency and, ultimately, cost. From a report: Tuesday's announcement follows IBM's $4.3 billion acquisition of Apptio in 2023, another company in the FinOps space. In previous years, we also saw IBM acquire companies like cloud app and network management firm Turbonomic and application performance management startup Instana. Now, with the acquisition of Kubecost, IBM continues its effort to bolster its IT and FinOps capabilities as enterprises look to better manage their increasingly complex cloud and on-prem infrastructure.
