Programming

JetBrains to Reimagine IntelliJ as Text Editor, Add Machine Learning (devclass.com) 41

From a report: JetBrains has added further destinations to the IntelliJ-based roadmap it sketched out last year, promising more localization, machine learning and Git integration amongst a range of other goodies for the Java IDE...

The Prague-based firm's CTO Dmitry Jemerov said users had long asked to be able to use its IDEs for "general purpose text editing". While this is possible to some degree currently, in some situations it creates a temporary project file, leading to disk clutter and "other inconveniences". However, recent performance improvements mean "the possibility of using our IDEs as lightweight text editors has become more plausible, so we're now building a dedicated mode for editing non-project files. In this mode, the IDE will work more like a simple text editor." This will be faster, he promised, but the feature set will be very limited, and "you'll be able to easily switch to the full project mode if you need to use features such as refactoring or debugging."

Other upcoming features include more machine learning. Jemerov said this was already being used to improve code completion, but would now be rolled out for other completion features. "We're teaching ML completion to make better use of the context for ranking completion suggestions and to generate completion variants that go beyond a single identifier (full-line completion)". That might take a while, he said, but was a "major area where we are investing our efforts."

Open Source

Framework Developer 'Ragequits' Open Source Community, Citing Negative Comments, 'Very Few Provide Help' (theregister.co.uk) 122

The maintainer of the popular Rust web framework Actix has quit the project -- though he's backed off threats to make its code private and delete its repository, instead appointing a new maintainer. "Be a maintainer of large open source project is not a fun task," he'd complained last week on GitHub. "You alway face with rude and hate, everyone knows better how to build software, nobody wants to do home work and read docs and think a bit and very few provide any help...

"You felt betrayed after you put so much effort and then to hear all this shit comments, even if you understand that that is usual internet behavior.... Nowadays supporting actix project is not fun, and be[ing] part of rust community is not fun as well."

The Register reports: Actix Web was developed by Nikolay Kim, who is also a senior software engineer at Microsoft, though the Actix project is not an official Microsoft project. Actix Web is based on Actix, a framework for Rust based on the Actor model, also developed by Kim. The web framework is important to the Rust community partly because it addresses a common use case (developing web applications) and partly because of its outstanding performance. In some tests, Actix tops the TechEmpower benchmarks.

The project is open source and while it is popular, there has been some unhappiness among users about its use of "unsafe" code... Safe code is protected from common bugs (and more importantly, security vulnerabilities) arising from issues like variables which point to uninitialized memory, or variables which are used after the memory allocated to them has been freed, or attempting to write data to a variable which exceeds the memory allocated. Code in Rust is safe by default, but the language also supports unsafe code, which can be useful for interoperability or to improve performance.
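The trade-off described above can be sketched in a few lines. This is my own illustration, not code from Actix: safe Rust bounds-checks every slice access, while an `unsafe` block lets the programmer skip that check for speed, taking responsibility for the invariant himself.

```rust
// Illustration only -- not Actix code. Safe Rust checks every index;
// `get_unchecked` inside an `unsafe` block skips the bounds check,
// which is sound here only because the loop never exceeds v.len().
fn sum_unchecked(v: &[u32]) -> u32 {
    let mut total = 0;
    for i in 0..v.len() {
        // SAFETY: i < v.len() is guaranteed by the loop bound.
        total += unsafe { *v.get_unchecked(i) };
    }
    total
}

fn main() {
    assert_eq!(sum_unchecked(&[1, 2, 3, 4]), 10);
}
```

The debate in the Actix community was over exactly this kind of code: the compiler can no longer prove the access is in bounds, so a mistaken invariant becomes a memory-safety bug rather than a compile error.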

There is extensive use of unsafe code in Actix, leading to debate about what should be fixed. Kim was not always receptive to proposed changes... Kim said that he did not ignore or delete issues arbitrarily, but only because he felt he had a better or more creative solution than the one proposed -- while also acknowledging that "removing [the] issue was a stupid idea." He also threatened to "make [Actix] repos private and then delete them...." Since then, matters have improved. The GitHub repository was restored and Kim said, "I realized, a lot of people depend on actix. And it would be unfair to just delete repos... I hope new community of developers emerge. And good luck!"

The developer news site DevClass wrote that "The apparent 'ragequit' has prompted questions about the dynamics within the open source community." Over 120 GitHub users have now signed a sympathetic letter to Nikolay from "users, contributors, and followers of your work in the Rust community," saying "We are extremely disappointed at the level of abuse directed towards you."

"Working on open source projects should be rewarding, and your work has empowered thousands of developers across the world to build web services with Rust. It's incredibly tragic for someone who has contributed so much to the community, to be made to feel so unwelcome that they feel that they have no other choice than to leave. This is not the kind of community we want."
Programming

Jira Software Gets Better Roadmaps (techcrunch.com) 21

Atlassian today announced an update to Jira Software, its popular project and issue-tracking tool, that brings a number of major updates to the roadmapping feature it first introduced back in 2018. From a report: Back in 2018, Atlassian also launched its rebuilt version of Jira Software, which took some of its cues from Trello, and today's release builds upon this. "When we launched that new Jira experience back in October 2018, I think we had a really good idea of what we were trying to do with it and where we were taking it," said Jake Brereton, the head of marketing for Jira Software. "And I think if you fast-forward 14 months to where we are today, we just had some really strong validation in a number of areas that are over the target and that that investment we made was worth it."

With this release then, Jira Software's roadmapping tool is getting progress bars that show you the overall status of every roadmap item and that give you a lot more information about the overall state of the project at a glance. Also new here are hierarchy levels that let you unfold the roadmap item to get more in-depth information about the stories and tasks that are part of an item. You can also now map dependencies by simply dragging and dropping items, something that was missing from the first release but that was surely high on the list for many users. Atlassian is also introducing new filters and a number of UI enhancements.

Python

Chinese Academic Suspended After His 'Fully Independently Developed' Programming Language Found To Be Based on Python (ft.com) 107

One of China's top science research institutes has suspended an academic after finding that his "fully independently developed" programming language was based on a widely-used precursor, Python [Editor's note: the link may be paywalled; alternative source]. From a report: Liu Lei, a researcher at the Institute of Computing Technology (ICT) at the Chinese Academy of Sciences, announced last week that his research group had "independently" developed a new programming language, named Mulan after the legendary heroine, and touted as having "applications for artificial intelligence and the internet of things." Days later, Mr Liu wrote an apology to domestic media for "exaggerating" his achievements. Mr Liu admitted that Mulan was based on Python, a programming language whose components are freely available under an "open-source" licence, and that it was primarily designed for teaching programming to children, not for AI applications.
Programming

Are Software Designers Ignoring The Needs of the Elderly? (vortex.com) 205

"[A]t the very time that it's become increasingly difficult for anyone to conduct their day to day lives without using the Net, some categories of people are increasingly being treated badly by many software designers," argues long-time Slashdot reader Lauren Weinstein:
The victims of these attitudes include various special needs groups — visually and/or motor impaired are just two examples — but the elderly are a particular target. Working routinely with extremely elderly persons who are very active Internet users (including in their upper 90s!), I'm particularly sensitive to the difficulties that they face keeping their Net lifelines going. Often they're working on very old computers, without the resources (financial or human) to permit them to upgrade. They may still be running very old, admittedly risky OS versions and old browsers — Windows 7 is going to be used by many for years to come, despite hitting its official "end of life" for updates a few days ago.

Yet these elderly users are increasingly dependent on the Net to pay bills (more and more firms are making alternatives increasingly difficult and in some cases expensive), to stay in touch with friends and loved ones, and for many of the other routine purposes for which all of us now routinely depend on these technologies....

There's an aspect of this that is even worse. It's attitudes! It's the attitudes of many software designers that suggest they apparently really don't care about this class of users much — or at all. They design interfaces that are difficult for these users to navigate. Or in extreme cases, they simply drop support for many of these users entirely, by eliminating functionality that permits their old systems and old browsers to function.

He cites the example of Discourse, the open source internet forum software, which recently announced they'd stop supporting Internet Explorer. Weinstein himself hates Microsoft's browser: "Yet what of the users who don't understand how to upgrade? Who don't have anyone to help them upgrade? Are we to tell them that they matter not at all?"

So he confronted Stack Exchange co-founder Jeff Atwood (who is also one of the co-founders of Discourse) on Twitter — and eventually found himself blocked.

"Far more important though than this particular case is the attitude being expressed by so many in the software community, an attitude that suggests that many highly capable software engineers don't really appreciate these users and the kinds of problems that many of these users may have, that can prevent them from making even relatively simple changes or upgrades to their systems — which they need to keep using as much as anyone — in the real world."
Programming

Introducing JetBrains Mono, 'A Typeface for Developers' (jetbrains.com) 73

Long-time Slashdot reader destinyland writes:
JetBrains (which makes IDEs and other tools for developers and project managers) just open sourced a new "typeface for developers."

JetBrains Mono offers taller lowercase letters while keeping all letters "simple and free from unnecessary details... The easier the forms, the faster the eye perceives them and the less effort the brain needs to process them." There's a dot inside zeroes (but not in O's), and distinguishing marks have also been added to the lowercase L (to distinguish it from both 1's and a capital I). Even the shape of the comma has been made more angular so it's easier to distinguish from a period.

"The shape of ovals approaches that of rectangular symbols. This makes the whole pattern of the text more clear-cut," explains the font's web site. "The outer sides of ovals ensure there are no additional obstacles for your eyes as they scan the text vertically."

And one optional feature even merges multi-character sequences like -> and ++ into single combined symbols. (138 code-specific ligatures are included with the font.)

Math

Major Breakthrough In Quantum Computing Shows That MIP* = RE (arxiv.org) 28

Slashdot reader JoshuaZ writes:
In a major breakthrough in quantum computing, it was shown that MIP* equals RE. MIP* is the set of problems that can be efficiently demonstrated to a classical computer interacting with multiple quantum provers sharing any amount of entanglement. RE is the set of recursively enumerable problems: essentially, every problem whose "yes" answers can be confirmed by some computation, including the halting problem.

This result comes through years of deep development in the understanding of interactive protocols, where one entity, a verifier, has much less computing power than another set of entities, provers, who wish to convince the verifier of the truth of a claim. In 1990, a major result was that a classical computer with a polynomial amount of time could be convinced of any claim in PSPACE by interacting with an arbitrarily powerful classical computer. Here PSPACE is the set of problems solvable by a classical computer with a polynomial amount of space. Subsequent results showed that if one allowed the verifier to interact with multiple provers, it could be convinced of the solution to any problem in NEXPTIME, a class conjectured to be much larger than PSPACE. For a while, it was believed that in the quantum case the set of problems might actually be smaller, since multiple quantum computers might be able to use their shared entangled qubits to "cheat" the verifier. This has turned out to be not just false but the exact opposite of the truth: MIP* is not only large, it is about as large as a computable class can naturally be.
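The sequence of results described above can be summarized as a chain of complexity-class equalities (standard notation; the last line is the new result):

```latex
\begin{align*}
\mathsf{IP}    &= \mathsf{PSPACE}   && \text{(single prover, 1990)} \\
\mathsf{MIP}   &= \mathsf{NEXPTIME} && \text{(multiple classical provers)} \\
\mathsf{MIP}^* &= \mathsf{RE}       && \text{(multiple entangled provers; the new paper)}
\end{align*}
```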

This result, while a very big deal from a theoretical standpoint, is unlikely to have any immediate applications, since it supposes quantum computers with arbitrarily large amounts of computational power and infinite amounts of entanglement.

The paper in question is a 165-page tour de force, which incidentally also shows that the Connes embedding conjecture, a major 50-year-old conjecture from the theory of operator algebras, is false.

Electronic Frontier Foundation

EFF Files Amicus Brief In Google v. Oracle, Arguing APIs Are Not Copyrightable (eff.org) 147

Areyoukiddingme writes: EFF has filed an amicus brief with the U.S. Supreme Court in Google v. Oracle, arguing that APIs are not copyrightable. From the press release: "The Electronic Frontier Foundation (EFF) today asked the U.S. Supreme Court to rule that functional aspects of Oracle's Java programming language are not copyrightable, and even if they were, employing them to create new computer code falls under fair use protections. The court is reviewing a long-running lawsuit Oracle filed against Google, which claimed that Google's use of certain Java application programming interfaces (APIs) in its Android operating system violated Oracle's copyrights. The case has far-reaching implications for innovation in software development, competition, and interoperability.

In a brief filed today, EFF argues that the Federal Circuit, in ruling APIs were copyrightable, ignored clear and specific language in the copyright statute that excludes copyright protection for procedures, processes, and methods of operation. 'Instead of following the law, the Federal Circuit decided to rewrite it to eliminate almost all the exclusions from copyright protection that Congress put in the statute,' said EFF Legal Director Corynne McSherry. 'APIs are not copyrightable. The Federal Circuit's ruling has created a dangerous precedent that will encourage more lawsuits and make innovative software development prohibitively expensive. Fortunately, the Supreme Court can and should fix this mess.'" Oral arguments before the U.S. Supreme Court are scheduled for March 2020, and a decision by June.

Programming

'We're Approaching the Limits of Computer Power -- We Need New Programmers Now' (theguardian.com) 306

Ever-faster processors led to bloated software, but physical limits may force a return to the concise code of the past. John Naughton: Moore's law is just a statement of an empirical correlation observed over a particular period in history and we are reaching the limits of its application. In 2010, Moore himself predicted that the laws of physics would call a halt to the exponential increases. "In terms of size of transistor," he said, "you can see that we're approaching the size of atoms, which is a fundamental barrier, but it'll be two or three generations before we get that far -- but that's as far out as we've ever been able to see. We have another 10 to 20 years before we reach a fundamental limit." We've now reached 2020 and so the certainty that we will always have sufficiently powerful computing hardware for our expanding needs is beginning to look complacent. Since this has been obvious for decades to those in the business, there's been lots of research into ingenious ways of packing more computing power into machines, for example using multi-core architectures in which a CPU has two or more separate processing units called "cores" -- in the hope of postponing the awful day when the silicon chip finally runs out of road. (The new Apple Mac Pro, for example, is powered by a 28-core Intel Xeon processor.) And of course there is also a good deal of frenzied research into quantum computing, which could, in principle, be an epochal development.

But computing involves a combination of hardware and software and one of the predictable consequences of Moore's law is that it made programmers lazier. Writing software is a craft and some people are better at it than others. They write code that is more elegant and, more importantly, leaner, so that it executes faster. In the early days, when the hardware was relatively primitive, craftsmanship really mattered. When Bill Gates was a lad, for example, he wrote a Basic interpreter for one of the earliest microcomputers, the TRS-80. Because the machine had only a tiny read-only memory, Gates had to fit it into just 16 kilobytes. He wrote it in assembly language to increase efficiency and save space; there's a legend that for years afterwards he could recite the entire program by heart. There are thousands of stories like this from the early days of computing. But as Moore's law took hold, the need to write lean, parsimonious code gradually disappeared and incentives changed.

Programming

How Is Computer Programming Different Today Than 20 Years Ago? (medium.com) 325

This week a former engineer for the Microsoft Windows Core OS Division shared an insightful (and very entertaining) list with "some changes I have noticed over the last 20 years" in the computer programming world. Some excerpts: - Some programming concepts that were mostly theoretical 20 years ago have since made it to mainstream including many functional programming paradigms like immutability, tail recursion, lazily evaluated collections, pattern matching, first class functions and looking down upon anyone who doesn't use them...

- 3 billion devices run Java. That number hasn't changed in the last 10 years though...

- A package management ecosystem is essential for programming languages now. People simply don't want to go through the hassle of finding, downloading and installing libraries anymore. 20 years ago we used to visit web sites, downloaded zip files, copied them to correct locations, added them to the paths in the build configuration and prayed that they worked.

- Being a software development team now involves all team members performing a mysterious ritual of standing up together for 15 minutes in the morning and drawing occult symbols with post-its....

- Since we have much faster CPUs now, numerical calculations are done in Python which is much slower than Fortran. So numerical calculations basically take the same amount of time as they did 20 years ago...

- Even programming languages took a side on the debate on Tabs vs Spaces....

- Code must run behind at least three levels of virtualization now. Code that runs on bare metal is unnecessarily performant....

- A tutorial isn't really helpful if it's not a video recording that takes orders of magnitude longer to understand than its text.

- There is StackOverflow which simply didn't exist back then. Asking a programming question involved talking to your colleagues.

- People develop software on Macs.

In our new world where internet connectivity is the norm and being offline the exception, "Security is something we have to think about now... Because of side-channel attacks we can't even trust the physical processor anymore."

And of course, "We don't use IRC for communication anymore. We prefer a bloated version called Slack because we just didn't want to type in a server address...."
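The functional ideas in the first item above can be shown in a few lines. This is my own illustration in Rust (one of the languages where these paradigms went mainstream), not code from the original post:

```rust
// Illustration (not from the article): immutability, first-class
// functions, lazily evaluated collections, and pattern matching.
fn main() {
    let xs = vec![1, 2, 3, 4, 5]; // bindings are immutable by default

    // First-class function: a closure stored in a variable.
    let square = |x: i32| x * x;

    // Lazy collection: the iterator chain runs only when collected.
    let squares: Vec<i32> = xs.iter().map(|&x| square(x)).collect();
    assert_eq!(squares, vec![1, 4, 9, 16, 25]);

    // Pattern matching with exhaustive cases.
    let label = match squares.first() {
        Some(&1) => "starts with one",
        Some(_) => "starts with something else",
        None => "empty",
    };
    assert_eq!(label, "starts with one");
}
```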
Education

Are We Teaching Engineers the Wrong Way to Think? (zdnet.com) 125

Tech columnist Chris Matyszczyk summarizes the argument of four researchers who are warning about the perils of pure engineering thought: They write, politely: "Engineers enter the workforce with important analysis skills, but may struggle to 'think outside the box' when it comes to creative problem-solving." The academics blame the way engineers are educated.

They explain there are two sorts of thinking -- convergent and divergent. The former is the one with which engineers are most familiar. You make a list of steps to be taken to solve a problem and you take those steps. You expect a definite answer. Divergent thinking, however, requires many different ways of thinking about a problem and leads to many potential solutions. These academics declare emphatically: "Divergent thinking skills are largely ignored in engineering courses, which tend to focus on a linear progression of narrow, discipline-focused technical information."

Ah, that explains a lot, doesn't it? Indeed, these researchers insist that engineering students "become experts at working individually and applying a series of formulas and rules to structured problems with a 'right' answer."

Oddly, I know several people at Google just like that.

Fortunately, the researchers are also proposing this solution:

"While engineers need skills in analysis and judgment, they also need to cultivate an open, curious, and kind attitude, so they don't fixate on one particular approach and are able to consider new data."
Databases

'Top Programming Skills' List Shows Employers Want SQL (dice.com) 108

Former Slashdot contributor Nick Kolakowski is now a senior editor at Dice Insights, where he's just published a list of the top programming skills employers were looking for during the last 30 days.
If you're a software developer on the hunt for a new gig (or you're merely curious about what programming skills employers are looking for these days), one thing is clear: employers really, really, really want technologists who know how to build, maintain, and scale everything database- (and data-) related.

We've come to that conclusion after analyzing data about programming skills from Burning Glass, which collects and organizes millions of job postings from across the country.

The biggest takeaway? "When it comes to programming skills, employers are hungriest for SQL." Here's their ranking of the most in-demand skills:
  1. SQL
  2. Java
  3. "Software development"
  4. "Software engineering"
  5. Python
  6. JavaScript
  7. Linux
  8. Oracle
  9. C#
  10. Git

The list actually includes the top 18 programming skills, but besides languages like C++ and .NET, it also includes more generalized skills like "Agile development," "debugging," and "Unix."

But Nick concludes that "As a developer, if you've mastered database and data-analytics skills, that makes you insanely valuable to a whole range of companies out there."


Bug

This Year's Y2K20 Bug Came Directly From 'A Lazy Fix' to the Y2K Bug (newscientist.com) 160

Slashdot reader The8re still remembers the Y2K bug. Now he shares a New Scientist article explaining how it led directly to this year's Y2020 bug -- which affected more than just parking meters: WWE 2K20, a professional wrestling video game, also stopped working at midnight on 1 January 2020. Within 24 hours, the game's developers, 2K, issued a downloadable fix. Another piece of software, Splunk, which ironically looks for errors in computer systems, was found to be vulnerable to the Y2020 bug in November. The company rolled out a fix the same week to its users -- who include 92 of the Fortune 100, the top 100 companies in the US....

The Y2020 bug, which has taken many payment and computer systems offline, is a long-lingering side effect of attempts to fix the Y2K, or millennium, bug. Both stem from the way computers store dates. Many older systems express years using two digits -- 98, for instance, for 1998 -- in an effort to save memory. The Y2K bug was a fear that computers would treat 00 as 1900, rather than 2000. Programmers wanting to avoid the Y2K bug had two broad options: entirely rewrite their code, or adopt a quick fix called "windowing", which would treat all two-digit years from 00 to 19 as belonging to the 2000s rather than the 1900s. An estimated 80 percent of computers fixed in 1999 used the quicker, cheaper option. "Windowing, even during Y2K, was the worst of all possible solutions because it kicked the problem down the road," says Dylan Mulvin at the London School of Economics....
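A minimal sketch of the windowing scheme just described (my illustration, not any vendor's actual code; real systems varied in their choice of pivot year): two-digit years below the pivot are read as 20xx, everything else as 19xx.

```rust
// Sketch of "windowing". With a pivot of 20, the two-digit year "20"
// itself falls outside the window and is read as 1920 -- which is
// exactly the Y2020 bug described above.
fn window_year(two_digit: u32) -> u32 {
    const PIVOT: u32 = 20; // one common choice; other systems picked other pivots
    if two_digit < PIVOT {
        2000 + two_digit
    } else {
        1900 + two_digit
    }
}

fn main() {
    assert_eq!(window_year(98), 1998); // pre-2000 dates still work
    assert_eq!(window_year(5), 2005);  // the Y2K fix in action
    assert_eq!(window_year(20), 1920); // ...until 2020 arrives
}
```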

Another date storage problem also faces us in the year 2038. That issue stems from Unix epoch time: dates are stored as a signed 32-bit count of seconds since 1 January 1970, which will run out of capacity at 03:14 UTC on 19 January 2038.
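The arithmetic behind that 2038 limit is easy to check (this sketch is mine, not from the article): a signed 32-bit counter tops out at 2^31 - 1 seconds, and the next tick wraps around to the most negative value.

```rust
// The 2038 problem in a few assertions (illustration). A signed 32-bit
// seconds-since-1970 counter maxes out at 2^31 - 1 = 2,147,483,647,
// which lands in January 2038; one more second wraps to i32::MIN,
// i.e. a date back in 1901.
fn main() {
    let max_seconds: i64 = i32::MAX as i64;
    assert_eq!(max_seconds, 2_147_483_647);

    // Roughly 68 years' worth of seconds fits in the positive range.
    let years = max_seconds / (365 * 24 * 60 * 60);
    assert_eq!(years, 68);

    // One more second and the counter wraps to the most negative value.
    assert_eq!(i32::MAX.wrapping_add(1), i32::MIN);
}
```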

Stats

2019's Fastest Growing Programming Language Was C, Says TIOBE (tiobe.com) 106

Which programming language saw the biggest jump on TIOBE's index of language popularity over the last year?

Unlike last year -- it's not Python. An anonymous reader quotes TIOBE.com: It is good old language C that wins the award this time with a yearly increase of 2.4%... The major drivers behind this trend are the Internet of Things (IoT) and the vast amount of small intelligent devices that are released nowadays...

Runners up are C# (+2.1%), Python (+1.4%) and Swift (+0.6%)...

Other interesting winners of 2019 are Swift (from #15 to #9) and Ruby (from #18 to #11). Swift is a permanent top 10 player now and Ruby seems [destined] to become one soon.

Some languages that were supposed to break through in 2019 didn't: Rust won only 3 positions (from #33 to #30), Kotlin lost 3 positions (from #31 to #35), Julia even lost 10 positions (from #37 to #47) and TypeScript won just one position (from #49 to #48).

And here's the new top 10 programming languages right now, according to TIOBE's January 2020 index.
  • Java
  • C
  • Python
  • C++
  • C# (up two positions from January 2019)
  • Visual Basic .NET (down one position from January 2019)
  • JavaScript (down one position from January 2019)
  • PHP
  • Swift (up six positions from January 2019)
  • SQL (down one position from January 2019)

AI

Machines Are Learning To Write Poetry (newyorker.com) 46

Dan Rockmore, writing for New Yorker: There are more resonances between programming and poetry than you might think. Computer science is an art form of words and punctuation, thoughtfully placed and goal-oriented, even if not necessarily deployed to evoke surprise or longing. Laid out on a page, every program uses indentations, stanzas, and a distinctive visual hierarchy to convey meaning. In the best cases, a close-reader of code will be rewarded with a sense of awe for the way ideas have been captured in words. Programming has its own sense of minimalist aesthetics, born of the imperative to create software that doesn't take up much space and doesn't take long to execute. Coders seek to express their intentions in the fewest number of commands; William Carlos Williams, with his sparse style and simple, iconic images, would appreciate that. One poet's "road not taken" is one programmer's "if-then-else" statement. Generations of coders have taken their first steps by finding different ways to say "Hello, World." Arguably, you could say the same for poets.

Many programmers have links to poetry -- Ada Lovelace, the acknowledged first programmer ever, was Lord Byron's daughter -- but it's a challenge to fully bridge the gap. Sonnets occupy something of a sweet spot: they're a rich art form (good for poets) with clear rules (good for machines). Ranjit Bhatnagar, an artist and programmer, appreciates both sides. In 2012, he invented Pentametron, an art project that mines the Twittersphere for tweets in iambic pentameter. First, using a pronouncing dictionary created at Carnegie Mellon, he built a program to count syllables and recognize meter. Then, with a separate piece of code to identify rhymes, he started to assemble sonnets. For the first National Novel Generation Month (NaNoGenMo), in 2013, Bhatnagar submitted "I got a alligator for a pet!," a collection of five hundred and four sonnets created with Pentametron. Bhatnagar's code required that each line be an entire tweet, or essentially one complete thought (or at least what counts as a thought on Twitter). It also did its best to abide by strict rules of meter and rhyme.
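Counting syllables is the crux of a project like Pentametron. Here is a deliberately crude sketch of the idea -- my own illustration in Rust, not Bhatnagar's code; the real project used the CMU pronouncing dictionary, which is far more accurate than any spelling heuristic:

```rust
// Toy syllable counter (illustration): count runs of vowel letters.
// English spelling defeats this heuristic often; Pentametron relied
// on the CMU pronouncing dictionary instead.
fn rough_syllables(word: &str) -> usize {
    let vowels = "aeiouy";
    let mut count = 0;
    let mut in_run = false;
    for c in word.to_lowercase().chars() {
        let is_vowel = vowels.contains(c);
        if is_vowel && !in_run {
            count += 1; // a new vowel run begins: one more syllable
        }
        in_run = is_vowel;
    }
    count.max(1) // every word gets at least one syllable
}

fn main() {
    assert_eq!(rough_syllables("alligator"), 4);
    // A line is a pentameter candidate if its words total ten syllables.
    let line = "I got a alligator for a pet";
    let total: usize = line.split_whitespace().map(rough_syllables).sum();
    assert_eq!(total, 10);
}
```

A real meter check also needs stress patterns (the "iambic" part), which is exactly what a pronouncing dictionary provides and a spelling heuristic cannot.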

Security

Hundreds of Millions of Cable Modems Are Vulnerable To New Cable Haunt Vulnerability (zdnet.com) 26

A team of four Danish security researchers has disclosed this week a security flaw that impacts cable modems that use Broadcom chips. From a report: The vulnerability, codenamed Cable Haunt, is believed to impact an estimated 200 million cable modems in Europe alone, the research team said today. The vulnerability impacts a standard component of Broadcom chips called a spectrum analyzer. This is a hardware and software component that protects the cable modem from signal surges and disturbances coming via the coax cable. The component is often used by internet service providers (ISPs) in debugging connection quality. On most cable modems, access to this component is limited for connections from the internal network. The research team says the Broadcom chip spectrum analyzer lacks protection against DNS rebinding attacks, uses default credentials, and also contains a programming error in its firmware.
Google

Chrome OS Has Stalled Out 112

Speaking of Chromebooks, David Ruddock opines at AndroidPolice: Chrome OS' problems really became apparent to me when Android app compatibility was introduced, around five years ago. Getting Android apps to run on Chrome OS was simultaneously one of the Chrome team's greatest achievements and one of its worst mistakes. In 2019, two things are more obvious than ever about the Android app situation on Chrome. The first is that the "build it and they will come" mantra never panned out. Developers never created an appreciable number of Android app experiences designed for Chrome (just as they never did for Android tablets). The second is that, quite frankly, Android apps are very bad on Chrome OS. Performance is highly variable, and interface bugs are basically unending because most of those apps were never designed for a point-and-click operating system. Sure, they crash less often than they did in the early days, but anyone saying that Android apps on Chrome OS are a good experience is delusional.

Those apps are also a crutch that Chrome leans on to this day. Chrome OS doesn't have a robust photo editor? Don't worry, you can download an app! Chrome doesn't have native integration with cloud file services like Box, Dropbox, or OneDrive? Just download the app! Chrome doesn't have Microsoft Office? App! But this "solution" has basically become an insult to Chrome's users, forcing them to live inside a half-baked Android environment using apps that were almost exclusively designed for 6" touchscreens, and which exist in a containerized state that effectively firewalls them from much of the Chrome operating system. As a result, file handling is a nightmare, with only a very limited number of folders accessible to those applications, and the task of finding them from inside those apps a labyrinthine exercise no one should have to endure in 2019. This isn't a tenable state of affairs -- it's computing barbarism as far as I'm concerned. And yet, I've seen zero evidence that the Chrome team intends to fix it. It's just how it is. But Android apps, so far as I can tell, are basically the plan for Chrome. Certainly, Linux environment support is great for enthusiasts and developers, but there are very few commonly-used commercial applications available on Linux, with no sign that will change in the near future. It's another dead end. And if you want an even more depressing picture of Chrome's content ecosystem, just look at the pitiable situation with web apps.
AI

MIT's New Tool Predicts How Fast a Chip Can Run Your Code (thenextweb.com) 13

Folks at the Massachusetts Institute of Technology (MIT) have developed a new machine learning-based tool that predicts how fast code will run on various chips. This will help developers tune their applications for specific processor architectures. From a report: Traditionally, developers gauged a chip's performance by using the performance models built into compilers to simulate the execution of basic blocks -- straight-line sequences of machine-level instructions. However, these performance models are rarely validated against real-life processor performance. MIT researchers developed an AI model called Ithemal by training it to predict how fast a chip can run unknown basic blocks. It is supported by a database called BHive containing 300,000 basic blocks drawn from specialized fields such as machine learning, cryptography, and graphics. The team of researchers presented a paper [PDF] at the NeurIPS conference in December describing a new technique to measure code performance on various processors. The paper also describes Vemal, a new algorithm that automatically generates compiler optimizations.
Open Source

Linus Torvalds Calls Blogger's Linux Scheduler Tests 'Pure Garbage' (phoronix.com) 191

On Wednesday Phoronix cited a blog post by C++ game developer Malte Skarupke claiming his spinlock experiments had uncovered a Linux kernel scheduler issue affecting developers bringing games to Linux for Google Stadia.

Linus Torvalds has now responded: The whole post seems to be just wrong, and is measuring something completely different than what the author thinks and claims it is measuring.

First off, spinlocks can only be used if you actually know you're not being scheduled while using them. But the blog post author seems to be implementing his own spinlocks in user space with no regard for whether the lock user might be scheduled or not. And the code used for the claimed "lock not held" timing is complete garbage.

It basically reads the time before releasing the lock, and then it reads it after acquiring the lock again, and claims that the time difference is the time when no lock was held. Which is just inane and pointless and completely wrong...

[T]he code in question is pure garbage. You can't do spinlocks like that. Or rather, you very much can do them like that, and when you do that you are measuring random latencies and getting nonsensical values, because what you are measuring is "I have a lot of busywork, where all the processes are CPU-bound, and I'm measuring random points of how long the scheduler kept the process in place".

And then you write a blog-post blaming others, not understanding that it's your incorrect code that is garbage, and is giving random garbage values...

You might even see issues like "when I run this as a foreground UI process, I get different numbers than when I run it in the background as a batch process". Cool interesting numbers, aren't they?

No, they aren't cool and interesting at all, you've just created a particularly bad random number generator...

[Y]ou should never ever think that you're clever enough to write your own locking routines... Because the likelihood is that you aren't (and by that "you" I very much include myself -- we've tweaked all the in-kernel locking over decades, and gone through the simple test-and-set to ticket locks to cacheline-efficient queuing locks, and even people who know what they are doing tend to get it wrong several times).

There's a reason why you can find decades of academic papers on locking. Really. It's hard.

"It really means a lot to me that Linus responded," the blogger wrote later, "even if the response is negative." They replied to Torvalds' 1,500-word post on the same mailing list -- and this time received a 1900-word response arguing "you did locking fundamentally wrong..." The fact is, doing your own locking is hard. You need to really understand the issues, and you need to not over-simplify your model of the world to the point where it isn't actually describing reality any more...

Dealing with reality is hard. It sometimes means that you need to make your mental model for how locking needs to work a lot more complicated...

Open Source

Linux Kernel Developers and Commits Dropped in 2019 (phoronix.com) 37

Phoronix reports that on New Year's Day, the Linux kernel's Git source tree showed 27,852,148 lines of code, divided among 66,492 files (including docs, Kconfig files, user-space utilities in-tree, etc).

Over its lifetime there have been 887,925 commits from around 21,074 different authors: During 2019, the Linux kernel saw 74,754 commits, which is actually the lowest point since 2013. The 74k commits compares to the 80k commits seen in both 2017 and 2018, 77k commits in 2016, and 75k commits in both 2014 and 2015. Besides the commit count being lower, the author count for the year is also down: 2019 saw around 4,189 different authors contributing to the Linux kernel, lower than the 4,362 in 2018 and 4,402 in 2017.

While the commit count is lower for the year, on a line count it's about average with 3,386,347 lines of new code added and 1,696,620 lines removed...

Intel and Red Hat have remained the top companies contributing to the upstream Linux kernel.
