Programming

Who Killed The Junior Developer? (medium.com) 380

Melissa McEwen, writing on Medium: A few months ago I attended an event for women in tech. A lot of the attendees were new developers, graduates from code schools or computer science programs. Almost everyone told me they were having trouble getting their first job. I was lucky. My first "real" job out of college was "Junior Application Developer" at Columbia University in 2010. These days it's rare to find even a job posting for a junior developer position. People who advertise these positions say they are inundated with resumes. But at the senior level, companies complain they can't find good developers. Gee, I wonder why?

I'm not really sure about the exact economics of this, because I don't run these companies. But I know what companies have told me: "we don't hire junior developers because we can't afford to have our senior developers mentor them." I've seen the rates for senior developers because I am one, and I had project managers who had me allocate time for budgeting purposes. I know the rate is anywhere from $190-$300 an hour. That's what companies believe they are losing on junior devs.

Education

Learning To Program Is Getting Harder (slashdot.org) 400

theodp writes: While Google suggests that parents and educators are to blame for why kids can't code, Allen Downey, a professor at Olin College, argues that learning to program is getting harder. Downey writes: The fundamental problem is that the barrier between using a computer and programming a computer is getting higher. When I got a Commodore 64 (in 1982, I think) this barrier was non-existent. When you turned on the computer, it loaded and ran a software development environment (SDE). In order to do anything, you had to type at least one line of code, even if all it did was load another program (like Archon). Since then, three changes have made it incrementally harder for users to become programmers:
1. Computer retailers stopped installing development environments by default. As a result, anyone learning to program has to start by installing an SDE -- and that's a bigger barrier than you might expect. Many users have never installed anything, don't know how to, or might not be allowed to. Installing software is easier now than it used to be, but it is still error prone and can be frustrating. If someone just wants to learn to program, they shouldn't have to learn system administration first.
2. User interfaces shifted from command-line interfaces (CLIs) to graphical user interfaces (GUIs). GUIs are generally easier to use, but they hide information from users about what's really happening. When users really don't need to know, hiding information can be a good thing. The problem is that GUIs hide a lot of information programmers need to know. So when a user decides to become a programmer, they are suddenly confronted with all the information that's been hidden from them. If someone just wants to learn to program, they shouldn't have to learn operating system concepts first.
3. Cloud computing has taken information hiding to a whole new level. People using web applications often have only a vague idea of where their data is stored and what applications they can use to access it. Many users, especially on mobile devices, don't distinguish between operating systems, applications, web browsers, and web applications. When they upload and download data, they are often confused about where it is coming from and where it is going. When they install something, they are often confused about what is being installed where. For someone who grew up with a Commodore 64, learning to program was hard enough. For someone growing up with a cloud-connected mobile device, it is much harder.
theodp continues: So, with the Feds budgeting $200 million a year for K-12 CS at the behest of U.S. tech leaders, can't the tech giants at least put a BASIC on every phone/tablet/laptop for kids?
YouTube

YouTube TV Is Adding More Channels, But It's Also Getting More Expensive (theverge.com) 79

YouTube's internet TV streaming service is expanding its programming with the addition of several Turner networks including TBS, TNT, CNN, Adult Swim, Cartoon Network, truTV, and Turner Classic Movies. YouTube TV is also bringing NBA TV and MLB Network to the base lineup. NBA All Access and MLB.TV will be offered as optional paid add-ons "in the coming months." The downside? The price of the service is going up. The Verge reports: Starting March 13th, YouTube TV's monthly subscription cost will rise from $35 to $40. All customers who join the service prior to the 13th will be able to keep the lower $35 monthly rate going forward. And if you've been waiting for YouTube to add Viacom channels, that still hasn't happened yet. Hopefully these jumps in subscription cost won't happen very often. Otherwise these internet TV businesses might suddenly start feeling more like cable (and not in a good way). The Verge also mentions that YouTube TV is adding a bunch of new markets including: Lexington, Dayton, Honolulu, El Paso, Burlington, Plattsburgh, Richmond, Petersburg, Mobile, Syracuse, Champaign, Springfield, Columbia, Charleston, Harlingen, Wichita, Wilkes-Barre, and Scranton.
iOS

Apple's Software 'Problem' and 'Fixing' It (learningbyshipping.com) 95

According to media reports, Apple is planning to postpone some new features for iOS and macOS this year to focus on improving the reliability, stability and performance of the existing versions. Steven Sinofsky, a former President of the Windows Division, shared his insights into the significance of this development: Several important points are conflated in the broad discussion about Apple and software: quality, pace of change, features "versus" quality, and innovation. Scanning the landscape, it is important to recognize that the work Apple has been doing across hardware, software, services, and even AI/ML is, in total, breathtaking and unprecedented in scope, scale, and quality. Few companies have done so much for so long with such a high level of consistency. This all goes back to the bet on the NeXT code base and the move to Intel for Mac OS, plus the iPod, which began the journey to where we are today.

[...] What is lost in all of this recent discussion is the nuance between features, schedule, and quality. It is like having a discussion with a financial advisor over income, risk, and growth. You don't just show up and say you want all three and get a "sure." On the other hand, this is precisely what Apple did so reliably over 20 years. But behind the scenes there is a constant discussion over balancing these three legs of the tripod. You have to have all of them, but you "can't" -- and yet you have to. This is why they get paid big $.

[...] A massive project like an OS (+h/w +cloud) is like a large investment portfolio: some things will work (in market) and others won't, some things are designed to return right away, some are safe bets, some are long term investments. And some are mistakes... Customers don't care about any of that, and that's ok. They just look for what they care about. Each evaluates through their own lens. Apple's brilliance is in focusing mostly on two audiences -- end-users and developers -- tending to de-emphasize the whole "techie" crowd, even IT. When you look at a feature like FaceID and trace it backwards all the way to keychain -- see how much long term thought can go into a feature and how much good work can go unnoticed (or even "fail") for years before surfacing as a big advantage. That's a long term POV AND focus. This approach is rather unique compared to other tech companies, which tend to develop new things almost independently of everything else. So new things show up and look bolted on the side of what already exists. (Sure, Apple can do that too, but not usually.) All the while, as things are being built, the team is just a dev team trying to come up with a reliable schedule and fix bugs. This is just software development.

Facebook

YouTube CEO: Facebook Should 'Get Back To Baby Pictures' (cnet.com) 119

YouTube CEO Susan Wojcicki won't divulge her biggest fear about competing with Facebook, but she will give them some free advice. From a report: "They should get back to baby pictures," Wojcicki said Monday at the Code Media conference in Huntington Beach, California. Video has been an obsession for Facebook, as it tries to swipe the most advertising dollars migrating off television before YouTube can get them. Facebook has been aggressively advancing the number of clips and live streams that bubble up to the top of your News Feed and has rolled out a central hub for TV-like programming called Watch. "You always have to take competition seriously. You don't win by looking backwards; you win by looking at your customers and looking forward," she said.
Programming

The Quest To Find the Longest-Serving Programmer (tnmoc.org) 115

In 2014, the National Museum of Computing published a blog post in which it tried to find the person who has been programming the longest. At the time, it declared Bill Williams, a 70-year-old, to be one of the world's most durable programmers; he claimed to have started coding for a living in 1969 and was still doing so at the time of publication. The post has been updated several times over the years, and over the weekend, TNMoC updated it once again. The newest contender is Terry Froggatt of Hampshire, who writes: I can beat the claim of your 71-year-old by a couple of years (although I can't compete with the likes of David Hartley). I wrote my first program for the Elliott 903 in September 1966. Now at the age of 73 I am still writing programs for the Elliott 903! I've just written a 903 program to calculate the Fibonacci numbers. And I've written quite a lot of programs in the years in between, some for the 903 but also a good many in Ada.
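Froggatt's Fibonacci exercise is a nice illustration of how small such a program can be. A rough equivalent in Python (rather than the Elliott 903's own instruction set, which the original would have used) might look like:

```python
def fibonacci(n):
    """Return the first n Fibonacci numbers, starting 0, 1."""
    seq = []
    a, b = 0, 1
    for _ in range(n):
        seq.append(a)
        a, b = b, a + b
    return seq

print(fibonacci(10))  # → [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```

On a 1966 machine with a few kilowords of memory, the same loop would be a handful of machine instructions; the algorithm itself hasn't changed in half a century.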
IT

Why Paper Jams Persist (newyorker.com) 122

A trivial problem reveals the limits of technology. Fascinating story from The New Yorker: Unsurprisingly, the engineers who specialize in paper jams see them differently. Engineers tend to work in narrow subspecialties, but solving a jam requires knowledge of physics, chemistry, mechanical engineering, computer programming, and interface design. "It's the ultimate challenge," Ruiz said.

"I wouldn't characterize it as annoying," Vicki Warner, who leads a team of printer engineers at Xerox, said of discovering a new kind of paper jam. "I would characterize it as almost exciting." When she graduated from the Rochester Institute of Technology, in 2006, her friends took jobs in trendy fields, such as automotive design. During her interview at Xerox, however, another engineer showed her the inside of a printing press. All Xerox printers look basically the same: a million-dollar printing press is like an office copier, but twenty-four feet long and eight feet high. Warner watched as the heavy, pale-gray double doors swung open to reveal a steampunk wonderland of gears, wheels, conveyor belts, and circuit boards. As in an office copier, green plastic handles offer access to the "paper path" -- the winding route, from "feeder" to "stacker," along which sheets of paper are shocked and soaked, curled and decurled, vacuumed and superheated. "Printers are essentially paper torture chambers," Warner said, smiling behind her glasses. "I thought, This is the coolest thing I've ever seen."

Programming

Researchers Create Simulation Of a Simple Worm's Neural Network (tuwien.ac.at) 75

ClockEndGooner writes: Researchers at the Technische Universität Wien have created a simulation of a simple worm's neural network, and have been able to completely mimic the worm's natural reflexive behavior. According to the article, using a simple neural network of 300 neurons, the simulated worm "can find its way, eat bacteria and react to certain external stimuli. It can, for example, react to a touch on its body. A reflexive response is triggered and the worm squirms away. This behavior is determined by the worm's nerve cells and the strength of the connections between them. When this simple reflex network is recreated on a computer, the simulated worm reacts in exactly the same way to a virtual stimulation -- not because anybody programmed it to do so, but because this kind of behavior is hard-wired in its neural network." Using the same neural network without adding any additional nerve cells, Mathias Lechner, Radu Grosu, and Ramin Hasani were able to have the nematode simulation learn to balance a pole "just by tuning the strength of the synaptic connections. This basic idea (tuning the connections between nerve cells) is also the characteristic feature of any natural learning process."
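The core idea -- behavior that lives in the connection strengths rather than in code -- can be sketched in a few lines of Python. This toy three-neuron chain with a random search over synaptic weights is only an illustration of the principle, not the TU Wien researchers' actual model:

```python
import random

def reflex_response(stimulus, weights):
    """Toy neuron chain: sensory -> interneuron -> motor.
    The 'behavior' is fixed entirely by the connection strengths."""
    sensory = stimulus
    inter = max(0.0, sensory * weights[0])  # simple rectified activation
    motor = max(0.0, inter * weights[1])
    return motor

# Randomly search over synaptic strengths until the motor output hits a
# target response -- tuning connections instead of rewriting any logic.
target = 0.5
best_w, best_err = None, float("inf")
random.seed(0)
for _ in range(2000):
    w = [random.uniform(0, 2), random.uniform(0, 2)]
    err = abs(reflex_response(1.0, w) - target)
    if err < best_err:
        best_w, best_err = w, err

print(round(reflex_response(1.0, best_w), 3))  # prints a value close to 0.5
```

The same untouched network structure can "learn" a different response simply by settling on different weights, which is the point the researchers make about pole balancing.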
Open Source

A Look at How Indian Women Have Persevered Through Several Obstacles To Contribute to the Open Source Community (factordaily.com) 274

A fascinating story of how Indian women have persevered through various roadblocks, including cultural ones, to actively contribute to the open source community. An excerpt from the story: As Vaishali Thakker, a 23-year-old open source programmer, looked over the hall filled with around 200 people, she didn't know how to react to what she had just heard. Thakker was one of the five women on the stage at PyCon India 2017, a conference on the use of the Python programming language, in New Delhi. The topic of the discussion was "Women in open source." As the women started discussing the open source projects they had been working on, the challenges and so on, someone from the audience got up and drew the attention of the gathering to the Wi-Fi hotspots in the hall. They were named "Shut the fk up" and "Feminism sucks." "It was right on our faces," remembers Thakker. For their part, the organisers were upset and even warned the audience. But the event had no code of conduct under which to penalise or expel the culprits.

"It's disheartening when you're talking about the problem and someone is actually giving proof that it (gender bias) indeed is a problem. In a way, I found it funny, because how stupid can you be to give the proof that the problem actually exists," says Thakker. And how. It's been just three years in her coding career, but she is already familiar with the high wall that gender stereotyping puts up in the world of software scripting. More so in her chosen field of coding. Thakker is among a small -- but fast-growing -- set of women coders from India shaping the future of several open source platforms globally, including the Linux kernel, the core of the world's biggest open source software project.

Programming

Rust Creator Graydon Hoare Says Current Software Development Practices Terrify Him (twitter.com) 353

An anonymous reader writes: On Monday Graydon Hoare, the original creator of the Rust programming language, posted some memories on Twitter. "25 years ago I got a job at a computer bookstore. We were allowed to borrow and read the books; so I read through all the language books, especially those with animals on the covers. 10 years ago I had a little language of my own printing hello world." And Monday he was posting a picture of O'Reilly Media's first edition of their new 622-page book Programming Rust: Fast, Safe Systems Development. Then he elaborated to his followers about what happened in between.

"I made a prototype, then my employer threw millions of dollars at it and hired dozens of researchers and programmers (and tireless interns, hi!) and a giant community of thousands of volunteers showed up and _then_ the book arrived. (After Jim and Jason wrote it and like a dozen people reviewed it and a dozen others edited it and an army of managers coordinated it and PLEASE DESIST IN THINKING THINGS ARE MADE BY SINGLE PEOPLE IT IS A VERY UNHEALTHY MYTH)." He writes that the nostalgic series of tweets was inspired because "I was just like a little tickled at the circle-of-life feeling of it all, reminiscing about sitting in a bookstore wondering if I'd ever get to work on cool stuff like this."

One Twitter user then asked him if Rust was about dragging C++ hackers halfway to ML, to which Hoare replied "Not dragging, more like throwing C/C++ folks (including myself) a life raft wrt. safety... Basically I've an anxious, pessimist personality; most systems I try to build are a reflection of how terrifying software-as-it-is-made feels to me. I'm seeking peace and security amid a nightmare of chaos. I want to help programmers sleep well, worry less."

Open Source

'How I Coined the Term Open Source' (opensource.com) 117

Today is the 20th anniversary of the phrase "open source software," which was coined by the executive director of the Foresight Institute, a nonprofit think tank focused on nanotech and artificial intelligence. The phrase first entered the world on February 3rd, 1998. Christine Peterson writes: Of course, there are a number of accounts of the coining of the term, for example by Eric Raymond and Richard Stallman, yet this is mine, written on January 2, 2006. It has never been published, until today. The introduction of the term "open source software" was a deliberate effort to make this field of endeavor more understandable to newcomers and to business, which was viewed as necessary to its spread to a broader community of users... Interest in free software was starting to grow outside the programming community, and it was increasingly clear that an opportunity was coming to change the world... [W]e discussed the need for a new term due to the confusion factor. The argument was as follows: those new to the term "free software" assume it is referring to the price. Oldtimers must then launch into an explanation, usually given as follows: "We mean free as in freedom, not free as in beer." At this point, a discussion on software has turned into one about the price of an alcoholic beverage...

Between meetings that week, I was still focused on the need for a better name and came up with the term "open source software." While not ideal, it struck me as good enough. I ran it by at least four others: Eric Drexler, Mark Miller, and Todd Anderson liked it, while a friend in marketing and public relations felt the term "open" had been overused and abused and believed we could do better. He was right in theory; however, I didn't have a better idea... Later that week, on February 5, 1998, a group was assembled at VA Research to brainstorm on strategy. Attending -- in addition to Eric Raymond, Todd, and me -- were Larry Augustin, Sam Ockman, and attending by phone, Jon "maddog" Hall... Todd was on the ball. Instead of making an assertion that the community should use this specific new term, he did something less directive -- a smart thing to do with this community of strong-willed individuals. He simply used the term in a sentence on another topic -- just dropped it into the conversation to see what happened.... A few minutes later, one of the others used the term, evidently without noticing, still discussing a topic other than terminology. Todd and I looked at each other out of the corners of our eyes to check: yes, we had both noticed what happened...

Toward the end of the meeting, the question of terminology was brought up explicitly, probably by Todd or Eric. Maddog mentioned "freely distributable" as an earlier term, and "cooperatively developed" as a newer term. Eric listed "free software," "open source," and "sourceware" as the main options. Todd advocated the "open source" model, and Eric endorsed this... Eric Raymond was far better positioned to spread the new meme, and he did. Bruce Perens signed on to the effort immediately, helping set up Opensource.org and playing a key role in spreading the new term... By late February, both O'Reilly & Associates and Netscape had started to use the term. After this, there was a period during which the term was promoted by Eric Raymond to the media, by Tim O'Reilly to business, and by both to the programming community. It seemed to spread very quickly.

Peterson remembers that "These months were extremely exciting for open source," adding "Every week, it seemed, a new company announced plans to participate. Reading Slashdot became a necessity, even for those like me who were only peripherally involved. I strongly believe that the new term was helpful in enabling this rapid spread into business, which then enabled wider use by the public."

Wikipedia notes that Linus Torvalds endorsed the term the day after it was announced, that Phil Hughes backed it in Linux Journal, and that Richard Stallman "initially seemed to adopt the term, but later changed his mind."
AI

Ford Patents Driverless Police Car That Ambushes Lawbreakers Using AI (washingtonpost.com) 126

Ford has filed a patent application for a police car that issues tickets without even pulling you over. The car could also use artificial intelligence to find good hiding spots to catch traffic violators (Warning: source may be paywalled; alternative source) and identify drivers by scanning license plates, tapping into surveillance cameras and wirelessly accessing government records. The Washington Post reports: The details may sound far-fetched, as if they belong in the science-fiction action flick "Demolition Man" or a new dystopian novel inspired by Aldous Huxley's "Brave New World," but these scenarios are grounded in a potential reality. They come from a patent application filed by Ford and being reviewed by the U.S. government to create autonomous police cars. Ford's patent application was published this month. Although experts claim autonomous vehicles will make driving safer and more rule-bound, Ford argues in its application that even in the future, traffic violations will never disappear entirely. "While autonomous vehicles can and will be programmed to obey traffic laws, a human driver can override that programming to control and operate the vehicle at any time," the patent's application says. "When a vehicle is under the control of a human driver there is a possibility of violation of traffic laws. Thus, there will still be a need to police traffic."

The patent application says that autonomous police vehicles don't necessarily replace the need for human police officers for catching traffic scofflaws. Some "routine tasks," such as issuing tickets for failure to stop at a stop sign, can be automated, the patent says, but other tasks that can't be automated will be left to people. The application, which was filed in July 2016 and includes elaborate diagrams depicting the autonomous police car interacting with its environment, says officers could be inside the vehicle at all times and reclaim control of the car when necessary. But the application also shows how an autonomous police vehicle could be able to carry out many tasks we associate with human officers.

Desktops (Apple)

Apple Still Aims To Allow iPad Apps To Run on Macs This Year (axios.com) 63

Apple's push for performance and security improvements over new features will also apply to this year's Mac software, Axios reported on Wednesday, but one key feature remains on the roadmap for 2018: The ability for Macs to run iPad apps. From the report: On the Mac side, this is taking the form of a new project around security as well as improvements in performance when waking and unlocking the system. While users would certainly welcome changes that make their systems run better and more securely, customers tend to be more motivated to make purchases based on new features rather than promised improvements around security or performance, which can be tough to judge. The signature new feature for the Mac -- the ability to run iPad apps -- is a significant undertaking that adds a high degree of complexity to this year's OS release.
Programming

Employers Want JavaScript, But Developers Want Python, Survey Finds (infoworld.com) 222

An anonymous reader quotes InfoWorld: When it comes to which programming languages are in demand by employers, JavaScript, Java, Python, C++, and C -- in that order -- came out on top in a recent developer survey. Developers, however, want to learn languages like Python, Go, and Kotlin. A survey of developers by technical recruiter HackerRank, conducted in October, found no gap between languages employers want and what developers actually know, with JavaScript barely edging out Java...

HackerRank also found gaps in JavaScript frameworks between what employers want and what developers know. The React JavaScript UI library had the biggest delta between employers and developers, with about 37 percent of employers wanting React skills but only about 19 percent of developers having them... [But] problem-solving skills are the skills most sought by employers, ahead of language proficiency, debugging, and system design.

The survey involved 39,441 developers, and concluded that "Python ruled among all age groups," according to Application Development Trends, "except for those 55 years or older, who narrowly prefer C."
Programming

Tim Cook: Coding Languages Were 'Too Geeky' For Students Until We Invented Swift (thestar.com) 335

theodp writes: Speaking to a class of Grade 7 students taking coding lessons at the Apple Store in Eaton Centre, the Toronto Star reports that Apple CEO Tim Cook told the kids that most students would shun programming because coding languages were 'too geeky' until Apple introduced Swift. "Swift came out of the fundamental recognition that coding languages were too geeky. Most students would look at them and say, 'that's not for me,'" Cook said as the preteens participated in an Apple-designed 'Everyone Can Code' workshop. "That's not our view. Our view is that coding is a horizontal skill like your native languages or mathematics, so we wanted to design a programming language that is as easy to learn as our products are to use."
Programming

Donald Knuth Turns 80, Seeks Problem-Solvers For TAOCP (stanford.edu) 71

An anonymous reader writes: When 24-year-old Donald Knuth began writing The Art of Computer Programming, he had no idea that he'd still be working on it 56 years later. This month he also celebrated his 80th birthday in Sweden with the world premiere of Knuth's Fantasia Apocalyptica, a multimedia work for pipe organ and video based on the Bible's Book of Revelation, which Knuth describes as "50 years in the making."

But Knuth also points to the recent publication of "one of the most important sections of The Art of Computer Programming" in preliminary paperback form: Volume 4, Fascicle 6: Satisfiability. ("Given a Boolean function, can its variables be set to at least one pattern of 0s and 1s that will make the function true?")

Here's an excerpt from its back cover: Revolutionary methods for solving such problems emerged at the beginning of the twenty-first century, and they've led to game-changing applications in industry. These so-called "SAT solvers" can now routinely find solutions to practical problems that involve millions of variables and were thought until very recently to be hopelessly difficult.
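The back-cover definition can be made concrete with a brute-force checker -- the naive 2^n enumeration that modern SAT solvers dramatically outperform. A small Python sketch (illustrative only; the function names here are ours):

```python
from itertools import product

def is_satisfiable(boolean_fn, n_vars):
    """Brute-force SAT check: try every True/False assignment.
    Real SAT solvers avoid this exponential scan, which is why they can
    handle millions of variables."""
    for bits in product([False, True], repeat=n_vars):
        if boolean_fn(*bits):
            return True
    return False

# (x OR y) AND (NOT x OR z) AND (NOT y OR NOT z)
f = lambda x, y, z: (x or y) and ((not x) or z) and ((not y) or (not z))
print(is_satisfiable(f, 3))  # True -- e.g. x=True, y=False, z=True works

g = lambda x: x and (not x)  # a contradiction
print(is_satisfiable(g, 1))  # False
```

With a few variables this is instant; with the millions of variables mentioned on the back cover, enumeration is hopeless, which is what makes the modern solvers "game-changing."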
"In several noteworthy cases, nobody has yet pointed out any errors..." Knuth writes on his site, adding, "I fear that the most probable hypothesis is that nobody has been sufficiently motivated to check these things out carefully as yet." He's uncomfortable printing a hardcover edition that hasn't been fully vetted: "I would like to enter here a plea for some readers to tell me explicitly, 'Dear Don, I have read exercise N and its answer very carefully, and I believe that it is 100% correct,'" where N is one of the exercises listed on his web site.

Elsewhere he writes that two "pre-fascicles" -- 5a and 5b -- are also available for alpha-testing. "I've put them online primarily so that experts in the field can check the contents before I inflict them on a wider audience. But if you want to help debug them, please go right ahead."
Math

Has the Decades-Old Floating Point Error Problem Been Solved? (insidehpc.com) 174

overheardinpdx quotes HPCwire: Wednesday a company called Bounded Floating Point announced a "breakthrough patent in processor design, which allows representation of real numbers accurate to the last digit for the first time in computer history. This bounded floating point system is a game changer for the computing industry, particularly for computationally intensive functions such as weather prediction, GPS, and autonomous vehicles," said the inventor, Alan Jorgensen, PhD. "By using this system, it is possible to guarantee that the display of floating point values is accurate to plus or minus one in the last digit..."

The innovative bounded floating point system computes two limits (or bounds) that contain the represented real number. These bounds are carried through successive calculations. When the calculated result is no longer sufficiently accurate the result is so marked, as are all further calculations made using that value. It is fail-safe and performs in real time.
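The mechanism described -- carrying a lower and an upper bound through every operation and flagging results whose bounds grow too wide -- is closely related to classic interval arithmetic. A minimal Python sketch of that idea (our own illustration, not the patented design, and ignoring the directed rounding a real implementation needs to make the bounds rigorous):

```python
class Interval:
    """Carry [lo, hi] bounds through arithmetic so every result
    comes with an enclosure of the true real-number value."""
    def __init__(self, lo, hi=None):
        self.lo = lo
        self.hi = hi if hi is not None else lo

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # The product's bounds are the min/max of all endpoint products.
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))

    def width(self):
        return self.hi - self.lo

# A value known only to within +/- 0.001:
x = Interval(0.999, 1.001)
y = (x * x) + Interval(2.0)
print(y.lo, y.hi)  # bounds enclosing the true value of x*x + 2
```

A system like the one described could then compare `width()` against the precision the user needs and mark the result as no longer sufficiently accurate once the bounds grow too far apart.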

Jorgensen is described as a cyber bounty hunter and part-time instructor at the University of Nevada, Las Vegas, teaching computer science to non-computer science students. In November he received US Patent number 9,817,662 -- "Apparatus for calculating and retaining a bound on error during floating point operations and methods thereof." But in a followup, HPCwire reports: After this article was published, a number of readers raised concerns about the originality of Jorgensen's techniques, noting the existence of prior art going back years. Specifically, there is precedent in John Gustafson's work on unums and interval arithmetic both at Sun and in his 2015 book, The End of Error, which was published 19 months before Jorgensen's patent application was filed. We regret the omission of this information from the original article.
Programming

Apple Shuts Swift Mailing List, Migrates to Online Forum (swift.org) 25

An anonymous reader writes: Apple's Swift project "has completed the process of migrating to the Swift Forums as the primary method for discussion and communication!" announced a blog post on Friday. "The former mailing lists have been shut down and archived, and all mailing list content has been imported into the new forum system."

While they're still maintaining a few Swift-related mailing lists, they're moving discussions into online forums divided into four main categories: Evolution, Development, Using Swift, and Site Feedback. Forum accounts can be set up using either email registration or GitHub accounts.

It was one year ago that Swift creator Chris Lattner answered questions from Slashdot readers.
Security

'Text Bomb' Is Latest Apple Bug (bbc.com) 60

An anonymous reader quotes a report from the BBC: A new "text bomb" affecting Apple's iPhone and Mac computers has been discovered. Abraham Masri, a software developer, tweeted about the flaw, which typically causes an iPhone to crash and in some cases restart. Simply sending a message containing a link which pointed to Mr Masri's code on programming site GitHub would be enough to activate the bug -- even if the recipient did not click the link itself. Mr Masri said he "always reports bugs" before releasing them. Apple has not yet commented on the issue. On a Mac, the bug reportedly makes the Safari browser crash, and causes other slowdowns. Security expert Graham Cluley wrote on his blog that the bug does not present anything to be particularly worried about -- it's merely very annoying. After the link did the rounds on social media, Mr Masri removed the code from GitHub, thereby disabling the "attack" unless someone were to replicate the code elsewhere.
Programming

Which JavaScript Framework is the Most Popular? (infoworld.com) 161

An anonymous reader quotes InfoWorld's report on which JavaScript frameworks are the most widely-used: In a study of 28-day download cycles for front-end JavaScript frameworks, NPM, which oversees the popular JavaScript package registry, found that React has been on a steady upward trajectory; it now accounts for about 0.05 percent of the registry's 13 billion downloads per month as of the fourth quarter of 2017. Web developers as well as desktop and mobile developers are adopting the library and it has spawned an ecosystem of related packages. Preact, a lightweight alternative to React, also has seen growth and could become a force in the future.

On the down side, Backbone, which accounted for almost 0.1 percent of all downloads in 2013, now comprises only about 0.005 percent of downloads (about 750,000 per month). Backbone has declined steeply but is kept afloat by the long shelf life of projects using it, NPM reasoned. The jQuery JavaScript library also remains popular but has experienced decreasing interest. Angular, the Google-developed JavaScript framework, was the second-most-popular framework behind React, when combining the original Angular 1.x with the rewritten Angular 2.x. Version 1.x was at about 0.0125 percent of downloads last month while version 2.x was at about 0.02 percent. Still, Angular as a whole is showing just modest growth.

They also report that the four JavaScript frameworks with the fastest growth rates for 2017 were Preact, Vue, React, and Ember.

But for back end services written in JavaScript, npm reports that Express "is the overwhelmingly dominant solution... The next four biggest frameworks are so small relative to Express that it's hard to even see them."
