
Should IT Professionals Be Liable for Ransomware Attacks? (acm.org)

Denmark-based Poul-Henning Kamp describes himself as the "author of a lot of FreeBSD, most of Varnish and tons of other Open Source Software." And he shares this message in June's Communications of the ACM.

"The software industry is still the problem." If any science fiction author, famous or obscure, had submitted a story where the plot was "modern IT is a bunch of crap that organized crime exploits for extortion," it would have gotten nowhere, because (A) that is just not credible, and (B) yawn!

And yet, here we are.... As I write this, 200-plus corporations, including many retail chains, have inoperative IT because extortionists found a hole in some niche, third-party software product most of us have never heard of.

But he's also proposing a solution. In Denmark, 129 jobs are regulated by law. There are good and obvious reasons why it is illegal for any random Ken, Brian, or Dennis to install toilets or natural-gas furnaces, perform brain surgery, or certify a building is strong enough to be left outside during winter. It may be less obvious why the state cares who runs pet shops, inseminates cattle, or performs zoological taxidermy, but if you read the applicable laws, you will learn that animal welfare and protection of endangered species have many and obscure corner cases.

Notably absent, as in totally absent, on that list are any and all jobs related to IT; IT architecture, computers, computer networks, computer security, or protection of privacy in computer systems. People who have been legally barred and delicensed from every other possible trade — be it for incompetence, fraud, or both — are entirely free to enter the IT profession and become responsible for the IT architecture or cybersecurity of the IT system that controls nearly half the hydrocarbons to the Eastern Seaboard of the U.S....

With respect to gas, water, electricity, sewers, or building stability, the regulations do not care if a company is hundreds of years old or just started this morning, the rules are always the same: Stuff should just work, and only people who are licensed — because they know how to — are allowed to make it work, and they can be sued if they fail to do so.

The time is way overdue for IT engineers to be subject to professional liability, like almost every other engineering profession. Before you tell me that is impossible, please study how the very same thing happened with electricity, planes, cranes, trains, ships, automobiles, lifts, food processing, buildings, and, for that matter, driving a car.

As with software product liability, the astute reader is apt to exclaim, "This will be the end of IT as we know it!" Again, my considered response is, "Yes, please, that is precisely my point!"


  • by AcidFnTonic ( 791034 ) on Sunday May 29, 2022 @06:16PM (#62575776) Homepage

    No my compiler is mine and I do not have to yield it to you. I will continue writing code you may not like or agree with. Politely fuck off.

    • by david.emery ( 127135 ) on Sunday May 29, 2022 @06:20PM (#62575786)

      I don't think anyone is preventing you from writing code. The idea is that you be -responsible- if your code is broken.

      What other industry can people fuck up as badly as software, with no consequences?

      • by AcidFnTonic ( 791034 ) on Sunday May 29, 2022 @06:24PM (#62575794) Homepage

        Extreme liability is in itself a form of prevention. Please reread the above once more.

      • by otuz ( 85014 ) on Sunday May 29, 2022 @06:37PM (#62575818) Homepage

        Warranty and liability would be something you'd negotiate in the software license. Not something to apply to the industry as a whole.

      • by splutty ( 43475 ) on Sunday May 29, 2022 @06:40PM (#62575830)

        Now if the managers could go fuck right off with their unrealistic timelines, and companies actually pay QA a decent wage, and people would actually learn to code properly, and and and.

        Yeah. Liability is fine, but the problem is rarely just the programmer.

        • Now if the managers could go fuck right off with their unrealistic timelines, and companies actually pay QA a decent wage, and people would actually learn to code properly, and and and.

          Yeah. Liability is fine, but the problem is rarely just the programmer.

          All you have to do is point out and know all the unforeseen problems! 8^p

        • by Brain-Fu ( 1274756 ) on Sunday May 29, 2022 @09:21PM (#62576202) Homepage Journal

          The problem is almost never the programmer:

          1. The employers choose to hire inexperienced greenies right out of college, instead of seasoned veterans who know how to write secure software, to lower costs. These entry-level engineers simply don't know any better, since they haven't put in the time working under someone who does!
          2. The employers pick the unrealistic timelines you mentioned, thus forcing the IT Pros to do things they know will produce bad software despite their protests.
          3. The employers change the requirements every day, multiple times a day, without moving the deadlines, thus forcing the code to be brimming with technical debt due to all the re-writing, scrambling, and confusion over what it should even do.
          4. The employers choose not to hire a sufficient QA team, again to save costs, thus ensuring there will be plenty of bugs (including security bugs).
          5. The customers buy this garbage because the price is significantly lower than the price charged by any competitor who has much higher costs of production due to taking proper care in making sure the software is secure and reliable.

          The programmer may be the sword, but the employer is the hand that wields it. And ultimately they bend to the customer's demand for low-cost-quick-to-market crap, because if they don't, the customer just buys from someone who will. The programmer is just the tip of the iceberg of blame.

          • by Tom ( 822 ) on Monday May 30, 2022 @01:15AM (#62576610) Homepage Journal

            1. The employers choose to hire inexperienced greenies right out of college, instead of seasoned veterans who know how to write secure software, to lower costs. These entry-level engineers simply don't know any better, since they haven't put in the time working under someone who does!

            But they are cheaper!

            2. The employers pick the unrealistic timelines you mentioned, thus forcing the IT Pros to do things they know will produce bad software despite their protests.

            But we need to ship it to make sales!

            3. The employers change the requirements every day, multiple times a day, without moving the deadlines, thus forcing the code to be brimming with technical debt due to all the re-writing, scrambling, and confusion over what it should even do.

            Planning it out properly from the start is too expensive! Also, didn't you guys invent this Egale or Agile or Agely or whatever it's called so that we don't need to think about what we actually want until you've started writing it?

            4. The employers choose not to hire a sufficient QA team, again to save costs, thus ensuring there will be plenty of bugs (including security bugs).

            I see you're starting to get it. Yes, only saved costs are good costs!

            5. The customers buy this garbage because the price is significantly lower than the price charged by any competitor who has much higher costs of production due to taking proper care in making sure the software is secure and reliable.

            You see, NOW you get it. Nobody actually understands any of the magic you do, but people understand $$$ and our sales team hypes up our crap the same as the competition hypes up their crap, so the only thing that people have to decide on what to buy is price.

            "reliable". Pfft. By the time the whole shit comes crashing down, the person who bought it has done their two quarters, taken their golden handshake and moved on to the next company. What do they care?

        • I'm pretty skeptical that licensing for software professionals can work.

          However: Your argument about bad, overweening managers is actually an argument _in favor_ of doing so. More heavily licensed engineering fields do in fact invest the engineer with both responsibility and powers that cannot be short-circuited by domineering managers, with force of law. And also they're much higher paid to take on this responsibility.

          For example, here's an interesting case from Stack Exchange: Workplace a few years ago in

        • Yeah, I think liability needs to start with the managers. Or better yet with the shareholders ultimately calling the shots, but one of the few things pretty much every nation on Earth agrees on is that the 1% must be protected from liability for their actions. (I think we can all agree that the 1% of stock owned by the rest of the population combined wields no meaningful power)

          I think there's merit in considering an IT situation similar to engineers or doctors - generally speaking it's their employer

          • by Gimric ( 110667 )

            It would be a pretty perverse outcome if the employing companies were able to disclaim liability (Read the EULA!) but the employees were held accountable. Maybe start with making software companies liable first?

          • I agree that the next maturity of the profession is professional unions and licensing, but one thing to keep in mind is that in the fields of civil engineering, medicine and law, a manager that is not a licensed professional themselves cannot order or instruct professionals into how to do their job. This is why a manager is held liable if they instruct engineers to cheap out on a bridge. People saying that managers should be the ones held liable... YES... however they are missing the forest for the trees. T
      • by LiquidAvatar ( 772805 ) on Sunday May 29, 2022 @06:41PM (#62575832) Journal
        Apparently, politics.
      • by ufgrat ( 6245202 ) on Sunday May 29, 2022 @06:47PM (#62575862)

        Let's see:

            * Banks
            * Pharmaceutical corporations
            * Politicians
            * Social media
            * News "Entertainment"

        The difference is, those are largely motivated by greed, whereas poor software is usually the result of incompetence. Or outsourced programming-- the intersection of "greed" and "incompetence".

      • by Anonymous Wizard ( 9704014 ) on Sunday May 29, 2022 @07:01PM (#62575920)
        When engineering physical structures, calculations can be made to ensure they hold up to their use case in typical and statistically likely situations. When they are built, if the design, materials, and craftsmanship adhere to those precalculated standards, the builders are not liable for acts of malice or acts of god. In software engineering, only known potential exploits are calculable and foreseeable. Software engineers should only be held liable for breaches and failures that were calculable and foreseeable, not those involving novel exploits, nor those involving security-policy circumvention where the breach of human trust goes beyond industry security standards -- and only assuming they were commissioned to engineer the thing to those standards. Otherwise liability falls upon the employer. I.e., fire IT if the network gets infected with the "iloveyou" virus in 2022, unless an employee disregarded policy or circumvented procedure in order to load the link in that email. And if the IT professional was hired on Fiverr to whip up the network security policy for a large enterprise overnight, then it falls on the enterprise.
      • by Ol Olsoc ( 1175323 ) on Sunday May 29, 2022 @07:27PM (#62575980)

        I don't think anyone is preventing you from writing code. What the idea is that you be -responsible- if your code is broken.

        What other industry can people fuck up as badly as software, with no consequences?

        So why would I write anything if I were to be held criminally liable for anything that ever happens to that code?

        And tell me - do you have examples of perfect always secure code that will protect their writers in perpetuity from any and all liability?

        The idea that a company can force its IT department into a cost center, then hold those cost-center employees criminally liable for its products, is making criminals out of the wrong people.

        How about the suits who demand to use "Password1", and want all the inconveniences of security bypassed? The breaches have tended to exploit really obvious, low-hanging fruit. Not incompetence, but often intentional gaps made for a select group's convenience.

        You might as well fire the custodians - it will have just as effective a result.

      • by MrKaos ( 858439 )

        What other industry can people fuck up as badly as software, with no consequences?

        Banking.

      • What other industry can people fuck up as badly as software, with no consequences?

        Literally every single one which requires workers who are smarter than politicians and other would-be-regulators. The fastest way to yield a labor shortage in an industry a slim minority of the population is even capable of working in productively would be to add extreme liability risks. Risk is taken on by the corps selling code already, and yes, they SHOULD BE reviewing every line of every external library/dependency they include instead of adopting the "it's industry standard, everyone uses ___, there's

      • by mysidia ( 191772 )

        Nope! It's written strictly in the EULA you must agree to in order to buy my software: you agree to be 100% liable for any condition arising from your use of the software if it doesn't do what we hoped it would do

        No promises, and No warranties..

        Don't like it, then go hire someone else's. Oh wait.. My software is not a Commodity like Plumbing or Electrical, therefore I make the only one, and it's even Patented with proprietary encrypted file formats, so Nobody else can write this, even if they wanted. Gues

      • Politicians
        Bankers
        Management ...

    • by Ichijo ( 607641 )

      Ok but first we will ask you to:

      show us proof of education, which convinces us that you know what you're doing; and show us your liability insurance... The second question is never asked in IT.

      It is quite unusual, even for very large companies, to have an in-house licensed electrician or plumber, because, by and large, that stuff just works, just keeps working, and when it does not, you need more than a single craftsperson and suitable tools to fix it, so it is much cheaper to call in a specialized company

      • If Russians were able to fuck with your electrical or plumbing remotely, then yes, companies would need those guys on hand.
  • Because why not? First of all, they decide salaries and how much IT staffing companies hire. They can and do fuck things up for companies a lot more. This should be even MORE true in cases where IT services are outsourced outside the company's country of residence (so a foreign national can't be extradited to the country where the IT security was breached)
  • As long as some IT decider can't tell a client from a server, or Git upstream from a working copy, we are living - IT wise - in a society that has no culture. Sort of like the Arabs in that old movie "Lawrence of Arabia" who come to town and leave it 2 days later, in a state of dismay and chaos, because they have no effing clue how such a thing as a "town" or "city" actually works. Looking at Dubai today one could say they perhaps still haven't learned. Anyway, I digress.
    IT wise, too many cultureless dimwits are still calling the shots and generally are utterly incapable of building an IT culture and an IT security culture around that, or of making decisions that would facilitate such a development.

    Until that happens, no amount of liability will fix anything. That would be like blaming the spreading of germs on the only guy that uses soap. The suggestion however, pretty much fits (and illustrates) the dire state of culture, when it comes to IT.

  • No, obviously not. (Score:3, Insightful)

    by Maxo-Texas ( 864189 ) on Sunday May 29, 2022 @06:34PM (#62575806)

    It's a very dynamic area and even major corporations haven't figured out a reliable way to stop it yet. New exploits are discovered *every* day.

    I would *never* have worked in a field like that for any amount of money if I was constantly risking financial ruin.

    It would be like holding doctors financially accountable for deaths to new or rare diseases while also saying malpractice insurance couldn't be used to protect them in those cases.

  • by 93 Escort Wagon ( 326346 ) on Sunday May 29, 2022 @06:36PM (#62575814)

    On more than one occasion I have been overridden by non-IT higher-ups regarding security policies I've wanted to put in place.

    It's not a regular occurrence, but there have been a few times when some faculty research group's convenience (I work at a university) forced me or one of my coworkers to implement changes (or prevent us from making new changes). And by "forced" I mean the Chair said "quit arguing and just do what they want".

    • by Ichijo ( 607641 )

      The point is to make IT the decider. When doing the wrong thing risks a higher insurance premium, or losing your license for professional misconduct, or a lawsuit against you personally, it gives you more power to say no.

      Do you want this power or not?

      • Do you want this power or not?

        Based on my personal experience, I don't believe what you describe is what would actually happen. The blame might get shifted to IT even more than it already is; but the power to actually make final decisions on implementation most probably would still be out of our hands.

      • by theCoder ( 23772 )

        If IT was the decider and also liable for any security breaches, then the systems being administered would become essentially unusable. Unreasonable security precautions would make it so difficult to use the systems that hardly anyone would actually be able to get anything done. This isn't because IT people are evil or lazy, but simply because the system of rewards and punishments is shifted so that CYA would become most important.

        So, yeah, that's not really a world I want to live in.

    • It's not a regular occurrence

      You lucky bastard. I've seen people forsake basic security requirements in IT to the point of forcing the IT guys to adopt less secure policies in everything from healthcare to finance to government to defense to manufacturing.

    • by Tom ( 822 )

      On more than one occasion I have been overridden by non-IT higher-ups regarding security policies I've wanted to put in place.

      And you should be, because security is not the only thing that matters. But there should also be a process where the person who overrides you accepts the risk. That's why having an ISMS is a good idea - it ropes in the managers and documents their decisions. It's actually the primary reason why I think having an ISMS is a good thing - making management accountable.

      • And you should be, because security is not the only thing that matters.

        I actually agree with you - but then, when you can get security with very little inconvenience I think security should win. Such as when I've been told "open this research computer's ports to the world because we don't want to have to remember to use the university's VPN".

    • The policies for security should not come from you, they should come from the business.
      If you make up the policies, do you also do the risk assessment for them?
      Just remember to advise them where appropriate, and please ask for written confirmation from anyone requiring a deviation from policy, so you can mention it to the auditors.
  • by ufgrat ( 6245202 ) on Sunday May 29, 2022 @06:40PM (#62575826)

    When you tell management that you need to implement changes for security purposes, and the response is "I'm the *beep* director of this *beep* organization, and you'll *beep* run things the *beep* way I tell you to", I don't want to be held responsible for a breach when a two-bit middle management jackass decides that they're smarter than their IT department because "hell, my son can install Linux!!".

    And sadly, none of that was made up-- it was a real conversation, which happened less than a year after a wave of desktops were compromised by an outfit out of eastern Europe.

    There is an existing culture within most businesses, hospitals, universities, etc., that says "what I have to do is more important than security", right up until a major breach happens, and then it's all about "we need to be more secure"-- and the manager who gave the order that resulted in the breach, is never held accountable.

    PS: Slashdot- Your ASCII art "filter" is still a crock of unfiltered trash. Three repeating vowels in one word is not a 50 line swastika.

  • You think software development is expensive now ? Wait to see how expensive it will be if this comes true. Also, what about Free Software ? This will probably kill it.

    Wonder why US medical care is expensive, look at the insurance companies and all the bureaucracy surrounding the medical field. My primary care physician needs to employ 4 to 5 people to deal with the paper work. When I was a child, it was usually 1 Doctor plus 1 nurse/admin. Plus the Dr made house calls.

  • by ctilsie242 ( 4841247 ) on Sunday May 29, 2022 @06:43PM (#62575844)

    In many places, management will even say, "security has no ROI", and at best give lip service to security. All laws making the IT grunts take the fall for ransomware will not get the budgets needed for actual security, or management to allow processes/policies to be changed. It just means that when ransomware happens, some IT people get fined or go to jail, some lip service happens, and nothing is done.

    I'm sure companies will love this, if it is passed. Just means IT is even more squeezed. Especially in companies where the C-levels want a security breach to happen so they can short their stock before announcing it.

  • If an IT person was liable for flaws in their code, they would have to buy liability insurance. Doctors, lawyers, and many other professions do this. The bad effect would be on open source because most volunteers probably would not be able to afford it. Of course, the open source archives could buy it, but if they did they would have to actually check the submitted code for flaws or they would be uninsurable. Someone would have to pay for that (for instance the people using the code). Of course, they w
    • Software would have to be certified, the same way stuff like circuit breakers are certified to meet some standards. Nobody would pay for the certification of open source software or if one version got certified, that version would be used for decades.

      The same reason that some airplanes use floppy disks - certification costs so much, it's cheaper to just use old hardware and software.

  • Internet connected/connectable network administration and associated hardware? Sure, I'd agree that probably does need some enforced rules. Beyond that, probably best not.

  • by Lando242 ( 1322757 ) on Sunday May 29, 2022 @06:49PM (#62575868)
    Should your mechanic be liable if you drive your car into a crowded sidewalk? Should your grocer be liable for a bad chili you cooked and brought to a potluck? So why should I be liable when some user on my network opens a bogus email or clicks a fraudulent link? You can have the best firewalls on the market, the tightest spam filter available, top of the line antivirus, but they are always making a better idiot. I've had customers insist they be given direct access to their spam filter only to then release clearly marked fraudulent messages, open them, and then click links in them. I've had users complain that websites they go to every day are now blocked by the firewall only for me to connect to their machines and find they misspelled the URL and didn't doublecheck. I've had clients receive texts on their PERSONAL cell phones, which we don't manage in any way, from their 'boss' or the VP of whatever asking for them to send them some cash and have them do it without a thought. Yet I am to be liable for these people? The people that refuse to update their OS because they don't like the square Windows logo? The people that refuse to move to a modern LOB application because they paid $2k for it 16 years ago and "it still works"? The people that insist on being able to hook their personal iPad to the secure office wifi after being told several times not to? Not on your life.
  • by ksw_92 ( 5249207 ) on Sunday May 29, 2022 @06:53PM (#62575890)

    For so many businesses, IT is a utility and not a business that they want to dabble in. We're seeing the beginning of the end of the roll-your-own IT architecture as a desirable state for places where IT does not drive their core money-making activities. A lot of these people have gotten bitten by bad software, bad IT and bad end-user activity and are learning that going to places like GSuite, O365, SAP, etc. etc. reduces their risks of IT-related loss. Yes, these kinds of businesses are IT lightweights but many, many places are like this. Just look at the numbers of subscribers that MS and Alphabet tout.

    For those places where IT is integral to production I think we're getting close to the point of licensing and/or guild-style demonstrations of aptitude. IT folks who do process control, factory-floor automation and other physical-systems control (i.e. utilities) tend to work at the direction of a "real" engineer and maybe even one who holds a license. In other places, like where the IT part is just programming for a product, the company holds the product performance liability and they are usually diligent about functionality. These are more likely to have an internal "guild system" where mastery of the craft is built and demonstrated over time. You don't just hand the job of writing embedded code for a front-loading washing machine over to some freshly-minted EE or CS kid, for example.

    Before we get to licensing though, we need to really wake up folks in the C-suites about what's going on. The Colonial incident was not a failure of the pipeline control system, but a failure of security scoping by the operators. They forgot that until that gallon of fuel is delivered into the customer's hands AND IS ACCOUNTED FOR, it is under control of the pipeline. Since they didn't keep the accounting system for fuel delivery in the same scope as their core pipeline operation they fumbled the security football.

  • by ufgrat ( 6245202 ) on Sunday May 29, 2022 @06:56PM (#62575902)

    Label ransomware attacks as "terrorism", and devote half the effort spent on tracking down terrorists to tracking down ransomware gangs. Treat them, and the scam call centers in India (and other countries) as serious criminals, and make their lives a living hell.

    After the first few doors are kicked in by heavily armed law enforcement officers, the frequency and severity of both plagues will dry up.

    The physical infrastructure of most countries isn't terribly secure-- but attacks against communication, utilities and other infrastructure is considered "terrorism" or "threat to national security", and dealt with appropriately. There's very little threat to these gangs, especially when many of the ransomware operations appear to be at least partly state-funded.

  • With the ultimate responsibility comes the ultimate power to refuse sign-off until all outstanding issues are addressed, even if it does make the release late. No more ship now, fix later.

    Also, software engineers who are bonded and able to sign off will be a LOT more expensive.

  • Comment removed based on user account deletion
    • That’s one of the better analogies that I’ve seen on this topic so far. However, there is a difference between being responsible for everything that can possibly go wrong with the fruits of your work, and being held accountable for negligence. For instance: a lock maker creates a great pick resistant lock that does a great job of keeping everyone out. Until one clever bloke comes up with lock snapping. Should the lock maker be liable for any homes that got burgled because of the weakness in his
  • by Xicor ( 2738029 ) on Sunday May 29, 2022 @06:59PM (#62575914)

    Why would you hold the IT professionals liable when the ones who caused the issues are almost always upper management? You can blame the dev for a bug... but chances are that bug is in there due to a stressful environment, lack of testing because the company thinks testing is a waste of money, or ridiculously short timelines for products.

  • they need to be paying someone enough to be able to take that bullet.
    A good example: Mister C.E.O.'s pay and severance are supposedly justified because it's his head that rolls, so let's adjust the engineer pay to match that.
    What's that? They don't want to pay someone to hold that kind of liability?
    Why would they go so far as to expect it? Don't give them ideas, software engineer.
    There is a reason that literally every piece of code you download has that disclaimer right on it.
    This (free to
  • Need a UNION to stand up to the PHB who cuts costs, QA, deadlines, etc.

  • Short answer no. No engineer is responsible for that sort of thing. Building engineers aren't held responsible if (for example) someone sets off a van load of ANFO in the basement parking level. Bank vaults may be warranted for a length of time that they will resist various reasonably anticipated attempts to crack them (if installed according to specifications and maintained properly by the owner), but they are not sold as impossible to crack (at least not in the fine print). If they do get cracked, the eng

  • It's not false that IT is full of dangerous incompetents who call themselves 'engineers'; but it seems worth noting that the various more tightly regulated professions mentioned are more or less universally judged according to whether their workmanship will actively damage whatever it is they are working on; and whether their results are good enough to avoid falling over under their own weight; not according to how well they resist hostile action.
    • by gweihir ( 88907 )

      Well, yes and no. The standard here is "negligence". Of course, if the IT person followed the state of the art then they do not become liable for the break-in. That already follows from fundamental legal principles. But if they messed up or did significantly substandard work, things look quite a bit different. Also, businesses that hire unqualified people would become liable for the work those people did. Sure, accidents and mistakes still happen. That is normal. But somebody doing shoddy work that does not

      • Of course, if the IT person followed the state-of-the art then they do not become liable for the break-in

        This is normally done with certification. You use certified pipes, circuit breakers and you use certified software in airplanes.

        Of course this would mean that any updates have to be certified as well. So, it will be like with airplanes - one very expensive version that works, no updates for years. No FOSS, since certifying it would be much more expensive than just using closed source software that is certified.

  • Most of the IT staff I work with consider themselves engineers according to their signatures, yet when I ask them where they obtained their qualification, the response is generally "it came with landing the job".

    I'd say the bigger problem is with representation in the industry. If the majority of engineers are misrepresenting their qualifications, you are going to get incredibly varied outcomes. And that should make sense, as you wouldn't ask a person off the street who has only ever casually watched some civil-engineering documentaries to design a bridge.
    • by gweihir ( 88907 )

      I don't understand why IT businesses and individuals are allowed to misrepresent engineer qualifications in the first place.

      I think at this time it is "managers" too stupid to understand that hiring unqualified people is exceptionally expensive in the longer run. Hence they try to keep the hiring "cheap" by hiring at the lower end. But I think that time is slowly coming to an end.

  • to give you a better guarantee than the corporations that sell you the software. There is not one piece of software on the market today whose vendor assumes any liability. Microsoft would be out of business, and nobody would be able to afford a desktop licence.

    In order for me to offer such a guarantee the only OS that I would recommend is OpenBSD with no GUI and no third party apps.

    What does a certified plumber even offer? In most jurisdictions certification is a rubber stamp that you have a general understanding. Then years

  • through the roof.

    Indeed, all goods bought and sold via IT will become very expensive as everyone seeks to insure themselves. I.e., the cost to the economy might be catastrophic.

  • Should IT Professionals Be Liable for Ransomware Attacks? No, not any sooner than Stupid People are held liable for clicking on phishing links in their email. All the credentials, degrees, and training that IT professionals acquire in order to be "professionals" do nothing for the weakest link in the chain: the user who clicks on that malware link. The problem then is coming up with a legal definition for that user that does not get you fired or sued for slander. People do stupid things, and no amount of training will fix that.

  • by DaMattster ( 977781 ) on Sunday May 29, 2022 @08:44PM (#62576124)
    IT professionals are often the ones warning about problems. The people they're warning are often simply MBA types that have absolutely no knowledge of how software and computer systems work. Moreover, they have no interest or inclination towards learning anything about it. MBA types see everything as a fucking balance sheet. As long as IT professionals' warnings go unheeded or get written off by cost-benefit analyses, they should not be held responsible.
    • Good MBAs teach you about a lot more than balance sheets, just as good IT degrees teach you a lot more than installing Windows. But you can find plenty of idiots in both professions. Maybe look for a better class of person to work with?

  • The best way to handle ransomware is to make paying ransoms a federal crime punishable by a minimum 20 years hard labor.

    As TFA points out, critical infrastructure / safety-critical domains already have process requirements in place. There are numerous qualifications and certifications, personal and organizational, available for summarily judging prospective employees and vendors if that's what you want or need. Sucking at HR is your problem, nobody else's.

    Personally I think society has al

  • OK, sure, insure the company against attack, insure the provider against E&O.

    Pay what both want to charge.

    The insurance company will say that your infrastructure is shit and that you need to pay 10x to be insurable. The provider will agree. You will go with somebody cheaper because you don't want security, you just want somebody to blame.

    I bet both of your hair points are roasting right about now (not many people on this site, tho). This was supposed to be easy and you were looking for an underling

  • by holophrastic ( 221104 ) on Sunday May 29, 2022 @09:20PM (#62576196)

    First, we don't hold plumbers liable when winter cracks a pipe. We expect the [homeowner] to follow certain practices, and we expect shit to happen anyway. So the plumber isn't liable for the pipe breaking -- the plumber is only liable for not installing it correctly, given what it was and where it was at the time of installation.

    Second, we don't hold the roofer liable when birds eat through it, or when wind tears it apart. We call that "wear-and-tear".

    Third, we don't hold the locksmith liable when a criminal picks the lock, nor the window installer/manufacturer when a criminal smashes the glass. We generally never hold anyone liable when a criminal actively breaks anything.

    Fourth, all of those industries are (currently) stable. Sure, they all change from decade to decade, and the code changes too, but very very slowly. If you were to have a building-code style code for IT solutions, it would be the same age as every other code -- enacted 3 years ago with then-3-year-old information. So today's IT code would be 6 years old. IT is [currently] evolving way too fast for any regulation to keep up. 6-year old regulation would be utterly meaningless today.

    Fifth, if you put a lawn chair at the end of your driveway, we're not surprised when someone steals it -- because you put it out in the open.

    So, let's review:

    First: homeowner-maintenance expected translates into compatibility with other software/OS/equipment.

    Second: wear-and-tear translates into patches, updates, and memory/storage cleanup/monitoring.

    Third: crime isn't on anyone. That translates into all hacking.

    Fourth: 6-year-old regulations translate into 2015-era technology. Does 2015's encryption even count as encryption these days?

    Fifth: left out in the open translates into, well, absolutely any internet-connected anything. It's all exposed to the entire world.

    I'll say again what I've said for decades. The FBI ended the wild wild west because it didn't matter how big the train robbery was: if you couldn't ever spend the gold, and you had to live in the woods, there was no point in being an outlaw. IT needs law enforcement for every criminal hack, breach, et cetera. Until there's law enforcement, security (especially documented security) built by day-job workers can never repel the undocumented creativity of the infinite supply of criminals.

    And we have oh-so-many parts of IT that really could be locked down at a federal level. Explain to me why packets originating from out of country aren't instantly flagged with their origin, so I can at least say that root access to the server can't come from Outer Mongolia. Can my country not flag the packets it received from across the Atlantic?
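For what it's worth, the flagging half of that wish is easy to sketch. Below is a minimal, hypothetical Python filter; the CIDR-to-country table and the admin whitelist are invented for illustration, and a real system would load full allocation data from the regional registries or a GeoIP database at the firewall:

```python
import ipaddress

# Invented sample of country CIDR allocations; a real deployment would
# load the complete registry data, not a two-entry table.
COUNTRY_BLOCKS = {
    "US": [ipaddress.ip_network("3.0.0.0/9")],
    "MN": [ipaddress.ip_network("202.131.0.0/20")],
}

ADMIN_COUNTRIES = {"US"}  # only accept root/admin logins from here


def country_of(addr):
    """Return the country code whose blocks contain addr, or None."""
    ip = ipaddress.ip_address(addr)
    for country, nets in COUNTRY_BLOCKS.items():
        if any(ip in net for net in nets):
            return country
    return None


def admin_access_allowed(addr):
    """Reject admin connections whose source country isn't whitelisted."""
    return country_of(addr) in ADMIN_COUNTRIES
```

With the sample table, `admin_access_allowed("202.131.5.1")` returns False, since that address falls in the (sample) Mongolian block; enforcing this at the packet level is of course the hard part the comment is asking about.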

    • ...well thought-out objection. Overruled!

      Seriously though, good points. Frankly, they would need the equivalent of "building codes" anyway, if they intend to wrap it in legal consequences. And that is simply a non-starter. No such codified set of standards could possibly be maintained or absorbed quickly enough to be followed at all before it changed.

      • Sorry to restate your own point (regarding building codes). I added nothing of value and would like a "retract" button available to me for 60 seconds after posting...

  • Today it is unrealistic for an IT company to provide that level of assurance, but you cannot be a professional without assuming liability for your actions. Hopefully that will create enough pressure on upstream vendors to create a proper chain of responsibility in which the market for cybercrime is eliminated (rather than insured, as today).

  • Absolutely, as long as IT has 100% control of the budget, policy, procedure, and implementation. Otherwise, no, as long as they've followed policy and procedure. Responsibility with no authority is immoral.

  • Look ... we had bugs in things as simple as early 1980's coin-op arcade games. Centipede had a bug that crashed the score counter if you got over a certain score. Modern software is massively larger. (A modern game title for Windows typically has 1.5 to 5 million lines of source code!)

    But we're talking about business-critical applications here ... not just games for entertainment. And not only are we worried about bugs in the software itself that prevents it from working as intended, but ALSO demanding all

  • If I'm an IT professional for a company and I install Windows Server, apply all the patches and this is the box holding all the corporate data including customer information and it gets compromised which IT professional is being held responsible?

    Me? Because I installed and maintained the system? Even though I have applied every patch and update Microsoft has provided?

    Someone at Microsoft that programmed the particular piece of the OS that was exploited?

    The person that programmed the library that the program relied on?

  • Code is not a physical construct that obeys physical laws. You can't develop things like building codes and standards. If you review a design of a bridge, review the actual building of it and test it, you can get a reasonable idea of how it can perform. And even then, things can go wrong.

    Software is a bridge that, if a green car with three people used the middle lane for 32 seconds at 2:23PM, then switched to the right lane for at least 56 seconds while going 3 to 5 miles an hour less than they were going a minute before, collapses without warning.
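A toy sketch of that point, with every name and number invented for illustration: a latent bug that fires only under one exact combination of otherwise-valid inputs, which no finite test plan is likely to cover.

```python
# Hypothetical toll-pricing function with a state-dependent bug: the
# failure triggers only for one precise combination of inputs, the
# software analogue of the "green car in the middle lane" bridge.
def toll_charge(color, lane, seconds_in_lane, speed_delta):
    total = 2.0
    # Latent bug: this exact combination corrupts the result.
    if (color == "green" and lane == "middle"
            and seconds_in_lane == 32 and 3 <= speed_delta <= 5):
        total = float("nan")  # silent state-dependent failure
    return total
```

Every other input returns a sane 2.0; only the magic combination produces garbage, which is exactly why physical-style inspection and testing regimes map so poorly onto code.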

  • We'd have to go back to having stable operating systems like Sun workstations used to have, stick with verified tools and libraries, maybe use only one programming language (C), do extensive QA, and make security a part of everything from hardware, to boot firmware, to operating system, to ALL apps on the system. Network I/O would all have to be made secure, as would network management. Remote resources like servers/cloud and devices/sensors would have to be secure. Keyboards and screens would have to be made secure as well.
  • One of the reasons licensing for a lot of professions work is that the rules are stable. In construction and civil engineering, the laws of physics don't get completely rewritten overnight. New materials and techniques aren't introduced every month. You can write up rules for determining whether something will be safe and they won't become invalid next week. That allows professions to create standards for correctness and safety that professionals can count on to work and keep working over time. They have to

  • Moreover, it is proof that persons skilled in one area do not always excel in other areas, particularly ones they feel the need to comment on.
  • by MrKaos ( 858439 )

    If you want to fix the problems of ransomware, then make the businesses liable for not having backups or proper security processes. Those are systemic and can be dealt with as business processes.
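The "proper backups" half of that is a checkable business process. Here is a minimal sketch (paths, layout, and the tar-based format are all invented for illustration) of the kind of backup a business could actually be held to: write the archive, then immediately restore it to scratch space and verify every file's checksum.

```python
import hashlib
import tarfile
import tempfile
from pathlib import Path


def sha256(path):
    """Hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def backup_and_verify(src, archive):
    """Back up src to a .tar.gz, then restore and checksum every file.

    Returns True only if the restored copy matches the original
    byte-for-byte; an untested backup is treated as no backup.
    """
    src, archive = Path(src), Path(archive)
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(src, arcname=src.name)
    with tempfile.TemporaryDirectory() as scratch:
        with tarfile.open(archive) as tar:
            tar.extractall(scratch)
        for f in src.rglob("*"):
            if f.is_file():
                restored = Path(scratch) / src.name / f.relative_to(src)
                if not restored.exists() or sha256(f) != sha256(restored):
                    return False
    return True
```

The design point is the restore step: a liability regime focused on process would ask for evidence that restores are exercised routinely, not just that an archive file exists somewhere.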

  • by twocows ( 1216842 ) on Tuesday May 31, 2022 @10:07AM (#62579772)
    If the compromise or failure is the result of extreme negligence on the part of the professional, if this can be proven, and if their negligence isn't the result of gross managerial incompetence, then yes, I think there might be a case for this argument.

    The problem is, if this is ever implemented, that won't be the way it's implemented. It'll just be a way for liability to be forced further down the chain so that nobody "important" can be charged. Management had unrealistic deadlines, no QA/auditing, and verbally directed you to do things to undermine the security of your product or service? Doesn't matter; you did it, so you're at fault. That's how it'll be done, and I think anyone with a lick of sense should say "hell no" to this nonsense.
