Topics: Security, Chrome, Encryption, Google, The Internet, Upgrades

Chrome 43 Should Help Batten Down HTTPS Sites

River Tam writes: The next version of Chrome, Chrome 43, promises to take care of some of the work website owners, such as news publishers, would otherwise have to do to enable HTTPS. The feature should be helpful for publishers migrating legacy HTTP web content to HTTPS when that old content can't be modified, or would be difficult to modify. The issue crops up when a new HTTPS page includes a resource, such as an image, from an HTTP URL. That insecure resource will cause Chrome to flag a 'mixed-content warning' in the form of a yellow triangle over the padlock.
  • by Ignacio ( 1465 ) on Sunday April 19, 2015 @04:34PM (#49506249)

    Welcome to 2013. [mozilla.org]

    • Re:Hello (Score:4, Informative)

      by Anonymous Coward on Sunday April 19, 2015 @05:04PM (#49506375)

      Nice try, but this is significantly different from what Firefox does.

      From TFA:

      The directive “causes Chrome to upgrade insecure resource requests to HTTPS before fetching them”, Google explained today [chromium.org].

      TFA's link to chromium.org essentially says the exact same thing:

      Upgrading legacy sites to HTTPS
      Transitioning large collections of unmodifiable legacy web content to encrypted, authenticated HTTPS connections can be challenging as the content frequently includes links to insecure resources, triggering mixed content warnings. This release includes a new CSP directive, upgrade-insecure-resources, that causes Chrome to upgrade insecure resource requests to HTTPS before fetching them. This change allows developers to serve their hard-to-update legacy content via HTTPS more easily, improving security for their users.

      Converting to plain English: If the URL says "http://", Chrome will first try the same link with "https://". You'll only see a mixed-content warning if the website fails to return content for the "https://" link. This obviously assumes that the website is running both HTTP and HTTPS, and that it will give the same content regardless of whether you use HTTP or HTTPS.

      Your link to Firefox 23 only talks about issuing warnings for mixed content; it does not say anywhere that it attempts to retrieve the HTTPS version of an HTTP link.

      tl;dr: Firefox just blocks it; Chrome looks for a safe alternative and only blocks if the safe alternative doesn't exist.

      [ Disclaimer: I use Firefox; I have never used Chrome. ]
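
To make the opt-in concrete: per the quoted Chromium post (and the W3C draft linked further down the thread, where the directive is spelled upgrade-insecure-requests rather than the article's upgrade-insecure-resources), it is the site that has to send a CSP directive before Chrome will upgrade its subresource fetches. Below is a minimal, hypothetical Python sketch of a server that opts every response in; the handler name and port are made up, and a real deployment would set the header in the web server or application config instead.

```python
# Sketch only: serve pages with the CSP directive that asks supporting
# browsers to upgrade http:// subresource requests to https://.
# Directive spelling follows the W3C "Upgrade Insecure Requests" draft.
from http.server import HTTPServer, SimpleHTTPRequestHandler


class UpgradeCSPHandler(SimpleHTTPRequestHandler):
    def end_headers(self):
        # Add the opt-in header to every response this toy server sends.
        self.send_header("Content-Security-Policy", "upgrade-insecure-requests")
        super().end_headers()


if __name__ == "__main__":
    HTTPServer(("", 8000), UpgradeCSPHandler).serve_forever()
```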

      • by Lennie ( 16154 )

        This is pretty useless if other browsers don't adopt the same model.

        It just means some web developers who forget to test in other browsers might end up breaking sites unknowingly.

  • by John Bokma ( 834313 ) on Sunday April 19, 2015 @04:34PM (#49506251) Homepage
    Gives a better summary: "The next version of Chrome will include a new security policy that may make it easier for developers to ensure “HTTPS” websites aren’t undermined by insecure HTTP resources."
  • by 140Mandak262Jamuna ( 970587 ) on Sunday April 19, 2015 @04:45PM (#49506297) Journal
    When it rains, it pours. I am battling a serious RAID controller failure on my work desktop. At least I could go home, use VPN to access some common team servers, and do some work. Lo and behold! St. Murphy, the patron saint of all things barfing, decides to step in at this critical juncture. Chrome decides to cut Java. Our wonderful IT had bought VPN software that relies on the Java plug-in in the browser. OK, Firefox will come to my rescue, or so I thought. But St. Murphy had anticipated my move.

    When everything fails, you sell your soul to Satan and decide to fire up, gasp, Internet Explorer. For some odd reason it manages to get past all the hurdles and gets the network extender running. Satan is laughing at St. Murphy. St. Murphy never loses; his revenge will come soon, and it will be swift.

    In the meantime, caught as a mere pawn in the eternal battle between Satan and St. Murphy, I am ruing my fate and bellyaching on Slashdot.

    • by tomxor ( 2379126 )
      Your "IT" sold their souls when they brought shitty VPN software that relied on Java... Sure almost all VPN software is sucky most of them completely ignoring issues regarding running TCP over TCP, but adding a steaming pile of shit to a steaming pile of shit is just asking for a massive steaming pile of shit. Almost no one misses that horrid vulnerability that chromium is actively trying to eliminate.... so yeah fuck your stupid "IT"
      • Yeah, I know about the Java security model being completely broken as a plug-in. It has been two years since the advisory came out. Still, our VPN vendor is pushing a Java-plug-in-based solution, and our IT is still buying that load of crap. What to do?
        • Re: (Score:2, Interesting)

          As screwed up as this sounds, I would take modern IE 11 over Firefox any day.

          I would have had a psychotic episode seeing myself type this 5 years ago, but Firefox has gone to shit starting with 4. Actually, I noticed slowness with 3.6 too.

          IE is great for running ancient shit intranet sites. Running Java as a plugin is negligent. One of the few good reasons for IE is group policy, to allow Java to run only on intranet or trusted-site lists. If your MCSEs at work have it enabled globally, they should be slapped up the back of the head.

    • Re: (Score:2, Interesting)

      by The MAZZTer ( 911996 )

      It is your IT dept's responsibility to keep the VPN working, not Google's. Google has chosen to drop support for a 20 year old insecure plugin architecture in favor of a more modern, secure one. Sure, it's one developed by Google, but 1) there wasn't an existing standard out there AFAIK so they had to make one and 2) the plugin interface is open source so anyone can go and implement it in their own browser, or in their own plugin.

      Oracle's official stance seems to be that Java users should switch to Firefox.

    • Murphy's law means that whatever can happen, will happen. Matthew McConaughey taught me that, by having a career.

  • by Anonymous Coward

    After all, we aren't in the days of old when pages could be returned in place of images and somehow still get parsed by web browsers.
    Holy shit, that was an awful bug.

    Can still be used for tracking though.

  • by NoNeeeed ( 157503 ) <slash&paulleader,co,uk> on Sunday April 19, 2015 @04:53PM (#49506333)

    What a shock, a slashdot summary that misses the actual salient point of the linked article...

    Here's the description of the new feature from the linked article:

    If the same site was accessed in Chrome 43 -- which is beta now but should be stable in May -- the warning should vanish thanks to a browser Content Security Policy directive known as Upgrade Insecure Resources. The directive “causes Chrome to upgrade insecure resource requests to HTTPS before fetching them”, Google explained today.

    Here's Google's own description of the feature from the Chromium Blog [chromium.org]:

    Upgrading legacy sites to HTTPS

    Transitioning large collections of unmodifiable legacy web content to encrypted, authenticated HTTPS connections can be challenging as the content frequently includes links to insecure resources, triggering mixed content warnings. This release includes a new CSP directive, upgrade-insecure-resources, that causes Chrome to upgrade insecure resource requests to HTTPS before fetching them. This change allows developers to serve their hard-to-update legacy content via HTTPS more easily, improving security for their users.

    So basically this means you don't have to worry if you accidentally miss an HTTP asset link on your site when upgrading to HTTPS; Chrome will automatically upgrade it for you.

    Hopefully the other browsers will follow suit soon, otherwise it's of limited use.

    • And if that resource, for whatever reason, is only available over HTTP, then you're screwed?

  • So instead of going through and changing your pages to use https:// [https], they want you to go through your pages and add a meta tag. (Yes, I did read that there is an option to set it at the server level.)
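
For the meta-tag route the parent mentions, here is a rough sketch of bulk-adding the tag to static pages; the content root, the naive <head> string replacement, and the skip check are all illustrative assumptions, not a production migration tool.

```python
# Sketch: inject a CSP meta tag into static HTML pages so browsers that
# support it upgrade insecure subresource requests. Naive and illustrative.
from pathlib import Path

META = '<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">'

for page in Path("site/").rglob("*.html"):      # hypothetical content root
    html = page.read_text(encoding="utf-8")
    if "Content-Security-Policy" in html:
        continue                                # looks opted in already; skip
    page.write_text(html.replace("<head>", "<head>\n  " + META, 1),
                    encoding="utf-8")
```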

  • by Anonymous Coward

    Run grep on every article (or SELECT from your database) and on every script for http[^s]. Then open a bug for every one of them you find. You're done when every bug is closed and every regression test passes.

    Oh shit, I forgot, web developers aren't engineers and aren't capable of doing the above. So this is really hard and can't be solved except by brilliant Google.
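
For what it's worth, the scan the parent describes really is only a few lines. A rough Python equivalent (the content root and file extensions are assumptions; database-stored content would need its own query):

```python
# Rough equivalent of "grep every article and script for http[^s]":
# list candidate insecure references so each can be filed and fixed.
import re
from pathlib import Path

INSECURE = re.compile(r'http://[^\s"\'<>()]+')

for path in Path("site/").rglob("*"):            # hypothetical content root
    if path.suffix.lower() not in {".html", ".js", ".css"}:
        continue
    text = path.read_text(encoding="utf-8", errors="ignore")
    for lineno, line in enumerate(text.splitlines(), start=1):
        for url in INSECURE.findall(line):
            print(f"{path}:{lineno}: {url}")
```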

  • Create a plugin for a browser so that when you come across a page that has mixed content, it finds the contact information for the site and automatically sends them a message about how stupid they are. Stop bugging me with warnings, since I can't do anything about it. It's time to inconvenience the bad developer who made the page until they fix it.

  • Inconsistently reports perfectly secure SHA1 certificates as weak or fine, which means it's can be relied upon to determine your security.

    More info on Security Now #502 [grc.com]

    • by Sebby ( 238625 )
      Seriously, OS X Yosemite's 'auto correct' is a total failure: meant to post "...it can't be relied upon..."
    • by Anonymous Coward

      I'd be even more worried about their handling of certificate revocation. If you aren't on their special list, your cert isn't revoked.
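
If you'd rather not rely on any one browser's UI, the leaf certificate's signature algorithm is easy to inspect yourself. A small sketch using Python's ssl module and the third-party cryptography package (the host name is just an example, and this only looks at the leaf certificate, not the intermediates a browser also judges):

```python
# Sketch: print the signature hash algorithm of a site's leaf certificate
# as an independent check on "weak SHA-1 certificate" warnings.
import ssl

from cryptography import x509   # third-party 'cryptography' package, recent versions

host = "example.com"            # placeholder host
pem = ssl.get_server_certificate((host, 443))
cert = x509.load_pem_x509_certificate(pem.encode("ascii"))
print(host, "leaf certificate signed with:", cert.signature_hash_algorithm.name)
```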

  • by X0563511 ( 793323 ) on Sunday April 19, 2015 @05:42PM (#49506523) Homepage Journal

    For a good long while it's been annoying when dealing with mangled SSL configurations - at least Firefox lets you tweak stuff in about:config to work around them. [ryananddebi.com]

    No, getting the site fixed is not always an option, and validation of the certificate is not always necessary. For instance, there was a good long while when Chrome was completely unusable with some of our ZFS storage appliances (which live on a nonrouted private management network) because of retarded cert validation changes. Sure, that makes sense when you are visiting your bank's site... but not so much when you're trying to get into something on 10.0.0.0/8 while you're directly connected to the thing with a crossover cable... and no, updating the software in the controller wasn't an option because of outstanding critical-level bugs.

    Fun times.

  • by thogard ( 43403 ) on Sunday April 19, 2015 @09:28PM (#49507617) Homepage

    The push for HTTPS everywhere also means there is more metadata floating around. If all you are looking at is the metadata and not the data stream, HTTPS gives an observer more info about what is going on than plain HTTP does. Once you get into properly verifying certs on both sides, an observer has even more info to tie a conversation to a specific client and server.

    You can see this yourself by getting something that does netflow and looking at the data that comes out of it.

  • From https://www.chromestatus.com/f... [chromestatus.com]:
    This feature allows authors to ask the user agent to transparently upgrade HTTP resources to HTTPS to ease the migration burden.

    So it is the content provider who decides whether this is being used.

    It is not only a Google thing, check the Firefox bugzilla:
    https://bugzilla.mozilla.org/s... [mozilla.org]

    And the W3C Draft:
    https://w3c.github.io/webappse... [github.io]

    This is, in my opinion, a good thing: it leaves all control in the hands of the content provider and supports the move to encryption everywhere.
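
Since it is opt-in, it's also easy to check whether a given site has actually asked for the upgrade. A quick sketch using only the standard library (the URL is a placeholder, and the directive name follows the W3C draft linked above):

```python
# Sketch: report whether a page's response carries a CSP header asking the
# browser to upgrade insecure subresource requests.
from urllib.request import urlopen

url = "https://example.com/"    # placeholder URL
with urlopen(url) as resp:
    csp = resp.headers.get("Content-Security-Policy", "")
print(url, "->", "opted in" if "upgrade-insecure-requests" in csp else "no upgrade directive")
```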

  • Like bugs in features that people actually want to use - http://ark42.com/chrome/ [ark42.com]

"The four building blocks of the universe are fire, water, gravel and vinyl." -- Dave Barry

Working...