Transforming the Web Into a Transparent 'HTTPA' Database

An anonymous reader writes: MIT researchers believe the solution to misuse and leakage of private data is more transparency and auditability, not adding new layers of security. Traditional approaches make it hard, if not impossible, to share data for useful purposes, such as in healthcare. Enter HTTPA: HTTP with accountability.
From the article: "With HTTPA, each item of private data would be assigned its own uniform resource identifier (URI), a component of the Semantic Web that, researchers say, would convert the Web from a collection of searchable text files into a giant database. Every time the server transmitted a piece of sensitive data, it would also send a description of the restrictions on the data’s use. And it would also log the transaction, using the URI, in a network of encrypted servers."
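
Purely as an illustration of the mechanism the article describes (a URI per data item, usage restrictions sent alongside the data, and a logged transaction), here is a minimal Python sketch. Every name in it is invented for the example; HTTPA does not define these headers or functions.

import json
import time
import uuid

AUDIT_LOG = []  # stand-in for the "network of encrypted servers" that records each transfer

def serve_sensitive_item(data_uri, payload, requester):
    # Every item of private data has its own URI; the usage restrictions
    # travel with the data, and the transaction is logged against the URI.
    restrictions = {"use": "treatment-only", "share": "no-third-parties"}
    AUDIT_LOG.append({
        "txn": str(uuid.uuid4()),
        "uri": data_uri,
        "requester": requester,
        "time": time.time(),
        "restrictions": restrictions,
    })
    return {
        "status": 200,
        "headers": {"Usage-Restrictions": json.dumps(restrictions)},
        "body": payload,
    }

response = serve_sensitive_item("https://hospital.example/patient/123/allergies",
                                {"allergy": "penicillin"}, requester="dr_alice")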
  • the key.

    All of these sorts of silly ideas depend on no exploits and everyone being a 'good guy'.

    If those two things were the case, there would be little to no reason to implement something in the first place.

    • Yep, sounds like just another variation on the evil bit [ietf.org].

      • by fuzzyfuzzyfungus ( 1223518 ) on Saturday June 14, 2014 @09:08PM (#47238643) Journal
        Even older than that. The 'send the data along with a description of the restrictions on its use' idea is classic "Trusted Computing" (plus a dash of 'semantic web' nonsense, which appears to be a new addition).

        Mark Stefik and his group at Xerox PARC were talking about a 'Digital Property Rights Language' back in 1994 or so, and by 1998, if not earlier, it had metastasized into a giant chunk of XML [coverpages.org]. That later mutated into "XrML", the 'Extensible Rights Management Language', which eventually burrowed into MPEG-21/ISO/IEC 21000 as the standard's 'rights expression language'.

        Some terrible plans just never entirely die.
      • From the beginning of the article:

        > HTTPA, designed to fight the "inadvertent misuse" of data by people authorized to access it.

        It sounds to me like it's more similar to labeling a paper file "confidential, for xyz use only". By attaching the confidentiality information directly to the data, you seek to avoid having someone absent-mindedly email the information to a vendor without thinking about the fact that the information is supposed to be kept confidential.

    • by raymorris ( 2726007 ) on Saturday June 14, 2014 @11:56PM (#47239065) Journal

      I think you've missed the point. Quoting the beginning of the article:

      > HTTPA, designed to fight the "inadvertent misuse" of data by people authorized to access it.

      I've had this conversation more than once:

      Bob - Why did you tell people about ___? That was supposed to be a secret.
      Sally - Oh, I'm sorry, I didn't realize that was supposed to be kept confidential.

      There's also this thought: "Oops, what I just said was supposed to be kept confidential. I messed up."

      Those are the situations the protocol is supposed to address: the INADVERTENT release of confidential data. It's the digital equivalent of stamping a paper "confidential, for abc use only". Any time the system accesses the data, it is also reminded of the confidentiality rules attached to that data. This is so people can, through processes and software, avoid mistakes. For example, a client could be set up so that an attempt to copy confidential data to the clipboard instead copies the reminder "this is confidential information", so someone copying it into an email without thinking gets reminded.
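
      To make the clipboard example concrete, here's a rough sketch of such a client-side hook. The data shape and names are made up for the example; nothing here comes from an actual HTTPA implementation.

      class Clipboard:
          # Minimal stand-in for a system clipboard.
          def __init__(self):
              self.text = ""

          def set_text(self, text):
              self.text = text

      def on_copy(selection, clipboard):
          # If the selected data arrived with usage restrictions attached,
          # copy a reminder instead of the confidential text itself.
          if selection.get("restrictions"):
              clipboard.set_text("Reminder: this is confidential information ("
                                 + selection["restrictions"] + ")")
          else:
              clipboard.set_text(selection["text"])

      cb = Clipboard()
      on_copy({"text": "penicillin allergy", "restrictions": "treatment-only"}, cb)
      # cb.text now holds the reminder, not the patient data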

      • Years ago I was working as a subcontractor to a major defense contractor. I had a conversation with IT that went something like this:

        IT to all personnel: Anyone with a computer must review each file on their drive and label any that might contain confidential information. Please insert our company logo and the following text into any confidential files.
        Me to IT: To clarify, I have approximately X files on my hard drive. Do I really need to review ALL of my files?

    • Privacy, sadly, is a losing proposition.

      1) Google and advertisers track you + accumulate data.
      2) The government does the same
      3) Credit reporting agencies and banks sell your debt and credit card transaction data.
      4) Employers
      5) Insurance companies + on and on

      Facebook and Google and LinkedIn are just 3 companies built on invading your privacy and there are tons more.

      Short version: You are losing your privacy. "Not liking it", "Angry posts" and the like won't change this.

      On the plus side: They really ar
      • Short version: You are losing your privacy. "Not liking it", "Angry posts" and the like won't change this.

        So, what do you think WILL change it? Moving along with the other sheep to the other side of the field, hoping the wolf won't get you next?

        Angry posts are more likely to change it than what you seem to be advocating. Enjoy your cage.

        • >So, what do you think WILL change it?

          Government action and privacy laws are the only solution, which I can't see the government being interested in because they are one of the main perpetrators (NSA spying, etc.).

          "Angry posts" didn't stop email spam or telemarketing abuse -- someone complaining on the internet is of no concern to a company that is trying to generate revenue. Both of those were dealt severe blows by laws.
    • In essence, this is the same silly-assed idea that was mentioned here the other day, which some famous-for-something-else computer scientist has been working on for all of 50 years, or some damned thing.

      They want to take the Web and make it into some kind of holy f*king persistent interconnected-data mess, which would be broken all the time because data that is supposed to be persistent seldom is after a few years.

      I do not want the Internet to be an "interconnected database". I think if we tried to do it
  • by NoNonAlphaCharsHere ( 2201864 ) on Saturday June 14, 2014 @07:35PM (#47238381)
    So we have a stateless database with built-in DRM on every record and user tracking. Brilliant.
    • by NotInHere ( 3654617 ) on Saturday June 14, 2014 @08:00PM (#47238473)

      The original paper [mit.edu] has examples where such a DRM-based system has some legitimate uses. One was for patient data. If you want to eliminate special client software there, you can have this system and run everything in the browser. The system abstracts and standardizes the access control, which is hopefully already present, and helps to close holes in the implementation. For intranets the model makes perfect sense; deployment onto the wild wide web, however, is of course extremely harmful.

      The media didn't alter the story, as the paper already contained a discussion of www deployment, but they only picked up the bullshit non-intranet web part.

      • Or you could just airgap your intranet. Even moving sensitive pages to a different port and firewalling outside access on that port should do it.
        • But no one ever really does that. Although you can state-freeze an OS, none of the OS makers have useful constructions that allow vetted air-gap updates via media transfer.

          The entire scheme looks like a paradise for someone that wants to crack it like an egg. This, too, shall pass.

          • by Anonymous Coward

            What? No one does that? Maybe not in the US, but over here in Europe we certainly do have air-gapped networks where security is necessary (military). We were able to push Windows security updates just fine.

      • "If you want to eliminate special client software there, you can have this system, and run everything on the browser."

        Being able to reuse more browser code might well improve the existing proprietary client; but (as with any DRM system) you can't eliminate special client software, because you would otherwise be incapable of distinguishing between a web browser that happily accepts your 'HTTPA' and then ignores your restrictions, and one that accepts and obeys them.

        As ever, you'll either need some suitably d
        • Being able to reuse more browser code might well improve the existing proprietary client; but (as with any DRM system) you can't eliminate special client software

          I meant that the developers don't have to reinvent the wheel. Of course the software will be closed source, but the developers of the medical application won't need to invent their own access control; they can use this component. It's a good idea if it makes hospitals more secure.

  • by Anonymous Coward

    Privacy's on the honor system now!

  • by jhantin ( 252660 ) on Saturday June 14, 2014 @07:50PM (#47238427)

    All I see here is a bunch of stuff that depends on trusted third parties... and in security circles, "trusted" means "can screw you over if they act against your interests". In this case it relies on trusted identity providers, labeled 'Verification Agent' in the paper.

    It all breaks down if a verification agent is compromised, and the breach of even a single identity can have severe consequences that the accountability system cannot trace once information is in the hands of bad actors.

    The authors effectively admit that this entire mechanism relies on the honor system; it explicitly cannot strictly enforce any access control, because in the context of medical data access control may stand between life and death.

    Finally, the deliberate gathering of all this information-flow metadata would add another layer to the panopticon the net is turning into.

    • Silly consumer, the risk of 3rd parties undermining the system is why we need to make Trusted Computing mandatory! Secure remote attestation for all, only rights-management-compliant systems allowed on the network!
      • by jhantin ( 252660 )

        This is oddly close to what I think DRM ought to be: advisory, not enforcing. Remove the accountability aspect, not least because it's a farce that leaves the most recent honest party holding the bag, and you have my concept of an ideal DRM engine: provenance meta-tags that let you know what color your bits are [sooke.bc.ca], which you can use if it affects you or ignore if it doesn't, leaving no rights-holder the wiser no matter what course you take.

        Accountability-oriented DRM, which prevents no action but forces yo

    • This may be out there in left field, but what if: It is, I think, a fact of life that everybody is going to have an internet-connected device of some sort and that bandwidth will continue to be mostly sufficient. What if a protocol could be developed so that the contents of packets were encrypted by default and the location of the encryption key or some other permission approver was part of the packet? If the sender had to be verified (tracked), the location of the key or approver would be checked. Let's say w
  • Web browsers with DRM built in? Terrible.

    • Ah, so anyone but the legitimate user can easily access the content? Just like with the other forms of DRM?

  • Sounds to me like something the NSA would come up with: a universal database tracking everyone's access to every little thing on the internet, and the so-called 'restrictions' are as meaningless as the 'do not track' flag in a web browser. It only works when everyone is playing by the same rules.
  • Is it a bad summary or a stupid idea?

    As it is explained, it seems the system does not cover the case where someone gets the data and leaks it.

    • by tlambert ( 566799 ) on Saturday June 14, 2014 @09:02PM (#47238633)

      Is it a bad summary or a stupid idea?

      Yes.

      As it is explained, it seems the system does not cover the case where someone gets the data and leaks it.

      It's advisory access controls with voluntary indications of use, with transaction metadata logging.

      (1) The rights you could be granted are based on the object, not the actor and the object
      (2) You obtain the exported rights list
      (3) You voluntarily provide a purpose in line with the rights which are granted
      (4) Your voluntary compliance with the rights list is logged as metadata, because collection of metadata isn't controversial at all
      (5) You retrieve the data
      (6) You use it however the hell you want, because you're a bad actor
      (7) If you are a good actor, you enforce use restrictions in the client

      and...

      (8) You try to sell the idea as somehow secure, even though it's less secure than NFSv3, since NFSv3 at least requires the client to forge their ID

      So "Yes" - a bad summary, and a stupid idea.

  • This is a dumb idea that sounds like a good concept. Like any other good thing on the internet, it requires that no one be malicious. SMTP wasn't restricted until spammers abused it. All it takes to defeat HTTPA is a client written to ignore the A part.

  • The problem is distribution of trust. This is solvable.

    • Really? No one in the known universe has figured out how to enforce 'trust' on others.

      Please, enlighten me: how do you force me to honor your request not to tell your secrets to someone else? Kill me before I have the chance to do anything with the information? That's the only known method, and it still depends on no exploits, such as me finding a way to tell the guy standing next to me before you manage to kill me.

  • More hoopla, with bandwidth- and CPU-intensive DRM and user activity tracking on top. What problem is this even trying to solve?

  • under a different protocol?

  • ...which already logs unique URIs and often classifies using server-config'ed tags?

  • Maintain a physically secure, access-controlled, TEMPEST-hardened room in a secret protected location. Verify through periodic repeated inspection and test that all production media in the room is physically isolated from all untrusted communications networks (ideally, all networks). When you absolutely must share secret information with Alice, invite Alice to your room. Verify her identity, physically hand her the information to read, monitor her while she reads the information, then physically retriev
