Trackerless BitTorrent Beta
jgarzik writes "BitTorrent development is occurring at a furious pace. At the beginning of May, an Azureus update added distributed tracker and database features. Yesterday, Bram updated BitTorrent to include support for trackerless torrents in the new BitTorrent 4.10 beta."
I'm curious (Score:2, Interesting)
Diluting its strengths? (Score:5, Interesting)
If you lower the cost of entry to producing a BT release, won't that mean more .torrent files swimming around? With so many different torrents everywhere, won't that dilute the power of BT?
Is it legal to post only in questions?
Re:How (Score:5, Interesting)
There still is a target (Score:5, Interesting)
Does this really change... (Score:3, Interesting)
If this technology takes off (Score:5, Interesting)
1) **AA will squirm for a while
2) **AA will work harder than before to monitor and restrict user rights on the internet, via congressional purchases, er, I mean lobbying.
I think #2 will ultimately be futile in that it will not slow their loss of control over media content distribution (and copyright violation) but it will make life unpleasant for many...
Re:Does this really change... (Score:5, Interesting)
Cat and mouse at its best (Score:5, Interesting)
Is this a fight to the death?
I guess nothing will beat private exchange? (DRM)
no bittorrent download upgrade option? (Score:3, Interesting)
I was a bit surprised that the download for the upgrade didn't have a BitTorrent option. Isn't that ironic? Or did I miss the link on bittorrent.com?
so quick question... (Score:3, Interesting)
Not sure how it'd work otherwise, but this gives each torrent a single responsible party for its uploading. On the plus side, they could limit access to the download client tables to people who need it and upload valid torrents.
Curious, and no, I'm not just using it for legitimate torrents, but I pay for my cable and I'd rather keep stuff on my file server than on a TiVo with a crappy interface.
Re:Won't stop the RIAA/MPAA (Score:5, Interesting)
Something like i2p? [i2p.net]
Re:Diluting its strengths? (Score:5, Interesting)
As for the second point, imagine a scenario where I have a big file (perhaps an iso) and I create and upload a torrent for it.
Now, in the old model there are only a few places you could have uploaded your torrent, so a duplicate would be easy to spot.
However, in the new model you won't notice, and the internet will have some people downloading via your torrent and others downloading via your friend's even though the data being shared is identical. At least, that was the grandparent's concern, and I suspect they are right.
Re:Diluting its strengths? (Score:3, Interesting)
I guess what I'm saying is -- torrents are a popularity contest. You can't win by being a poser.
Re:Diluting its strengths? (Score:5, Interesting)
Re:Since TFA is a bit short on details... (Score:5, Interesting)
Re:There still is a target (Score:2, Interesting)
Re:So... (Score:4, Interesting)
Hitting some of the larger college campuses would be a good start. Some colleges will fight, but until the precedent is set, others will block, and the highest bandwidth users will be offline.
Re:Diluting its strengths? (Score:1, Interesting)
Re:So...Idle Hands are... (Score:5, Interesting)
Inefficient network use also wastes money -- which could be used for charity. And you're forgetting a fundamental right that all humans must have: freedom of speech.
Re:So...Idle Hands are... (Score:5, Interesting)
There are a lot of people--I can't say whether this is true of the BT developers or not, as I don't know them--who are interested and drawn to projects that have a hint of subversion as well as technical challenge to them. Given the popularity and rate of development of such projects, this seems rather obvious.
Even better: Dijjer! (Score:3, Interesting)
Like this, it's a distributed publishing system without any sort of tracker, but without torrent files either. In Dijjer you make requests from your web browser through a proxy server that's your interface to the rest of the system.
It's different in that all of the data being distributed exists in a single system, not in grouped systems of people interested in the same file. Therefore there's a lot less concern about there being too few peers signed on to make the system work.
Horrible idea as far as product quality goes (Score:3, Interesting)
Re:I'm curious (Score:5, Interesting)
I am actually hoping somebody will make a plugin so Azureus will act as an i2p router and not have to rely on an externally configured app.
Distributed tracking AND total anonymity: let the party begin.
Load balancing (Score:1, Interesting)
Re:Since TFA is a bit short on details... (Score:4, Interesting)
It sounds like they are both doing nearly the same thing, so if somebody beat you to the punch, why release an implementation of distributed tracking that's only slightly different, but just different enough to be incompatible?
Re:Since TFA is a bit short on details... (Score:4, Interesting)
Re:So... (Score:5, Interesting)
This is the reason why DHT, as the monkeys released it, is a Bad Thing(tm). They should've erred on the side of caution and assumed torrents were "private" unless explicitly marked otherwise. Because they added the "private" flag to the info dictionary, sites cannot retroactively privatize their torrents -- it changes the info_hash, which is the exact reason why the monkeys put it there (where it technically doesn't belong.)
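The mechanics of the complaint are easy to demonstrate: the info_hash is the SHA-1 of the bencoded info dictionary, so flipping any key inside it (including "private") produces a different hash and therefore a different swarm. A minimal sketch, with a toy bencoder and made-up info fields (not a full implementation):

```python
import hashlib

def bencode(obj):
    """Minimal bencoder covering only the types used here (a sketch, not a library)."""
    if isinstance(obj, int):
        return b"i%de" % obj
    if isinstance(obj, bytes):
        return b"%d:%s" % (len(obj), obj)
    if isinstance(obj, str):
        return bencode(obj.encode())
    if isinstance(obj, list):
        return b"l" + b"".join(bencode(x) for x in obj) + b"e"
    if isinstance(obj, dict):
        # Dictionary keys must appear in sorted order in bencoding.
        return b"d" + b"".join(bencode(k) + bencode(v) for k, v in sorted(obj.items())) + b"e"
    raise TypeError(type(obj))

# Hypothetical info dict; real ones carry piece hashes etc.
info = {"name": "example.iso", "piece length": 262144, "pieces": b"\x00" * 20, "length": 1}
public_hash = hashlib.sha1(bencode(info)).hexdigest()

info["private"] = 1  # retroactively marking it private...
private_hash = hashlib.sha1(bencode(info)).hexdigest()

assert public_hash != private_hash  # ...changes the info_hash, splitting the swarm
```

Which is why the parent says sites can't privatize existing torrents: every client would end up in a brand-new swarm under the new hash.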
Re:Hmm... (Score:5, Interesting)
(all you have to do is join the swarm and sit back and log all the IPs reported by the tracker and from all the inbound connections.)
Re:Diluting its strengths? (Score:2, Interesting)
This would in fact be a -huge- step forward, because then you would have everyone sharing the file together instead of people on The Pirate Bay sharing it, and people on BT Efnet (RIP) sharing it, and people on TV Torrents sharing it, all via different trackers, diluting the potential upload power by separating rather than combining.
Of course, this distributed tracker might in fact eliminate all safeguards against leechers. Then again, those who really cared about that could just keep using their online sites instead of the distributed tracker.
Re:Diluting its strengths? (Score:3, Interesting)
Re:So...Idle Hands are... (Score:5, Interesting)
A week in 96 degree sun building houses for the homeless.
AND I also like BT.
I agree the artists need some money to keep working. I disagree that they won't write or create new art unless they get millions of dollars. I really disagree that the middlemen who do nothing that can't be replaced by BT should get rich. I donate money to artists (via magnatune among others) where I know the artists are actually going to see a majority of the money and I've established that I like the art.
I also try some stuff, don't pay for it, don't bother to delete it but never listen to it again.
There are now more quality songs/art/TV shows/movies than I could watch or listen to if I spent every day from waking to sleeping consuming them. Only monopolies are holding up the prices, but the glut is coming and prices will drop.
Losing Centralized tracker is not good (Score:5, Interesting)
Actually the centralized tracker is a very important thing. It decides who downloads what. Without the central tracker the effort will not be that synchronized.
I was expecting the development to move toward making the tracker redundant by creating a super tracker that would track the trackers.
Also, eMule has it better in that it can determine that multiple names of a file are actually the same file, based on the same hash.
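That trick is just content-addressing: identify a file by a hash of its bytes, so the filename doesn't matter. A minimal sketch (eMule actually uses MD4-based hashes; SHA-1 here is only for illustration, and the filenames are made up):

```python
import hashlib
import os
import shutil
import tempfile

def content_id(path, chunk=1 << 16):
    """Hash a file's contents in chunks, ignoring its name entirely."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

tmp = tempfile.mkdtemp()
a = os.path.join(tmp, "Some.Movie.XviD.avi")
b = os.path.join(tmp, "some_movie_final_v2.avi")
data = os.urandom(4096)
for p in (a, b):
    with open(p, "wb") as f:
        f.write(data)

id_a, id_b = content_id(a), content_id(b)
assert id_a == id_b  # same bytes, different names -> same identity, one merged swarm
shutil.rmtree(tmp)
```

A super tracker doing the same comparison could merge trackers that are unknowingly serving identical files.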
I would think it would be better to have super trackers track the trackers, with multiple super-trackers tracking the same tracker. And each super tracker would be tracking multiple trackers. Super trackers would provide the search capabilities, and would share tracker information among themselves. They would also provide tracker redundancy. They would also be able to determine if the different file names are in fact the same file, and merge several trackers into one.
I think the peers with good bandwidth and with the most completed parts would become the tracker. The benefit of being the tracker would be that you get the file faster, because the tracker would obviously give itself the benefit. Then, when the tracker has completed its own file, a new tracker would be selected.
What do people here think?
Re:So...Idle Hands are... (Score:3, Interesting)
You have just given yourself the answer you were looking for. Freenet makes it *very* difficult to track down the sources of files. If you're downloading music or videos, it is sufficiently anonymous for what you're doing.
But as is pointed out on several sites discussing Freenet, if you're a dissident trying to release information, you could still be in for a whole lot of trouble...
Re:Right. This only solves part of the problem (Score:3, Interesting)
Re:So... (Score:2, Interesting)
Requiring, nay, demanding sites regenerate all their torrents is a lame answer and a dangerous precedent -- do you want to go through this again every 3 months when some other idiots do something stupid like this? It's also impossible. You're talking about recoding millions of torrents, forcing every single user to re-download each torrent (not just the az users, every f'ing user), and deal with the information leakage and "lost stats" for users who don't grab the new torrents before DHT hands out their personal and unique torrent. The Azureus developers really failed to give this shit any thought at all (which is all too common with them.) [where's the support for specifying peer sources per torrent, for example?]
Re:So...Idle Hands are... (Score:3, Interesting)
(a) "You first. If you eliminate all your activities of which I disapprove, I'll reciprocate."
(b) No one is obligated to give. That's one of the things that freedom means.
(c) I give already in other ways. I have given enough.
Re:So...Idle Hands are... (Score:3, Interesting)
Suppose, using some new hypothetical p2p program, my client uses one network, say, Gnutella, to search for a title. Using Gnutella, my client downloads a file of instructions that describes how to reassemble what I want using various numbered blocks. (For example, a block's number might be its SHA-256 hash) Next, my client searches the network, maybe using a completely different network or protocol, for each of the block numbers. The downloaded blocks are labeled with a B, as in B58273838922837389. The reassembled content file, the file I originally searched for, is made up of blocks labeled with a C, as in C1, C2, C3, etc.
So the file I want is reassembled, according to the list of instructions, like this....
C1 = B166 xor B224
C2 = B338 xor B426
C3 = B872 xor B998
C4 =...
C5 =
etc.
(Drawback, I used double, or triple or more, of the bandwidth necessary to download the file.)
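The scheme above is essentially a one-time pad: each content block is the XOR of two stored blocks, and either stored block alone is uniform random noise. A minimal sketch (the block labels B166/B224 and the texts are the made-up examples from above):

```python
import os

def xor(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# Pad both plaintexts to the same block size for the demo.
c1 = b"When in the Course of human events".ljust(48)
other = b"In the beginning God created the heaven".ljust(48)

b166 = os.urandom(48)      # a random block: pure noise on its own
b224 = xor(c1, b166)       # paired with B166, it yields C1
b_alt = xor(other, b166)   # the SAME B166 paired differently yields different text

assert xor(b166, b224) == c1      # reassembly per the instruction list
assert xor(b166, b_alt) == other  # so is B166 "part of" C1, or of the other text?
```

This is the point of the argument: B166 by itself encodes nothing in particular, since for any target text there exists a partner block that reassembles to it.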
So which IP did I get the infringing content from?
Remember, each block could be found using a different mechanism: Gnutella, OpenNap, HTTP, etc. Each block is just a bunch of random bits, indistinguishable from noise.
Well, the beginning of the file, C1, was created from blocks B166 and B224. (Of course, they would have much longer block numbers.) But block B166 combined with some other block on the network results in part of The Declaration of Independence. And block B224 combined with yet another block results in part of The Bible. So was B166 or B224 infringing?
And which IP address gave me the infringing content?
The gnutella node that gave me the reassembly list didn't give me any actual infringing content, just a bunch of numbers. I suppose that the reassembly list could also have been a file that was recursively shared using the Blocks scheme I describe here. Thus I might have to reassemble something, only to find out that I have reassembled a new reassembly list (as long as I knew up front that this would be the case).