Security

Googling Your Way Into Hacking 431

knifee writes "New Scientist is running an article explaining how hackers can use Google's cache to quickly hunt down sensitive pages, for example by searching for the terms "bash history", "temporary" and "password". Might be worth looking at this tutorial about robots.txt if you think you might be at risk." That's pretty amusing.
  • by mjmalone ( 677326 ) * on Thursday July 31, 2003 @11:45AM (#6581068) Homepage
    For example, one common filename for passwords is "bash history".

    This guy is a security consultant? Come on, what admin in their right mind would enter a password in cleartext on the command line and allow it to be stored in ~/.bash_history? The first thing I do when I log onto a box is link bash_history to /dev/null, just out of habit. The security problem isn't Google's fault, it is stupid admins who don't know what they are doing.
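
    Spelled out, that habit is just this (assuming bash and the usual file locations):

      rm -f ~/.bash_history
      ln -s /dev/null ~/.bash_history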
    • Wouldn't it be more fun to ln -s /dev/random ~/.bash_history instead?

      Would make for interesting google logs. ;)

      Don't have to worry about that particular problem. Both FreeBSD and MacOS X use tcsh by default anyway, and all of my users are Unix-stupid, so they never log into a shell.
    • is link bash_history to /dev/null
      i understand your point, but leaving .bash_history the way it is allows an admin to see if anyone has compromised security, no?

    • by gooru ( 592512 ) on Thursday July 31, 2003 @12:05PM (#6581297)
      It's not even just ~/.bash_history but ~/ itself! Who in the world would make that world-readable and publish it on the web?!? This isn't even the default for any configuration I've seen. (Does anyone else know differently?) It's one thing to spider ~/public_html or /var/www or whatever you have set up for your webserver... quite another to have ~/ published on the web. I can't believe this is a security problem for people, though I suppose it is a proven possibility.
      • You have to have execute permission on each interim directory between / and public_html (or whatever you have it set to on your server.) This is because the directory execute bit is the "change to this directory" bit. A lot of users fuck this up and just make their home directories world readable, or even writable. Just another reason to separate the user from his data whenever possible. The trick is to do it in a way that won't make them feel left out. Obviously some people are more willing to put in the t
        • By default, your history file is only readable by you, not group/world readable. Your shell actually sets this up--regardless of your umask--when it first creates the file, so only a bozo who manually changes the modes deserves what they get as a consequence.
      • One possibility is that some 'clever' admin has set the 'webmaster' user's home directory to /var/www (or whatever your docroot is). Then, as well as easy access to the html files, the .bash_* files would be left there too.
    • by dan14807 ( 162088 ) * on Thursday July 31, 2003 @12:10PM (#6581357) Journal
      > The first thing I do when I log onto a box is link bash_history to /dev/null

      unset HISTFILE
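
      ...plus a couple of related knobs if you want to go further (bash; affects the current session only):

      unset HISTFILE   # nothing gets written out at logout
      set +o history   # stop recording commands at all
      history -c       # forget what's already in memory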
    • The security problem isn't Google's fault, it is stupid admins who don't know what they are doing.

      More than once, when looking for a specific dll, I've found a whole software install in a directory on somebody's network.

    • by inertia187 ( 156602 ) * on Thursday July 31, 2003 @12:24PM (#6581484) Homepage Journal
      It's happened to me. My .bash_history has contained passwords. Why? Because I'd type too fast and not look at the screen. For example:
      bash-2.05a$ ssh inertia@whatevre
      ssh: whatevre: no address associated with hostname.
      bash-2.05a$ f33lokihum
      Oops.
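
      If you catch it in time, you can scrub the entry before it ever reaches disk (a sketch, assuming bash):

      history -d $(history 1 | awk '{print $1}')   # drop the offending last entry
      cat /dev/null > ~/.bash_history              # or just empty the file outright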
    • by Bigbutt ( 65939 ) on Thursday July 31, 2003 @12:45PM (#6581709) Homepage Journal
      Well, we had a stupid admin who, as a test, put the /etc/passwd file into webspace.

      We had another admin who tried to su to root and typed in su [root password]. We check the logs for attempts against non-existent accounts that look like garbage, and we notify the admin to change their password.
    • by jd ( 1658 ) <imipakNO@SPAMyahoo.com> on Thursday July 31, 2003 @12:53PM (#6581769) Homepage Journal
      #include "IANAL.h"


      You can probably use this to set up "honeypots" which may be legal in States where traditional fake services would be considered illegal as entrapment.


      Simply set up a virtual machine (user-mode linux is a good one for this). Have the root account publicly read/write and somehow "accidentally" visible to httpd.


      Have the login shell a program which acts as your honeypot, logging activity, tracing back to the user, etc. All the stuff honeypots do so well.


      Next is to ensure that the root password is visible, plain-text, and in a file that is visible to search engines. Your average skript kiddie is not going to question the apparent generosity of the admin. To get the engine to find the account, you probably want to have your main web page link into your virtual machine's root account - say, via an FTP link.


      Now, none of this is entrapment, because the person must pro-actively present a false identity before the service is accessed. There can be no question that the identity of any user logging in is fake, that the user logging in knows it is fake, and that there has been a deliberate, premeditated attempt to compromise an account.


      If you want to go one step further, have the login shell transfer some goodies, such as cpuburn. Now, these have to have a "legit" use by a "legit" user, as anyone who gets burned is likely to complain. You have to be able to stand your ground and say "hey, I use this service as a convenient way to do hardware tests on remote machines - I locked that account against intruders, so if an intruder gets in, it's not my fault if they get burned."


      (If you leave something dangerous "just lying around", you could probably be held accountable if someone gets hurt, even if they were stupid or malicious. But if you make a "reasonable" attempt to deny access, then it's not your problem.)


      In fact, if you do any freelance tech stuff, you might very well use the service for real as a way of fetching stress-testing software. It would make it a lot harder for "victims" of your root snare to complain, as you could then prove a legitimate use by legitimate users - the victim not being one of them.

    • This would be a good way to set up a "slightly more legit" honeypot, in States or countries where "services for the sole purpose of entrapping people" is illegal.

      Set up a virtual machine (user-mode linux might be a good choice) and make sure the root password is in a whole bunch of files that skript kiddies are likely to google for, and in which the root account might reasonably be found (if the admin is stupid, that is).

      Set the login shell to an application which creates a fake shell, and which uses th

    • Actually, I do not link bash_history to /dev/null.

      I've been compromised once, and the attacker went to great lengths to install a rootkit in /tmp/../foo , grep his IP out of the message logs, etc. etc. The only thing he forgot to do was remove the bash_history file, and I knew _exactly_ what damage he had done to my system.
  • by Anonymous Coward on Thursday July 31, 2003 @11:47AM (#6581089)
    google [216.239.51.104]
  • RIAA Logic: (Score:5, Funny)

    by connsmythe96 ( 576445 ) <slashdot AT adamkemp DOT com> on Thursday July 31, 2003 @11:47AM (#6581095) Homepage
    Google can be used to illegally hack into computers (possibly stealing copyrighted information). Google must be shut down and all of its users owe us lots of money.
    • aha! (Score:3, Interesting)

      by Frymaster ( 171343 )
      this explains the tremendous number of google searches for "index of /scripts" that end up at my site...

      of course i have a section on my site for bash scripts... and it has an index page. looks like someone got disappointed.

    • SCO Logic: (Score:5, Funny)

      by KillerHamster ( 645942 ) on Thursday July 31, 2003 @12:34PM (#6581595) Homepage
      Google uses operating systems! All your code are belong to us! Google must be shut down and all of its users owe us lots of money.
  • by Tweakmeister ( 638831 ) on Thursday July 31, 2003 @11:49AM (#6581112) Homepage
    A quick search for "Password [google.com]" doesn't yield any "promising" hacking results. It's too common a word.
  • Yea (Score:5, Funny)

    by mao che minh ( 611166 ) * on Thursday July 31, 2003 @11:49AM (#6581114) Journal
    Must be how that guy found out that my phpnuke code had a MySQL injection flaw in the news module. My article about a Hulk doll with a big penis wasn't exactly fine journalism, but I would imagine it was better than the 40 lines of "hacked by Stacey 100% brasil LOL" that it was overwritten with.

    Damn script kiddies.

  • by Anonymous Coward on Thursday July 31, 2003 @11:50AM (#6581126)
    I tried this a while back - it isn't as easy as it looks with Google. I recently discovered WhittleBit [whittlebit.com] and it is pretty good at narrowing down what you are searching for because it lets you indicate which search results are good and which aren't, and re-search on that basis.

    This is particularly useful for this type of thing since it isn't always obvious what the criteria are for what you want to search for - with WhittleBit you don't need to know, it figures it out for itself.

  • by brlewis ( 214632 ) on Thursday July 31, 2003 @11:52AM (#6581148) Homepage
    They should mention that disallowing a URI in robots.txt tells crackers which URIs on your site have sensitive information. What I do is create a top-level /unpub/ URI, and everything sensitive goes underneath it with hard-to-guess names. In robots.txt I disallow /unpub only.
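
    That way the robots.txt itself stays minimal and gives nothing useful away:

      User-agent: *
      Disallow: /unpub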
    • by PetoskeyGuy ( 648788 ) on Thursday July 31, 2003 @12:22PM (#6581466)
      I hope you at least have an .htaccess on the files to put a password on that directory. Hard-to-guess names are good, but password protection is better.

      Of course, on some of the cheaper web hosting companies out there you can just search the /home/*/web folders. They have to be public so the web server can read them. Stupid, I know, but all too common. Config.php for most apps will have all the users' passwords in plaintext.

      The HTTPD user should be a member of each user's group so you don't have to set world rights on your files. Assuming it's just hosting and no other rights are required.
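
      A minimal .htaccess for the password idea above, with made-up paths (keep the .htpasswd file itself outside the web tree):

      AuthType Basic
      AuthName "Private"
      AuthUserFile /home/someuser/.htpasswd
      Require valid-user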
      • by brlewis ( 214632 ) on Thursday July 31, 2003 @12:42PM (#6581675) Homepage

        Password-protected directories wouldn't need to be in robots.txt. Using robots.txt + security by obscurity is for things like family photos, where I don't want to maintain usernames and passwords for my entire extended family, but it isn't absolutely critical that no unauthorized person ever see them. I doubt I could trust my entire extended family to keep passwords secure anyway.

        Yeah, cheap shared hosting is largely insecure. I wonder how tough it would be to set up shared hosting using squid as an http accelerator, and let users run web servers under their own UID on different ports, while squid forwards from port 80.
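
        Very roughly, the squid side might look like this (directive names vary between squid versions, and the hostnames and ports here are invented):

        http_port 80 accel defaultsite=www.example.com
        # one peer per user, each running a personal httpd on a high port
        cache_peer 127.0.0.1 parent 8081 0 no-query originserver name=alice_www
        acl alice_site dstdomain alice.example.com
        cache_peer_access alice_www allow alice_site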

    • Even more entertaining is to add a Disallow: /secret.cgi entry, and then have secret.cgi log the IP address, datetime, etc. of requests.

      For bonus points, you can have secret.cgi automatically add requesting IPs to an apache rewrite config file.
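
      A minimal sketch of such a trap, assuming a shell CGI and a hypothetical log path (the rewrite-config part is left as an exercise):

      #!/bin/sh
      # secret.cgi: anyone requesting this deliberately ignored robots.txt
      echo "Content-type: text/plain"
      echo ""
      echo "Nothing to see here."
      # append the visitor's details to a log for later blocking
      echo "`date -u` $REMOTE_ADDR $HTTP_USER_AGENT" >> /var/log/robots-trap.log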

      Cheers
      -b
  • robots.txt? (Score:5, Interesting)

    by Karma Sucks ( 127136 ) on Thursday July 31, 2003 @11:53AM (#6581150)
    You're kidding, right? Putting stuff in robots.txt is the best way to *guarantee* that robots will go specifically for the files/directories you choose to deny.

    Don't be naive about robots.txt... expect to have to do some relatively fancy hacking to actually enforce it.
  • Sometimes it's fun to look for WSFTP.LOG files and see what people have been uploading to a website. You might find a file or two that's not linked from the other pages.

    Of course, it's not as fun as looking through the open "images" directories on angelfire pages. You always find stuff that's not linked from the main page.
  • Sesitive? (Score:3, Funny)

    by GoofyBoy ( 44399 ) on Thursday July 31, 2003 @11:53AM (#6581158) Journal

    use Google's cache to quickly hunt down sesitive pages,

    Try hacking a dictionary [reference.com].
  • robots.txt (Score:5, Interesting)

    by panaceaa ( 205396 ) on Thursday July 31, 2003 @11:54AM (#6581164) Homepage Journal
    Robots.txt only makes well-behaved search engines not index certain portions of your site. You're still going to be vulnerable until you take the sensitive pages off-line completely. But even then, if a passwords list has been indexed by Google, updating your robots.txt file won't remove it from Google's cache until Google spiders your site again. At which time, Google will discover the passwords list doesn't exist and remove it from the cache.

    At least that's how it should work. Is anyone aware of Google requesting robots.txt more often than they spider pages? And then proactively removing pages from their cache based on new robots.txt entries?

    While the article deals with Google specifically, lots of non-well-behaved spiders go through common locations looking for password files regardless of what you've blocked out with robots.txt. The only way to completely protect your data is to remove it from your site.
    • Re:robots.txt (Score:2, Interesting)

      by KenSeymour ( 81018 )
      I think you have to do more than that to get it out of the cache.

      I once had family phone numbers on a web page. Upon reflection, I decided that was no good and deleted the web page.

      It remained in the google cache until I replaced the file with a blank one with the same URL.
    • Re:robots.txt (Score:3, Informative)

      According to my experience with my webservers, Google requests robots.txt frequently as it spiders a site. And yes, they do remove pages from their cache, not only because of new robots.txt entries but also because of new META tags in individual pages.

      If you can't wait until the next time Google crawls your site to have your information removed, you can always use Google's Automatic URL Removal System [google.com]. Details are available here [google.com].

      A few months back I updated all of my web pages to include the NOARCHIVE META ta
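
      For anyone who wants to do the same, the tag in question is simply:

      <META NAME="ROBOTS" CONTENT="NOARCHIVE">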

  • robots.txt (Score:5, Interesting)

    by zero-one ( 79216 ) <jonwpayne@gmBOYSENail.com minus berry> on Thursday July 31, 2003 @11:54AM (#6581169) Homepage
    Having a robots.txt is a good idea, but it always amuses me when web sites use robots.txt to list all the areas of their site that they don't want people to look at. When robots.txt contains entries like "Disallow: /admin.asp" or "Disallow: /backdoor.asp" it stops being a way of controlling search engines and becomes a site map of all the places hackers might be interested in.

  • by stonebeat.org ( 562495 ) on Thursday July 31, 2003 @11:55AM (#6581183) Homepage
    It is always a good idea to keep the robots out of anywhere there is sensitive information. I use several methods for added security. robots.txt is a good way, but I also use the deflection technique in apache's mod_rewrite to keep the crawlers out.
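
    By way of illustration, the mod_rewrite deflection can be as little as this, with a made-up list of crawler user-agents:

      RewriteEngine On
      RewriteCond %{HTTP_USER_AGENT} (EmailSiphon|WebStripper|BadBot) [NC]
      RewriteRule .* - [F]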
  • ICQ (Score:5, Interesting)

    by bazik ( 672335 ) <bazik@NOsPam.gentoo.org> on Thursday July 31, 2003 @11:58AM (#6581211) Homepage Journal
    A friend of mine actually used this to steal ICQ numbers. He wrote a perl script which googles from "00000001.idx 00000001.dat" to "99999999.idx 99999999.dat" and spits out the result links to a textfile if it gets a full match.

    The ICQ password is stored in one of those two datafiles, and there are dozens of free decryption programs for that out there.

    But if you think about it... how or why does someone put his ICQ directory on a webserver?!

    On the other hand... some people are hosting pr0n sites and dont even know about it ;)
    • shocked! that's against google's terms of service, I hope you know

      (end sarcasm)
    • Re:ICQ (Score:3, Informative)

      by Politburo ( 640618 )
      If you're lazy and wanted to transfer ICQ information between sites, you might just toss it up on some webspace you have, download it from where you wanted it, and then forget about it forever.
  • I find it kind of depressing that even in Slashdot abstracts the word hacker isn't translated into the more correct "cracker".

    In this case, you could argue that using Google's cache to track down information for the purposes of cracking is very clever and is therefore deserving of being called a "hack", making the cracker a hacker.
  • Forgotten (Score:4, Funny)

    by orange_6 ( 320700 ) <(moc.liamg) (ta) (tlagtj)> on Thursday July 31, 2003 @12:00PM (#6581237) Journal
    So if I forgot my password, google can just tell me what it is? Can it tell me my credit card number too?
    • Can it tell me my credit card number too?

      Sure, John. I just checked. Your Visa number is 4803 1809 2273 4821, expiration 03/05.

      Your Discover card bill is overdue, though. Don't forget, according to this record, you've got 18.5% on overdue, PLUS your $15/mo late fee.

      Your 'condition' should have been cleared up by now, so why'd you refill that prescription on Tuesday? Oh, wait, I see here that you deposited three brand new $20's at the US Bank down near Santa Fe. Doing a little insurance fraud, there? :)

      Oh,

  • My favorite... (Score:5, Informative)

    by inertia187 ( 156602 ) * on Thursday July 31, 2003 @12:00PM (#6581242) Homepage Journal
    My favorite Google search phrase is:
    "Index of" "Name Last modified Size Description"
    Then you add file extensions or other things. For example:
    • mpg [google.com]
    • mov [google.com]
    • mp3 [google.com]
    • secret [google.com] - doesn't have to be file extensions...
    • "My Documents" [google.com] - yeah, that's secure...
    • etc
    Anyway, as you can see, it's pretty effective. Sometimes admins wise up, and all you have is the Google cache. But sometimes they don't, and you get to look. Thanks Google!
  • Well, duh! (Score:4, Insightful)

    by panda ( 10044 ) on Thursday July 31, 2003 @12:00PM (#6581248) Homepage Journal
    If something is meant to be private, then why even temporarily put links to it on your publicly visible pages? Additionally, if something really is private, then lock it down in the httpd.conf so that only certain IP addresses can access it. Then it's basically invisible to the rest of the world.

    Of course, if there's a bug in your server software all bets are off. Which is why it's better not to put private stuff where it can be seen on a public network.

    I would have thought that was pretty obvious.
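
    For the record, the httpd.conf lockdown is only a few lines, assuming Apache and a made-up address range:

      <Location /private>
          Order deny,allow
          Deny from all
          Allow from 192.168.1.
      </Location>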
  • BZZZZZZZT! Wrong! (Score:3, Interesting)

    by Entropy248 ( 588290 ) on Thursday July 31, 2003 @12:04PM (#6581285) Journal
    I don't think so. [google.com]

    I went through all 6 pages of results and found nothing. Ditto for searches on any of the terms individually. I imagine that searches on individual sites might be what the author is actually talking about, but I have no independent means of verifying this. This FUD detected by Entropy248. Wow. I just RTFA and tried it at home...
  • by fastdecade ( 179638 ) on Thursday July 31, 2003 @12:04PM (#6581287)
    This article gives me great ideas for a website:

    * bash.history blog - Everything I ran today
    * /dev/tty blog - Everything I typed today
    * /dev/stdout blog - Everything I saw today

    COMING SOON: Welcome to My Bank Account Details, Favourite Passwords I Enjoy Using
  • So, let me get this straight: There is cracking info on the web. And Google can be used to search the web.


    We have a situation here, folks. Something must be done!


    Well, what do you expect from "new scientist"?

  • Scuse me? (Score:5, Insightful)

    by arth1 ( 260657 ) on Thursday July 31, 2003 @12:08PM (#6581327) Homepage Journal
    Shouldn't that be bash_history, passwd and tmp?
    Was this written down by a non-techie from an audio interview?

    Regards,
    --
    *Art
  • One would _think_ that admins would protect against this now, but i'm sure many won't.

    either way, it's a sweet hack, considering that the admin won't have any logs to show how the information leaked

    -t
  • robots.txt folly (Score:3, Insightful)

    by arth1 ( 260657 ) on Thursday July 31, 2003 @12:11PM (#6581364) Homepage Journal
    It might be worth it NOT to use robots.txt for this -- after all, with robots.txt you effectively disclose to anyone who asks what you don't want to be shown.

    A robots.txt like this would be invaluable to a hacker, even though it would prevent Google from indexing:

    User-agent: *
    Disallow: /secret/passwd

    Regards,
    --
    *Art
  • by vadim_t ( 324782 ) on Thursday July 31, 2003 @12:12PM (#6581377) Homepage
    It's supposed to be used to tell bots not to access some parts of your site for reasons other than security.

    Common reasons would be that you host a site with a forum on a DSL line and don't want google to index all 5000 threads on it. It's also good for dynamic pages, for example it makes no sense to index a generated page that will be out of date tomorrow. It'll be much better to let it index the archive instead.

    Using this for security is just stupid though, as it'd contain a list of vulnerable places. Maybe it will make it harder for people to find your vulnerabilities from google, but it will help a lot whoever wants to attack you specifically.

    Security problems have to be fixed by setting proper permissions and keeping your server up to date, not by relying on every spider that comes to your site being polite enough to follow robots.txt.
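
    A quick pass with chmod covers the common cases (a sketch; the paths assume a typical shared host):

      chmod 711 ~                  # others can traverse, but not list, your home
      chmod 600 ~/.bash_history    # shell history stays private
      chmod -R o+rX ~/public_html  # only the web tree is world-readable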
  • by presroi ( 657709 ) <neubau@presroi.de> on Thursday July 31, 2003 @12:13PM (#6581382) Homepage
    Some people think that reading a page out of the google cache does not reveal their host name to the origin http-server. It does: the cached page still fetches images and follow-up pages from the original host, google-cache referrer and all.

    The result looks like this:
    proxy1.health.magwien.gv.at - - [29/Jul/2003:22:27:14 +0200] "GET /hfaq/icons/linki.png HTTP/1.0" 200 278 "http://www.google.at/search?q=cache:QIq92lU3jkUJ:www.presroi.de/hfaq/+heroin&hl=de&lr=lang_de&ie=UTF-8" "Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 5.0; ENR 2.0 emb)"
    proxy1.health.magwien.gv.at - - [29/Jul/2003:22:27:14 +0200] "GET /hfaq/icons/bt3.gif HTTP/1.0" 200 3170 "http://www.google.at/search?q=cache:QIq92lU3jkUJ:www.presroi.de/hfaq/+heroin&hl=de&lr=lang_de&ie=UTF-8" "Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 5.0; ENR 2.0 emb)"
    proxy3.health.magwien.gv.at - - [29/Jul/2003:22:27:43 +0200] "GET /hfaq/stats.html HTTP/1.0" 200 5231 "http://www.google.at/search?q=cache:QIq92lU3jkUJ:www.presroi.de/hfaq/+heroin&hl=de&lr=lang_de&ie=UTF-8" "Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 5.0; ENR 2.0 emb)"
  • by joeldg ( 518249 ) on Thursday July 31, 2003 @12:15PM (#6581399) Homepage
    I have seen more phpmyadmin pages wide open on google than anything else. Not putting things like that under htaccess at least is pure laziness and stupidity.

    Also it seems people put mysql dumps on their webservers as well..
    search for ' "SELECT * FROM credit" + "###" ' and you will see.

    This has been going on since google introduced the site cache.
  • some guide! (Score:5, Funny)

    by mblase ( 200735 ) on Thursday July 31, 2003 @12:15PM (#6581401)
    Long says an obvious combination of search terms would include the terms "bash history", "temporary" and "password".

    Hmph. When I searched for those phrases at Google, all I got were a bunch of Linux technical how-tos and code samples. If this guy wants to teach us how to be hackers using Google, he's going to have to be more helpful than that!
  • by shoppa ( 464619 ) on Thursday July 31, 2003 @12:21PM (#6581458)
    At least 5 years ago it was fairly common knowledge that if you found any webserver's access_log you would get some juicy URLs. The method still works...
  • by scarolan ( 644274 ) on Thursday July 31, 2003 @12:22PM (#6581467) Homepage
    try searching for _vti_pvt and service.pwd on Google [google.com]. There are lots of people still using frontpage 4.0 or whatever, with their frontpage password file in plain view. I won't tell you what to do with that file, if you don't know already.
  • Google Warez Machine (Score:5, Interesting)

    by dhodell ( 689263 ) on Thursday July 31, 2003 @12:24PM (#6581482) Homepage
    I posted a while back regarding the ability to use Google as a warez search machine. The article was about Google censorship, and the one response to my post pinpointed almost exactly the point that I brought up, which is the point discussed in this article. [slashdot.org]

    Google has a nice long list of directory listings containing warez (remember the days of l33t FTP searches for filenames? Google for something like, from my last article: "xwin32*.exe * * * * *" "listing of"), serial numbers (Oh, I've found XP's serial number several times in Google's cache) and other "sensitive" information. My question is: if other commercial sites are constantly being shut down over these links (intentional or not), why aren't people targeting Google as well?

    In fact, if I'm *cough*too cheap to buy software*cough* or just want to evaluate some crippleware or such before I buy it, I often skip astalavista [astalavista.box.sk] and cracks.am [cracks.am] and just Google it up. Saves me the porn and pop ups, and I don't have to cripple my browser for this (yes I know it's possible to do in other ways, yes I enjoy javascript, no thanks, I don't want comments about how I'm retarded because I don't do it the right way).

    The same goes for sites such as the Internet Archive's Wayback Machine [archive.org], which contain other sensitive information.

    Because of the academic merit of both of these search mechanisms, I doubt either one will be shut down. Indeed, I highly doubt restrictions will be placed. They're valuable tools for finding more valuable tools. For more information about this sort of stuff, I suggest searching on Fravia+'s web-searching lore [searchlores.org]. Other information on there relates to "reality cracking", reverse engineering, and other taboo topics. Google's got it all cached. Interested? Just search for (insert topic here) site:searchlores.org.

    Sometimes I don't think the comparison of Google to God is that far off. Pardon my heresy.
      it is not Google's responsibility to monitor what other people on the net are doing.

      Besides, that sword has two sides: if someone intending malice uses google, then a law enforcement agency can also use it.

  • by Rahga ( 13479 ) on Thursday July 31, 2003 @12:28PM (#6581520) Journal
    I honestly know of nobody else who uses this technique; I just figured I would try it back when I was hunting down upgrades for old games like Quake 2 while places like FilePlanet were getting hammered:

    At google, type "index of", followed by the precise name of the file you are looking for.

    I'd say this gives me good results on a fast server 95% of the time.
  • damn it... (Score:3, Informative)

    by edrugtrader ( 442064 ) on Thursday July 31, 2003 @12:33PM (#6581575) Homepage
    if only slashdot's search was as good as google's, i could point out that this is the third time in a year this "story" has been run.
  • by lawpoop ( 604919 ) on Thursday July 31, 2003 @12:33PM (#6581585) Homepage Journal
    I tried "bash history", "password", and "temporary", hit "I feel lucky" and I didn't get to hack anything.

    I guess I don't have the patience to be a real hacker.

  • A little bit OT (Score:3, Informative)

    by edmz ( 118519 ) on Thursday July 31, 2003 @12:47PM (#6581724) Homepage
    Not the same kind of "hacks", but more than one might have missed that O'Reilly published [oreilly.com]recently Google Hacks [amazon.com]. Mostly targeted to webmasters or "power users".
  • by hohokus ( 253713 ) on Thursday July 31, 2003 @01:22PM (#6581974)
    while randomly googling for "index of" and ".bash_history", i found this, which may be amusing:

    http://www.smart-dev.com/texts/google.txt [smart-dev.com]
