implague Posted April 8, 2010
1. Go to the address bar in Firefox.
2. Delete the URL.
3. Copy and paste this: about:robots :lol:
DKT27 (Administrator) Posted April 8, 2010
Also: about:logo, about:mozilla, about:cache ;)
Bizarre™ Posted April 8, 2010
@DKT27: You're a :smartass:
DKT27 (Administrator) Posted April 8, 2010
Get MR Tech Toolkit and you will get more of those. ;) It has a hell of a lot of features. :)
DoctorWho Posted April 8, 2010
All the about: pages are listed here: http://kb.mozillazine.org/About_protocol_links
spootnack Posted April 8, 2010
This is cool! :dance2: I didn't know! Thanks. ++
DreamHaters Posted April 8, 2010
Yeah, I saw this when 3.0 was released... random as hell.
HX1 Posted April 8, 2010
Odd, for about two seconds I thought this was about the robots.txt file you can place in a directory to stop Firefox from prefetching.
DKT27 (Administrator) Posted April 9, 2010
robots.txt can be used for many things, and it can be very unsafe at times, if you get what I'm saying. ;)
HX1 Posted April 9, 2010
Not this one. I use this in all of my servers:

User-agent: *
Disallow: /

This simply disallows any crawler from fetching items linked in pages (which is unlikely anyway, since I control/write my own), and prevents browsers from prefetching the links and resources in directories on the server using idle bandwidth. So crawling the server, whether by a bot or by normal browser prefetching (along with some changes in the server config), is disallowed. That saves bandwidth, avoids unnecessary load, and avoids caching of inaccurate data that can become outdated in a split second. SO this isn't technically dangerous when used as intended. BUT an individual can (if I remember right) design the document to prefetch items; the same can be done on any webpage, and they can even target specific browsers. So my concerns were efficiency (very little upload bandwidth) and hacking, which I did some of myself to see what and how I needed to protect various areas. That's the reason my sites are the way they are; sometimes I even forget why I did something a certain way, and then I have to remember what I did, why, and how it works. LOL (I know, sounds stupid). There are quite a few things to protect, and this gives nice, accurate means of control. BTW: don't get me wrong, this method can be useful, like being able to load a page and have other pages readily available on demand; sometimes it can make the site much quicker to load.
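A compliant crawler honors that blanket rule via the robots exclusion protocol. A minimal sketch using Python's standard urllib.robotparser (the host and rules here are hypothetical; note that the standard wildcard user-agent token is `*`):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt with the blanket disallow described above.
rules = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Every path is off-limits to any compliant bot.
print(parser.can_fetch("Googlebot", "http://example.com/index.html"))  # False
print(parser.can_fetch("*", "http://example.com/anything/"))           # False
```

Nothing forces a crawler to obey this; it only works against well-behaved bots.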
DKT27 (Administrator) Posted April 9, 2010
The reason I say it can be very unsafe is that most admins all over the internet use this to stop bots from accessing certain areas, including areas the admin doesn't want a bot or a searcher to know about. Thus, if a hacker knows this method and remembers it well, he can exploit it for personal benefit. Even sites like microsoft.com and many other big sites have shown hidden areas in their robots.txt that a normal person wouldn't have dreamed of. Just open robots.txt on any site you visit and have a laugh about the security the site's admin keeps. Admins and site owners have rarely taken this point into consideration, and that's why it results in many unwanted disclosures, some big, some small. :lol: Your robots.txt is fully safe though. :P
HX1 Posted April 9, 2010
Yeah, that is why you use other methods of protection. The reason places like Microsoft and others put these in is so that it doesn't get circumvented in some way. But if I have something I want protected, well, you're going to have to do more than hack my server. LOL
DKT27 (Administrator) Posted April 9, 2010
Haha. Only an expert can hack a server. And as far as I know you keep tight security, so it would surely be very hard. :P
Tucker Posted April 9, 2010
Fun stuff! :)
haxx0rxx Posted April 9, 2010
Cool stuff, especially the robots. lol.
Archived
This topic is now archived and is closed to further replies.