
Robots in Firefox


implague


1. Go to the address bar in Firefox

2. Delete the URL

3. Copy and paste this:

about:robots

:lol:
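
If you'd rather not paste it by hand, here's a minimal sketch that opens the same page from Python, assuming Firefox is installed and that Python's standard webbrowser module can find it under the name "firefox":

import webbrowser

# Grab a controller for Firefox; raises webbrowser.Error if it isn't registered.
firefox = webbrowser.get("firefox")

# Open the easter egg, same as typing it into the address bar.
firefox.open("about:robots")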



  • Administrator

Also:

about:logo

about:mozilla

about:cache

;)



All the about: pages are listed here: http://kb.mozillazine.org/About_protocol_links



DreamHaters

Yeah, I saw this when 3.0 was released... Random as hell.



Odd, I thought for about two seconds that this was about the robots.txt file you can place in a directory to stop Firefox from prefetching..



  • Administrator

robots.txt can be used for many things, and it can be very unsafe at times. If you get what I'm saying. ;)



Not this one, which I use on all of my servers:

# Block every compliant crawler from the whole site
User-agent: *
Disallow: /

This simply tells any crawler or other client that honours robots.txt not to fetch anything from the server, whether the items are indicated in pages (which is unlikely, since I control and write my own) or found by pre-fetching the links and resources in directories on the server using idle bandwidth or whatever. So crawling the server, by a bot or by a browser's own pre-fetching (together with some changes in the server config), is disallowed. That saves bandwidth, avoids unnecessary load, and avoids caching of inaccurate data that can become outdated in a split second. So this isn't technically dangerous when used as intended, BUT an individual can (if I remember right) design the document to pre-fetch items, and the same can be done with any webpage; they can even target specific browsers. So my concerns were efficiency, very little upload bandwidth, and hacking, which I did some of myself to see what I needed to protect and how. That's the reason my sites are the way they are. Sometimes I even forget why I did something a certain way, and then I have to work out what I did, why, and how it works. LOL (I know, sounds stupid)..
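
For anyone curious how a well-behaved crawler actually reads those two lines, here's a minimal sketch using Python's standard urllib.robotparser (the hostname and user-agent string are just placeholders):

from urllib import robotparser

# Feed the parser the same rules as the robots.txt above.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# Every path is off-limits to every crawler that honours the file.
print(rp.can_fetch("SomeBot/1.0", "https://example.com/index.html"))  # False
print(rp.can_fetch("SomeBot/1.0", "https://example.com/private/"))    # False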

There are quite a few things to protect, and some nice means of fine-grained control.

BTW: Don't get me wrong, pre-fetching itself can be useful, like being able to load a page and have the other pages readily available on demand. Sometimes it can make the site much quicker to load.



  • Administrator

The reason I say it can be very unsafe is that most admins all over the internet use it to stop bots from accessing certain areas, including areas the admin doesn't want a bot or a search engine to know about. So if a hacker knows this trick, or remembers it well, he can read those paths straight out of the file and exploit them for his own benefit.

Even a site like microsoft.com, and many other big sites, have exposed hidden areas in their robots.txt that a normal person wouldn't have dreamt of. Just open robots.txt on any site you visit and have a laugh at the kind of security the site's admin keeps.
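
The "have a laugh" part is easy to reproduce; here's a minimal sketch that fetches a site's robots.txt and prints every Disallow rule it advertises (the hostname is a placeholder, swap in whatever site you're curious about):

from urllib.request import urlopen

# Fetch the file exactly as any visitor or bot could.
with urlopen("https://www.example.com/robots.txt") as resp:
    text = resp.read().decode("utf-8", errors="replace")

# Each Disallow line is a path the site owner would rather you didn't poke at.
for line in text.splitlines():
    if line.strip().lower().startswith("disallow:"):
        print(line.strip())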

Admins and site owners rarely take this point into consideration, which is why it results in so many unwanted disclosures. Some are big, some are just small enough to entertain a small man. :lol:

Your robots.txt is fully safe though. :P



Yeah, that is why you use other methods of protection as well. Places like Microsoft and others put these entries in so that they don't get circumvented in some other way, but if I have something I want protected, well, you're going to have to do more than hack my server.. LOL



  • Administrator

Haha. Only an expert can hack a server, and as far as I know you keep tight security, so it would surely be very hard. :P


