
Thread: HOSTS file - Tacky messages


  1. #1
    Retired Guest

    HOSTS file - Tacky messages

    http://mewnlite.com/unable.jpg
    There oughta be a way browsers could detect when a site is being blocked by
    a hosts file and return a friendlier & much shorter message!

    --
    -- I'm retired. I was tired yesterday. I'm tired again today --

  2. #2
    David H. Lipman Guest

    Re: HOSTS file - Tacky messages

    From: "Retired" <senile@nursinghome.nat>

    > http://mewnlite.com/unable.jpg
    > There oughta be a way browsers could detect when a site is being blocked by
    > a hosts file and return a friendlier & much shorter message!
    >


    That's the nature of using the etc/hosts file in a way that it wasn't intended.

    It is real simple - If you don't like it, don't use it.

    --
    Dave
    Multi-AV Scanning Tool - http://multi-av.thespykiller.co.uk
    http://www.pctipp.ch/downloads/dl/35905.asp



  3. #3
    siljaline Guest

    Re: HOSTS file - Tacky messages

    Retired wrote:
    > http://mewnlite.com/unable.jpg
    > There oughta be a way browsers could detect when a site is being blocked by
    > a hosts file and return a friendlier & much shorter message!


    Depends on which Hosts file you're referring to.
    Try this (note: I can't guarantee results as I'm not a Moz user):
    <http://winhelp2002.mvps.org/hostsfaq.htm#Firefox_only>

    Silj


    --
    "Arguing with anonymous strangers on the Internet is a sucker's game
    because they almost always turn out to be -- or to be indistinguishable from
    -- self-righteous sixteen-year-olds possessing infinite amounts of free time."
    - Neal Stephenson, _Cryptonomicon_


  4. #4
    VanguardLH Guest

    Re: HOSTS file - Tacky messages

    Retired wrote:

    > There oughta be a way browsers could detect when a site is being
    > blocked by a hosts file and return a friendlier & much shorter
    > message!


    Well, think about it. What is happening when you use a hosts file? You
    are telling your computer to use the IP address that *you* specified to
    look up a hostname that you specified. A DNS lookup is required to
    convert a hostname, a text string that humans like to use, to an IP
    address, a binary value required by computers to connect to each other.
    When doing a DNS lookup, your OS will first do a lookup in the local
    hosts file (which equates a hostname to an IP address). Well, what is
    your web browser trying to connect to? A web server! So run a web
    server at the IP address you specify. If you specify 127.0.0.1 for the
    IP address then run a web server at that IP address (i.e., your local
    host). You could run a web server on a different one of your intranet
    hosts and point your hosts file over there, like running a web server on
    your host at 192.168.10.30 and defining the lookups in your hosts file
    to use that IP address (192.168.10.30 hn1.doubleclick.net). Your web
    browser is trying to connect via HTTP to a web server so give it one at
    whatever IP address you use in the hosts file.
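
    You can watch that lookup order from any scripting language. A minimal
    sketch in Python (the hostname is only an illustration; substitute one
    your own hosts file actually maps):

      # The OS resolver, reached here through Python's socket module, checks
      # the local hosts file before asking a DNS server.
      import socket

      print(socket.gethostbyname("ad.doubleclick.net"))
      # prints 127.0.0.1 if your hosts file has:  127.0.0.1  ad.doubleclick.net
      # prints the real address otherwise (a normal DNS lookup)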

    At the web server to which you point using the IP address in the hosts
    file, deliver a web page that tells you that some content got blocked,
    like "<host> access blocked by hosts file". Don't make it too big
    because some content might be small images inside a web page that you
    are blocking and you want to see your blocking message. Unless you did
    a custom install of your *unidentified* operating system, it likely
    included a web server, so you already have one installed. Use that one
    to deliver your replacement page for the blocked content. One
    suggestion is to use an image for the replacement content instead of
    text. A web page might have a small area for the content that you want
    to block which is resized depending on the window size for the web
    browser, and an image lends itself well to resizing.
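
    If you'd rather not set up a full web server just for this, a few lines
    of script are enough. A minimal sketch in Python (run it with enough
    privilege to bind port 80; the message text is just an example):

      # Tiny web server that answers every request with a short "blocked"
      # page. Point your hosts file entries at 127.0.0.1 and this is what
      # the browser shows in place of the blocked content.
      from http.server import BaseHTTPRequestHandler, HTTPServer

      class BlockedHandler(BaseHTTPRequestHandler):
          def do_GET(self):
              body = b"<html><body><small>blocked by hosts file</small></body></html>"
              self.send_response(200)
              self.send_header("Content-Type", "text/html")
              self.send_header("Content-Length", str(len(body)))
              self.end_headers()
              self.wfile.write(body)

          def log_message(self, *args):
              pass  # keep the console quiet

      HTTPServer(("127.0.0.1", 80), BlockedHandler).serve_forever()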

    There is a freeware utility web server used just for this purpose; see
    http://www.softpedia.com/get/Interne.../eDexter.shtml.
    You install the product on your computer and it is, in effect, a web
    server. Since it runs on your own computer, it listens at IP address
    127.0.0.1. So when your web browser asks for a hostname that matches
    one in the hosts file, and because your hosts file points at 127.0.0.1,
    your web browser connects to this local web server, which shows its
    replacement content ("Content blocked"). I never
    bothered to look much into that tool since all it did was run a local
    web server and most operating systems already include their own.

    Whether using eDexter or another web server, and when you also want to
    run your own real web server on your own host, there'll be a problem
    deciding which web server your (or anyone else's) web browser connects
    to. You can't have 2 web servers running at 127.0.0.1 (localhost) and
    both listening on the default HTTP port (80). The DNS scheme (whether
    using a DNS server or the hosts file) doesn't know about ports. It
    provides a lookup from hostname to IP address. That's it. However, if
    you want to run your own real web server that is accessible to others,
    you can have it listen on a port other than 80, like 8080. Then, in the
    port forwarding that you'll need to configure in your router to punch a
    hole through its firewall and allow outside access to your host and its
    web server, you have it forward connections arriving on port 80 on
    itself (the router) to port 8080 on your internal web server host.
    Router port 80 --> web server host port 8080. Obviously, if you're
    going to punch a hole through the router's firewall and run a web
    server on your internal host, then you'd better be sure to lock down
    and secure that internal host on your network. You would use one web
    server for the hosts file lookups (your web browser attempts to reach a
    hostname on port 80, but the hosts file points to your own web server
    listening on port 80, which delivers its replacement content). Another
    web server on the same host listens on a different port (8080), which
    you use for external requests to your host (you can even go out and
    come back in to your own host), and you use port forwarding to go from
    port 80 on your WAN-side connection (the router's upstream side) to
    port 8080 on your web server host. (By the way, if you want to run
    your own web server and want to identify it by name instead of having
    to remember the current dynamically assigned IP address your ISP gave
    your router, look at DynDNS or No-IP for free dynamic DNS service.)
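
    A sketch of that split, again in Python (the served directory is a
    hypothetical path, and the router side -- WAN port 80 forwarded to LAN
    port 8080 -- still has to be configured in the router itself):

      # One process, two listeners: the "blocked" responder owns
      # 127.0.0.1:80 for hosts-file redirects, while the real site listens
      # on 0.0.0.0:8080 for connections forwarded from the router's WAN
      # port 80.
      import threading
      from functools import partial
      from http.server import (BaseHTTPRequestHandler, HTTPServer,
                               SimpleHTTPRequestHandler)

      class BlockedHandler(BaseHTTPRequestHandler):
          def do_GET(self):
              self.send_response(200)
              self.send_header("Content-Length", "7")
              self.end_headers()
              self.wfile.write(b"blocked")

      blocker = HTTPServer(("127.0.0.1", 80), BlockedHandler)
      real_site = HTTPServer(("0.0.0.0", 8080),
                             partial(SimpleHTTPRequestHandler, directory="/srv/www"))

      threading.Thread(target=blocker.serve_forever, daemon=True).start()
      real_site.serve_forever()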

    Alternatively, use some security software that does this for you. Some
    security products will let you do URL blocking (rather than use a hosts
    file). A disadvantage with the hosts file is that you have to specify a
    host, not a domain. You cannot block on .doubleclick.com in a hosts
    file. You HAVE to specify the FQDN (fully qualified domain name) and
    that includes specifying the host. If you look at the MVPs hosts file,
    it has over 50 entries alone for the Doubleclick domain to list the
    hostnames for all those that are known for that domain. In a security
    product that lets you do URL blocking, you can just specify to block on
    .doubleclick.com and .doubleclick.net to cover that domain. The
    security product will intercept connects to those domains and, like a
    web server, deliver to your web browser a small message as the
    replacement content for the blocked content. For example, Avast lets
    you do URL blocking and it puts a small message as replacement content.

    Note that if your intent is to block others, like your children, from
    getting to a web site (versus blocking ads), they can still use an IP
    address to get there. The hosts file or other URL blocking works on
    hostnames, not on IP addresses. Like DNS, they are returning an IP
    address for the hostname you specified. Obviously they aren't involved
    if a lookup isn't needed because you specified an IP address. Also
    remember that anything you can do on your computer can be undone by
    anyone you grant physical access to that computer, even if they don't
    have an admin-level account on that computer. If your intent is to
    censor what others can see using your computer, do the filtering
    upstream at a device to which those users do not have physical access,
    like in the router or in a gateway host. If you're doing ad-blocking,
    you don't need to worry, since you're only censoring yourself.

    Be aware that web sites can determine if you are not downloading all of
    their content. If you choose to block some of the content, they can see
    that and choose not to provide you with some of their content (i.e.,
    deliver to you a limited page) or entirely block delivery of their web
    page (after detecting you don't get some of it). In fact, some blocking
    can render a page worthless. For example, if you block
    "google.com/adsense/" then the page might not paint correctly because
    the site is using off-site scripts that come from the other domain.
    Most users are using a hosts file that was pre-compiled by someone else,
    like the MVPs hosts file, so you can run into problems at some sites
    regarding loss of their content.

    With a hosts file, you have to manually rename that file to something
    else and reload/refresh your web browser to get the web page to display
    correctly, then later remember to rename the hosts file back to continue
    using it again. Adding the site to your Trusted Sites list won't
    help because you are using the hosts file to redirect the connects to
    127.0.0.1 (or whatever IP address you specify; I find 127.0.0.0 works
    faster since the computer doesn't waste time trying to find a web
    server). It is a manual process (or you could write a batch file to
    script the renames) to get the hosts file temporarily out of the way.
    With a security product that includes URL blocking, it's usually pretty
    simple to quickly disable just its URL blocking.
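
    If you go the script route, the toggle is only a rename. A sketch in
    Python rather than a batch file (the path is the usual Windows
    location, an assumption; run it elevated, and flush the DNS cache
    afterwards with ipconfig /flushdns):

      # Swap the hosts file out of the way (hosts -> hosts.off) or back
      # again, depending on which file currently exists.
      import os

      HOSTS = r"C:\Windows\System32\drivers\etc\hosts"
      PARKED = HOSTS + ".off"

      if os.path.exists(HOSTS):
          os.rename(HOSTS, PARKED)
          print("hosts file parked; blocking is off until you run this again")
      elif os.path.exists(PARKED):
          os.rename(PARKED, HOSTS)
          print("hosts file restored; blocking is back on")
      else:
          print("no hosts file found at", HOSTS)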

  5. #5
    JD Guest

    Re: HOSTS file - Tacky messages

    Retired wrote:
    > http://mewnlite.com/unable.jpg
    > There oughta be a way browsers could detect when a site is being blocked by
    > a hosts file and return a friendlier & much shorter message!
    >


    This works for SeaMonkey, which uses Firefox as its browser:

    about:config

    Set browser.xul.error_pages.enabled to false.

    --
    JD..

  6. #6
    Retired Guest

    Re: HOSTS file - Tacky messages

    JD <JD@example.invalid> wrote in
    news:6vSdnZsl3IoZYjbTnZ2dnUVZ_hydnZ2d@posted.grandecom:

    > Retired wrote:
    >> http://mewnlite.com/unable.jpg
    >> There oughta be a way browsers could detect when a site is being
    >> blocked by a hosts file and return a friendlier & much shorter
    >> message!
    >>

    >
    > This works for SeaMonkey, which uses Firefox as its browser:
    >
    > about:config
    >
    > Set browser.xul.error_pages.enabled to false.


    You da man! Works like a charm. There's still blank space there but that's
    much prettier than all that "unable to connect garbage".

    Thank you!

    --
    -- I'm retired. I was tired yesterday. I'm tired again today --

  7. #7
    JD Guest

    Re: HOSTS file - Tacky messages

    Retired wrote:
    > JD<JD@example.invalid> wrote in
    > news:6vSdnZsl3IoZYjbTnZ2dnUVZ_hydnZ2d@posted.grandecom:
    >
    >> Retired wrote:
    >>> http://mewnlite.com/unable.jpg
    >>> There oughta be a way browsers could detect when a site is being
    >>> blocked by a hosts file and return a friendlier & much shorter
    >>> message!
    >>>

    >>
    >> This works for SeaMonkey, which uses Firefox as its browser:
    >>
    >> about:config
    >>
    >> Set browser.xul.error_pages.enabled to false.

    >
    > You da man! Works like a charm. There's still blank space there but that's
    > much prettier than all that "unable to connect garbage".
    >
    > Thank you!
    >


    You're welcome! Not my solution but always glad to share what I've
    learned from this and other newsgroups. And I really didn't like the
    unable to connect garbage. Kind of defeated the purpose of the HOSTS file.

    --
    JD..

  8. #8
    David H. Lipman Guest

    Re: HOSTS file - Tacky messages

    From: "JD" <JD@example.invalid>

    > Retired wrote:
    >> JD<JD@example.invalid> wrote in
    >> news:6vSdnZsl3IoZYjbTnZ2dnUVZ_hydnZ2d@posted.grandecom:
    >>
    >>> Retired wrote:
    >>>> http://mewnlite.com/unable.jpg
    >>>> There oughta be a way browsers could detect when a site is being
    >>>> blocked by a hosts file and return a friendlier & much shorter
    >>>> message!
    >>>>
    >>>
    >>> This works for SeaMonkey, which uses Firefox as its browser:
    >>>
    >>> about:config
    >>>
    >>> Set browser.xul.error_pages.enabled to false.

    >>
    >> You da man! Works like a charm. There's still blank space there but that's
    >> much prettier than all that "unable to connect garbage".
    >>
    >> Thank you!
    >>

    >
    > You're welcome! Not my solution but always glad to share what I've learned from this and
    > other newsgroups. And I really didn't like the unable to connect garbage. Kind of
    > defeated the purpose of the HOSTS file.
    >


    Kudos!



    --
    Dave
    Multi-AV Scanning Tool - http://multi-av.thespykiller.co.uk
    http://www.pctipp.ch/downloads/dl/35905.asp



  9. #9
    FromTheRafters Guest

    Re: HOSTS file - Tacky messages


    "JD" <JD@example.invalid> wrote in message
    news:MtqdnRs1mv5jWTHTnZ2dnUVZ_v2dnZ2d@posted.grandecom...
    > Retired wrote:
    >> JD<JD@example.invalid> wrote in
    >> news:6vSdnZsl3IoZYjbTnZ2dnUVZ_hydnZ2d@posted.grandecom:
    >>
    >>> Retired wrote:
    >>>> http://mewnlite.com/unable.jpg
    >>>> There oughta be a way browsers could detect when a site is being
    >>>> blocked by a hosts file and return a friendlier & much shorter
    >>>> message!
    >>>>
    >>>
    >>> This works for SeaMonkey, which uses Firefox as its browser:
    >>>
    >>> about:config
    >>>
    >>> Set browser.xul.error_pages.enabled to false.

    >>
    >> You da man! Works like a charm. There's still blank space there but that's
    >> much prettier than all that "unable to connect garbage".
    >>
    >> Thank you!
    >>

    >
    > You're welcome! Not my solution but always glad to share what I've learned
    > from this and other newsgroups. And I really didn't like the unable to connect
    > garbage. Kind of defeated the purpose of the HOSTS file.


    That is not the purpose of the HOSTS file anyway. It is being misused as
    a filter. Better would be an actual firewall.



  10. #10
    VanguardLH Guest

    Re: HOSTS file - Tacky messages

    FromTheRafters wrote:

    > "JD" <JD@example.invalid> wrote in message
    > news:MtqdnRs1mv5jWTHTnZ2dnUVZ_v2dnZ2d@posted.grandecom...
    >> Retired wrote:
    >>> JD<JD@example.invalid> wrote in
    >>> news:6vSdnZsl3IoZYjbTnZ2dnUVZ_hydnZ2d@posted.grandecom:
    >>>
    >>>> Retired wrote:
    >>>>> http://mewnlite.com/unable.jpg
    >>>>> There oughta be a way browsers could detect when a site is being
    >>>>> blocked by a hosts file and return a friendlier & much shorter
    >>>>> message!
    >>>>>
    >>>>
    >>>> This works for SeaMonkey, which uses Firefox as its browser:
    >>>>
    >>>> about:config
    >>>>
    >>>> Set browser.xul.error_pages.enabled to false.
    >>>
    >>> You da man! Works like a charm. There's still blank space there but that's
    >>> much prettier than all that "unable to connect garbage".
    >>>
    >>> Thank you!
    >>>

    >>
    >> You're welcome! Not my solution but always glad to share what I've learned
    >> from this and other newsgroups. And I really didn't like the unable to connect
    >> garbage. Kind of defeated the purpose of the HOSTS file.

    >
    > That is not the purpose of the HOSTS file anyway. It is being misused as
    > a filter. Better would be an actual firewall.


    The same attitude would mean a Swiss Army knife would have very limited
    functionality; i.e., it would only have as many uses as there were
    blades. What something was intended for doesn't necessarily limit what
    it can be used for. The use of the hosts file as a content blocker
    from a host (not a set of hosts at a domain) has been long established
    for over a decade. Not only can it be used as a content blocker (ads,
    intellitext), it can be used to block known malicious sites (those that
    distribute malware or infected software, or attempt drive-by downloads).

    For example, I can add entries to the hosts file to block a program
    from phoning home to check for updates (which I don't want because
    they cost money and I'm satisfied with the old, already-paid-for
    version). The product is a media stream capture program so it obviously
    has to make Internet connections to capture those streams so I cannot
    create a blanket rule to block all network connections by the product,
    but I can keep it from phoning home. It also makes an IP-only connection
    to phone home, which I block using IPSec. That's all built into
    Windows so I don't need to install or rely on 3rd party software which
    is what you suggest. In a similar vein, I use SpywareBlaster (free
    non-resident version) to update the registry to add class IDs with their
    killbit set to prevent known malware from getting loaded by registry
    reference. Nothing of SpywareBlaster has to run after the killbit
    update because this is a feature already built into Windows. I see no
    reason to install 3rd party software, keep it updated, and incur
    software conflicts, behavior anomalies, or usage limitations on my
    host when there already exist inbuilt
    solutions in Windows. I also use SRPs (software restriction policies)
    already built into Windows instead of having to install 3rd party
    HIPS-enabled security products to define the same rules inside of them
    that I can do using SRPs.
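
    For the hosts file half of that, the entries are ordinary lines, and
    adding them can be scripted too. A sketch in Python (the hostnames are
    hypothetical; the real ones have to be found by watching the program's
    traffic, and writing the file needs admin rights):

      # Append loopback entries for the hosts a program phones home to.
      HOSTS = r"C:\Windows\System32\drivers\etc\hosts"
      PHONE_HOME_HOSTS = ["updates.example-vendor.com",
                          "licensing.example-vendor.com"]

      with open(HOSTS, "a") as f:
          for name in PHONE_HOME_HOSTS:
              f.write("127.0.0.1\t{}\n".format(name))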

    Alas, not all firewall products include a URL blocking feature so what
    you mention may not be available. Not many users considering or already
    using the hosts file are operating their own gateway host or firewall
    host/appliance that could do URL blocking. Of course, URL blocking only
    works on hostnames since the idea is to intercept the DNS lookups. A
    user specifying an IP address would circumvent any URL blocking. Yes,
    you can incorporate IP blocking, too, but IPs change quicker than
    hostnames as sites, webhosters, or ISPs move their hosts around their
    networks or choose to use load balancing. I've found IP address
    blocking requires more maintenance (updates) than URL blocking but many
    security products only give you IP blocking.

    Personally I find the hosts file to be cumbersome but then few users of
    pre-compiled hosts files review their content. Users of pre-compiled
    hosts files let someone else decide what should get blocked to their
    host. For example, it takes over 50 entries for Doubleclick in the MVPs
    hosts file. It is also possible that a domain's nameserver will accept
    any hostname and resolve it to an IP address (a wildcard record), so
    whatever host you ask it to look up returns an IP address. That means
    there is effectively an infinite number of hostnames available to that
    domain, and no hosts file will cover them all (even attempting it would
    slow down local DNS lookups). The hosts file works on hosts, not
    domains.

    If you use a firewall, AV, or other security product that provides URL
    blocking, you can block entire domains. For example, I can use Avast's
    URL blocking to get rid of ANY hosts at a domain, not just a few;
    however, Avast hasn't the foresight to include import/export
    functionality to allow exporting a backup of the list of URL block
    strings so a subsequent fresh install can have the list imported. The
    free version of Online Armor won't let you export/import its settings.
    So while they have URL blocking, you'll lose your own list of blockings
    that you compiled over time and with experience.

    Alternatively, instead of local URL blocking, you can have it done by
    your DNS provider. OpenDNS, for example, will let you select categories
    of sites to block. Their free account only lets you define up to 50 URL
    block strings (though you could specify domains instead of hosts), but
    for most users compiling their own block list this is sufficient.
    To force all your users to use OpenDNS (or some other specific DNS
    provider where filtering is available), you would have to block any port
    53 traffic out of your router, gateway host, or firewall host/appliance
    that doesn't target OpenDNS' DNS server.

    The OP apparently only cares about URL blocking (mostly its effect of
    displaying a message in the area of the web page for the blocked
    content) in the context of a web browser. While Firefox can do ad
    blocking using an extension and the config option eliminates the error
    text for the blocked content, IE8 can also block sites or domains using
    its InPrivate Filtering feature (but you have to compile and import the
    .xml file along with a registry edit to have IE8 always load with
    InPrivate Filtering enabled). Since I build that .xml file, I know
    what's getting blocked, I can easily disable the blocking (by
    temporarily disabling InPrivate Filtering in IE8), and I'm not concerned
    about whether the URL blocking strings can be exported from a security
    product so they can be imported later into a fresh install.
