• What On Earth Is CGI?

    Date: 2011.02.24 | Category: WebDesign | Response: 0

    Let’s unlock a little bit of the mystery about something called CGI. If it helps any, CGI means Common Gateway Interface.

    This is the method used to swap data between the web server (the hardware and software that actually allows you to get to your web site) and a web client (your browser). CGI is really a set of standards under which the server can hand request data to a program or script (a series of commands), which processes that data and sends its results back through the web server.

    Typically, you use standard HTML tags to get data from a person, then pass that data to a CGI routine. The CGI routine then performs some action with the data.

    Some of the more common uses of CGI include:

    Guest books – The CGI routine is responsible for accepting the data, ensuring it is valid, sending an email acknowledgement back to the writer, perhaps sending an email to the webmaster, and creating the guest book entry itself.

    Email Forms – A simple CGI forms routine just formats the data into an email and sends it back to the webmaster. More complicated routines can maintain a database, send an acknowledgement and validate data.

    Mailing List Maintenance – These routines allow visitors to subscribe and unsubscribe from a mailing list. In this case, the CGI routine maintains a database of email addresses, and the better ones send acknowledgements back to the visitor and webmaster.

    A CGI routine can be written in anything which understands the CGI standard. A popular CGI language is Perl, which is simple to understand and use (well, compared to other languages). Perl is a scripting language, which means each time a Perl routine is executed the interpreter must examine the Perl commands to determine what to do. In contrast, a routine written in a compiled language such as C++ or Visual Basic has already been translated into a directly executable program, which is faster and more efficient.

    Okay, in a nutshell (and greatly simplified), here’s how it works:

    1) You (the webmaster) create a form tag which includes the name of the CGI routine.

    2) You create HTML input tags which retrieve data from your visitors.

    3) Each of the input tags includes a variable name. The data retrieved from the visitor (or set directly, if the tag includes the “hidden” qualifier) is placed in that variable.

    4) When the visitor presses the “submit” button, the CGI routine specified in the form tag is executed. At this time, the CGI routine “takes control”, meaning the browser essentially waits for it to complete.

    5) The CGI routine reads the data from those variables and performs whatever action is required.

    6) When the CGI routine finishes, it sends its output back and returns control to the browser.
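
    To make those six steps concrete, here is a minimal sketch using PHP as the CGI language (any language that understands the standard would do); the file name, field names and email address are all made up for illustration:

    <?php
    // feedback.php - a made-up CGI-style routine covering steps 4 to 6.
    // When the form below is submitted, this code "takes control":
    // it reads the form variables, acts on them, then answers the browser.
    if ($_SERVER['REQUEST_METHOD'] == 'POST') {
        $email = isset($_POST['visitor_email']) ? trim($_POST['visitor_email']) : '';
        if (filter_var($email, FILTER_VALIDATE_EMAIL)) {
            // Do whatever action is required - here, mail the webmaster.
            mail('webmaster@example.com', 'New form entry', "From: $email");
            echo 'Thank you for your submission!';
        } else {
            echo 'Please go back and enter a valid email address.';
        }
        exit; // step 6: control returns to the browser with this response
    }
    ?>
    <!-- Steps 1 to 3: the form tag names the routine, each input names a variable -->
    <form action="feedback.php" method="post">
    <input type="text" name="visitor_email">
    <input type="hidden" name="site_id" value="demo">
    <input type="submit" value="Submit">
    </form>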

    Some important things to remember about CGI routines:

    You can install CGI routines on your own site if your host allows it; http://www.webair.com is an example of a web host which allows CGI routines. Some web hosts do not allow you to install your own routines but do provide some pre-written ones for you. If these are not sufficient for your needs, you can find a remote hosting service to provide the necessary functions.

    Generally, if you install your own routines they must be installed in the cgi-bin directory of your site. This is a special location which allows scripts and programs to be executed.

    CGI routines work best on Apache-style servers. Windows NT and Windows 2000 do support CGI, but it tends to be slow and problematic.

    If you use a remote hosting service, you must remember that although they appear to be giving you this for free, you are actually paying a price. Usually they want to display advertisements, and some of them actually take visitors away from your site.

    When you write a CGI routine, you have the choice of a scripting language like Perl or a compiled language such as C++ or Visual Basic. Anything which can execute on the web server is acceptable.

    I hope this short introduction to CGI has cleared up some of the mystery.

    Article written by Lee

  • Redirecting Questionable Adult Traffic

    Date: 2011.02.24 | Category: 2257, Scripts | Response: 0

    The one thing almost all reputable adult webmasters agree on is that, one way or another, we want to rid the net of those webmasters who profit from traffic gained by directly promoting, or targeting keywords relating to, child pornography. How can you tell what traffic you are being sent, though, and, more importantly, how can you filter this unwanted traffic out of what reaches your site? The answer is simple: use a script to redirect the traffic elsewhere before it even hits your site.

    Child Porn Redirection PHP Script.

    The following PHP script, when used on your server, will enable you to turn away unwanted traffic arriving through ‘illegal’ keywords in the search engines and through sites which link to your own.

    <!-- Start Copy Here -->

    <?php

    // Redirect "Lolita" traffic back to wherever it came from.
    // The old $HTTP_REFERER / $PATH_INFO globals only exist with
    // register_globals switched on, so we read $_SERVER instead.
    $referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
    $path    = isset($_SERVER['PATH_INFO'])    ? $_SERVER['PATH_INFO']    : '';
    $refer_full_path = $referer . $path;

    if (preg_match('/lolita|child|preteen|pre-teen|pedo|underage|beast|rape|kinder|incest|kiddie/i',
                   $refer_full_path)) {
        header("Location: $refer_full_path");
        exit;
    }

    ?>

    <?php
    // Send traffic arriving on these keywords to the FBI instead.
    // eregi() is deprecated; stripos() makes the same case-insensitive match.
    $words = array('childporn', 'underage', 'beast', 'interracial', 'lolita', 'preteen');
    $referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
    foreach ($words as $word) {
        if (stripos($referer, $word) !== false) {
            header('Location: http://www.fbi.gov/?CHILD_PORN_ON_DISK_LOGGED_AND_REPORTED');
            exit;
        }
    }
    ?>

    <!-- End Copy Here -->

    In order to use this script, simply add any additional keywords or partial word matches to the pattern in the top part of the script, then place the script at the very top of your page, before any HTML output (a header() redirect only works if nothing has been sent to the browser yet).

    Any traffic sent to your site via keywords which you have specifically told the script not to allow will be forwarded to the URL in the bottom part of the PHP script which, again, can be changed to wherever you like.

    Article written by Lee

  • Pay Per Click Or Search Engine Optimization

    Date: 2011.02.24 | Category: Paid Traffic, Search Engine Optimization | Response: 0

    Which would you choose to run your business on, Pay Per Click or Search Engine Optimization? Each has its own benefits and drawbacks.

    PPC Or SEO – The Breakdown.

    Pay Per Click or, as it is most often referred to, PPC looks to the novice to be the better option for ‘immediate’ traffic results: you enter your desired keywords, place your minimum / maximum bid amount and you are set for top PPC engine listings for as long as you can maintain the balance in your engine account.

    Search Engine Optimization or, SEO, on the other hand, is the more traditional way of attaining high search engine rankings. Either you or an SEO expert optimizes your website’s pages and various other elements of your site and, hopefully, within a month or two you achieve high rankings in the major search engines.

    So Which One? PPC or SEO?

    Generally speaking, SEO work is more cost effective for your business than using PPC listings to gain your traffic. You could pay an SEO expert anywhere from $500 upwards to optimize your site and hold high rankings indefinitely or, you could put that same $500 into a PPC engine account and get high rankings only until your account balance runs dry.

    But let’s look at this in terms of actual traffic…

    Say you get 1000 visitors to your SEO-based website, which you paid $500 for: each visitor has cost you $0.50. Now let’s say your site remains at the top of the engines for a few months, perhaps even years, and each month you receive another 1000 visitors. You have cut the cost of each surfer hitting your site to just a few cents, and the figure keeps falling for as long as the rankings last (not taking into account bandwidth costs, obviously).

    Now, on the other hand, say you want to attract those 1000 visitors from your chosen keywords via the PPC engines. Most Pay Per Click search engines have a minimum bid amount of $0.05 per hit, so right away your $500 could buy a potential 10k hits in your first month. However, as most of you who have already tried your hand at the PPC engines will know, getting 10k hits on one or more keywords at a cost of $0.05 each is hard to do; in fact, some would say almost impossible. Nonetheless, let us keep going with this minimum bid amount for the time being.

    Immediately, you can see that the PPC route caps the traffic you can receive at those 10k hits. This isn’t the case with the SEO traffic: you could potentially hit the top spot for your chosen keyword and stay there until another site out-optimizes you or your site needs optimizing again.
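
    As a rough sketch of that arithmetic, here are a few lines of PHP using only the illustrative figures above (a $500 budget, 1000 SEO visitors per month and the $0.05 minimum bid; none of these are real market rates):

    <?php
    // Hypothetical figures from the article - not real market rates.
    $budget = 500.00;               // dollars spent either way
    $seo_visitors_per_month = 1000; // steady traffic while rankings hold

    // SEO: a one-off fee, so cost per visitor falls every month the rankings last.
    for ($month = 1; $month <= 12; $month++) {
        $cost = $budget / ($seo_visitors_per_month * $month);
        printf("Month %2d: SEO cost per visitor = $%.3f\n", $month, $cost);
    }

    // PPC: the same budget buys a fixed pool of clicks, then the traffic stops.
    $bid = 0.05;
    printf("PPC clicks for the same budget: %d\n", $budget / $bid);
    ?>

    Month one works out at $0.50 per visitor either way; by month twelve the SEO figure is down to roughly four cents per visitor, while the PPC pool of 10,000 clicks is long spent.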

    Ultimately, the reason you choose one over the other will come down to how you want to generate traffic. PPC will give you almost instantaneous targeted traffic from the second you open your PPC account up until the point your account funds empty whilst SEO work will give you long-term targeted traffic and, in most instances, this SEO traffic can last for years, making the cost of the initial SEO work minimal.

    In Closing.

    Search Engine Optimization can last you years and years whilst Pay Per Click results can diminish in a relatively short amount of time, depending on the bids needed to achieve top listings.

    However, let’s look at a third option: using PPC and SEO results in conjunction with each other. Running both minimizes the traffic you lose from either side and affords you the time to see what works with your search engine optimized sites whilst you experiment with keyword targeting on your PPC campaigns. Once you have both types of search engine traffic figured out, you can put them together and use them to increase the traffic to your site for years to come.

    Article written by Lee

  • Domain Name Extensions + SEO

    Date: 2011.02.22 | Category: Search Engine Optimization, WebDesign | Response: 0

    With a slew of new TLD (Top Level Domain) extensions being launched in the first quarter of 2004, in addition to the hundreds already available, how many of us actually take the time to consider, when we register a .com, .net or .org domain, how that choice will help or hinder our search engine work? This is what we will take a closer look at in this article.

    Domain Name Extensions – What Are They?

    First of all, before we look at how the extensions of our domain names can assist our search engine optimization methods, we need to understand what the TLDs themselves are actually for. Domain name extensions are essentially a way to recognize specific uses and locales through the domain name itself. For example, the .com extension was primarily set up for commercial domain names; however, through that commercial use it has also become, without a doubt, the most popular extension for individuals or companies registering new domain names. In addition to the generic TLDs there is also a selection of other extensions, ranging from industry-specific ones such as .aero to country-specific ones such as .co.uk.

    Domain Name Extensions And Search Engines.

    Now that we understand what domain name extensions were put in place for, we can start to look at how they may benefit us in terms of SEO (search engine optimization). For example, head across to http://www.google.com and do a search on something such as ‘penis pills’. You can see from the results shown (01/01/04) that the first 10 results are evenly spread across a range of domain name extensions, from .com to .net and even some smaller .go.ro domains. This would lead us to assume that, at the current time, Google specifically is not paying too much attention to the extensions of the domain names we are using. That said, the Google updates of the last month or two have cleaned up a lot of the results that were present 2 months prior to this search, when the .biz extension was heavily represented in the rankings.

    Domain Name Extension Abuse + Spam.

    With this slew of new domain name extensions being launched what seems like yearly, a whole new set of problems opens up for the webmaster, primarily that of domain name spam. Because domain names can be registered for as little as $5 per year, many webmasters have taken to purchasing them, using them to spam the search engines and then, once the search engines discover the spam and remove the offending domains, moving on to new domains, in effect making domain names a disposable commodity. Whilst this method will certainly garner traffic for the search engine spammer, it also means that the traffic honest webmasters receive from the search engines will be lower.

    Domain Name Extensions And Optimization.

    Hopefully this brief article has given you a little insight into how domain name extensions can both benefit and harm your business. By choosing your domain name extensions carefully and making sure you do your best not to spam the search engines, you can make some serious income from pure search engine traffic. However, once you start to buy domain names with lesser-known extensions purely to spam the search engines, you are not only wasting your own money but potentially wasting other hard-working webmasters’ money too.

    Article written by Lee

  • Calling Complete JavaScripts With One Line Of HTML Code

    Date: 2011.02.21 | Category: WebDesign | Response: 0

    We all know that some JavaScript coding can be excessive, adding considerable amounts of extra code to our pages and making them both slower and less search engine friendly. But what if I told you there was a way to call the EXACT same JavaScript functions you are calling now using a single line of code and, in the process, you would NEVER have to hunt for a specific piece of JavaScript again, because you would instantly know where it was saved?

    You would probably think setting this up is very time consuming but, it actually takes no time at all.

    The secret is to copy and paste the JavaScript into a plain text file and save it on your server. However, instead of saving it with a .txt extension, save the file with a .js extension and, to make it easier still to find, place it in a directory called /JavaScript/ when you upload it to your server.
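
    For example, a pop-up console file saved as /JavaScript/popup.js might contain nothing more than the following (the function name, URL and window size are all made up for illustration):

    // popup.js - an example of a saved JavaScript file (all names made up).
    // Opens a small pop-up console window when called.
    function openConsole() {
        window.open('http://www.mydomain.com/console.html', 'console',
                    'width=300,height=250,toolbar=no,scrollbars=no');
    }

    Once the file is included on a page (as shown below), any onload or onclick handler on that page can simply call openConsole().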

    That’s pretty much it, easy huh?

    Now comes the part of actually getting the JavaScript to work on your web pages. Instead of the complete JavaScript coding you would usually use to, say, make a pop-up console appear, you place the following single line between your <head> and </head> tags:

    <script language="JavaScript" src="../JavaScript/popup.js"></script>

    That will now call the popup.js file (or whatever you named it) from your server and make the JavaScript function work for surfers with JavaScript enabled in their browser.

    As you can see, not only does using this method save you a considerable amount of time but, it will also make your pages more search engine friendly, as spiders will not have to work their way through a lot of heavy JavaScript before they get to your SE content.

  • Cross Selling – Offer Your Surfers An Alternative

    Date: 2011.02.23 | Category: Promotion | Response: 0

    As webmasters, our primary goal is to make money from our surfers in order to be successful in business. However, to make money from our surfers we need to constantly change the way in which we sell products and services to them.

    One such method of selling products to surfers is cross selling or, as it is sometimes referred to in the industry, up selling. So what exactly is cross selling and, more importantly, what benefit does it give your business model? This is what we shall take a look at in this article.

    Cross Selling – The Basics.

    Cross selling is a method used by marketers to make the maximum amount of money out of a single sales lead at any given time. Regardless of whether you are working online or in a bricks and mortar store, cross selling is something that almost every salesperson does, from offering an extended warranty on a new television purchase to offering an affiliate product from inside your site’s secure area.

    Cross Selling – How To.

    One of the easiest ways to initiate the cross selling process is simply to provide your site visitors with a selection of links going to a multitude of individual products. These may be for a dating site, a software package or something else entirely. Either way, by placing a selection of links on your main selling pages you have already initiated the cross selling process and, by doing so, have increased your chance of making a sale.

    Cross Selling – New Customers.

    One method of cross selling that is becoming more and more prevalent on the internet is that of cross selling, or up-selling, new customers immediately after they have made a purchase. This in itself is not a bad thing; after all, you know they have money to spend, so why not use this fact to garner additional sales?

    One good way of offering cross sales to new customers is to give them an option on the actual shopping cart page immediately when they hit it. This may be for a site that complements the product they are already purchasing or for something completely different. Either way, by having this waiting for your customer on the order page, you might just make additional money from them.

    Cross Selling – Old Customers.

    So what happens when a customer leaves your site after making a purchase? Well, there are two routes you can take with these customers. One is to leave them alone and hope that they return to buy from your site again. The other, my personal preference, is to send these customers a regular follow-up letter offering them a product they may be interested in based on their initial purchase. After all, if they purchased a copy of Adobe Photoshop, the chances are they may also be interested in something such as Paint Shop Pro; why not give the surfer this option by placing a link in the follow-up letter taking them to that specific area of your site?

    Cross Selling – Overview.

    When all is said and done, cross selling, whether you agree with it or not, is a very powerful sales tool. Not only can it increase the revenue potential from new clients but it can also give any business a method of gaining further income from older customers. The one thing that you should all be doing on a regular basis is ensuring that you have a good cross selling strategy in place, regardless of whether you are selling memberships to sites, warranties on televisions or mouse pads for new PC owners.

    Article written by Lee

  • Obscenity – Put It To The Test

    Date: 2011.02.23 | Category: Writing | Response: 0

    Regardless of how long any of us have been adult webmasters, we all need to be aware of obscenity laws and, in particular, how they affect our businesses. Whether or not we think a hardcore photoset is ‘obscene’, if you get taken to court on obscenity charges the one thing you should be aware of is how the courts will decide whether the images you are using are classified as obscene.

    Testing Obscenity – The Miller Test.

    The Miller test was developed in the 1973 court case of Miller vs. California and it comprises three parts, ALL of which MUST be satisfied in order for something to be deemed obscene by the courts. The Miller test is the ‘official’ method used by the United States Supreme Court for determining whether an expression or a speech can be ruled obscene and, if deemed obscene, it is not protected under the First Amendment and can therefore be prohibited by law.

    The Miller Test – Part One.

    Part one of The Miller Test states something may be obscene if ‘the average person, applying contemporary adult community standards, would find that the work, taken as a whole, appeals to the prurient interest’. In essence, this means that if the ‘average’ person on a jury or on the bench finds the work obscene then, on this part, it is. However, for the court to rule something obscene it also has to be deemed obscene by the standards set in part two and part three below.

    The Miller Test – Part Two.

    Part two of The Miller Test states that something is potentially obscene if ‘the average person, applying contemporary adult community standards, would find that the work depicts or describes, in a patently offensive way, sexual conduct’. Basically this is saying that if the image or speech shows something which is not practiced in a manner befitting your local community standards then, again, it may be obscene. However, as with part one of The Miller Test, for a court to find something obscene it also needs to fail the standard in part three below.

    The Miller Test – Part Three.

    Part three of The Miller Test states that something is potentially obscene if ‘a reasonable person would find that the work, taken as a whole, lacks serious literary, artistic, political, or scientific value’. This is pretty much where you could come unstuck: everyone has different sexual tastes and, because of this, even a fairly widespread practice such as ‘bare backing’ (to use an example) could be considered obscene if you happen to have a jury who are devout practitioners of safe sex.

    The Miller Test – Overview.

    In essence, The Miller Test is a useful guideline for webmasters when it comes to operating our sites and, specifically, when it comes to choosing the types of content we use on them. For the most part, though, The Miller Test itself is outdated. Since the early 70s, when the test was devised, many sexual practices that were once deemed obscene have become part of everyday life and been accepted into society as a whole. Thus, what once would (or could) have been deemed obscene may no longer be, in the same sense that something deemed obscene today could be found not to be in 5 years’ time.

    Article written by Lee

  • Building A Surfer Trap – Stage 5

    Date: 2011.02.21 | Category: Traffic | Response: 0

    So we hit stage 5 in this surfer trap tutorial.

    It was brought to my attention this morning that we never added any ALT tags to our single site FPA links so, in a change to the planned tutorial, I am going to touch on this stage as, once the search engines get to our surfer traps, this is going to be a crucial aspect of how highly we get ranked.

    So what’s next?

    Ok, now what you have to do is go back to manually editing the FPAs (all of them!)

    What you need to do is this…

    Take the Multi-Site FPA first then, on ALL of the links that lead to the single site FPAs, add the descriptive tag in the same way as we did originally. However, instead of using the tag on the images we will use it on the actual TEXT of the link (on a plain text link the equivalent of the image ALT tag is the TITLE attribute, so that is the one we will add). So, for example, the link which may be:

    ‘Voyeur Porn’, leading to the FPA you have for the Voyeur niche, will already look like this in the HTML coding:

    <a href="http://www.mydomain.com">Voyeur Porn</a>

    Will get turned into:

    <a href="http://www.mydomain.com" title="More Niche Related Keywords">Voyeur Porn</a>

    The reason we are going back over these links now and not earlier on is that you should hopefully have started to get a small amount of traffic from your counter impressions. These counters are virtually ALWAYS being crawled by the search engines due to the number of people linking to them so, by optimizing our site at this stage, we have less work to do to get into the search engines.

    One other thing that we can now start to do (as we did a couple of stages back) is create some more HTML pages with tables on them; however, these will be HTML pages on their own with no images. Again, you should make the tables 4 columns across and two rows high.

    What you want to add into these tables are NICHE links. For example, taking the TEEN niche we would make eight links like:

    Teen Sex
    College Girls
    Erotic Teens
    Teen Porn
    Etc
    Etc
    Etc
    Etc…

    You should do this for each of the MAIN niches, so you would have a table for Teen, Gay, Mature, Asian, Ebony and Fetish and one for General and, again, these should link to the NICHE FPAs that you have already created (a quick sketch of one such table follows).
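
    If it helps to see how one of these tables goes together, here is a small PHP sketch that builds a 4-column, two-row table from a list of link texts; the URLs and anchor text are placeholders for your own FPA links, and a hand-coded HTML table does the job just as well:

    <?php
    // Hypothetical sketch - the URLs and anchor text are placeholders.
    $links = array(
        'Teen Sex'      => 'http://www.mydomain.com/teen.html',
        'College Girls' => 'http://www.mydomain.com/college.html',
        'Erotic Teens'  => 'http://www.mydomain.com/erotic.html',
        'Teen Porn'     => 'http://www.mydomain.com/teenporn.html',
        // add four more entries here to complete the second row
    );

    echo "<table>\n";
    $cell = 0;
    foreach ($links as $text => $url) {
        if ($cell % 4 == 0) echo "<tr>\n";   // open a new row every four cells
        echo '<td><a href="' . $url . '" title="More Niche Related Keywords">' . $text . "</a></td>\n";
        $cell++;
        if ($cell % 4 == 0) echo "</tr>\n";  // close the row after four cells
    }
    if ($cell % 4 != 0) echo "</tr>\n";      // close a part-filled final row
    echo "</table>\n";
    ?>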

    These tables will be used for another console on our surfer trap; however, before we implement that console, we are going to have some fun with them.

    Article written by Lee

  • Redirecting To A Different Page Using JavaScript

    Date: 2011.02.24 | Category: Scripts | Response: 0

    There are times when a simple JavaScript redirection from one page to another can come in handy and the following JavaScript will enable you to do this.

    When a page contains this JavaScript, it will redirect to the page you specify in the “window.location =”. You can change the delay before the redirect by changing the 1000 in ‘move()’,1000; the value is in milliseconds, so multiply the number of seconds you’d like by 1000.

    Example:

    1000 = 1 second
    2000 = 2 seconds
    3000 = 3 seconds

    Place this JavaScript code between the <head> and </head> tags

    <script language="JavaScript">
    <!-- hide from old browsers
    var timer = null;
    function move() {
        window.location = 'http://www.yourdomain.com';
    }
    // -->
    </script>

    Place this JavaScript code in your <body> tag

    <body onload="timer=setTimeout('move()',1000)">

    You should now have a page that redirects to the new URL when it loads in the surfer’s browser window.

    Article written by Lee

  • Straight From The Horse’s Mouth – Get Googlized

    Date: 2011.02.24 | Category: Search Engine Optimization | Response: 0

    Many webmasters wonder how to ensure their sites will be included in Google’s index of web sites. Although Google crawls more than a billion pages, it’s inevitable some sites will be missed. When Google does miss a site, it’s frequently for one of these reasons:

    * The site is not well connected through multiple links to others on the web.
    * The site launched after Google’s last crawl was completed.
    * The design of the site makes it difficult for Google to effectively crawl its content (excessive frames, tables, etc).

    Google’s intent is to represent the content of the Internet fairly and accurately. To help make that goal a reality, we offer this guide to building a “crawler-friendly” site. There are no guarantees a site will be found by our crawler, but following these guidelines should increase the probability that your site will show up in Google search results.

    Do…
    Provide high-quality content on your page – especially your home page.
    If you follow only one tip from this page, this should be it. Our crawler indexes web pages by analyzing the content of the pages themselves. Google will index your site better if your pages contain useful information. Plus, your site has a better chance of becoming a favorite among web surfers and being linked to by others if the information it contains is relevant and useful.

    Submit your site to the appropriate category in a web directory.
    Listing your site in the Open Directory Project http://www.dmoz.org/ or Yahoo! http://www.yahoo.com/ increases the likelihood it will be seen by robot crawlers and web surfers.

    Pay attention to HTML conventions.

    Make sure that your <TITLE> and <ALT> tags are accurate and descriptive. Also, check your <A HREF> tags for errors since broken or improperly formatted links can prevent Google from indexing your page.

    Make use of the robots.txt file on your web server.
    This file tells crawlers which directories can or cannot be crawled. Make sure it’s current for your site so that you don’t accidentally block our crawler. Visit: http://www.robotstxt.org/wc/faq.html for a FAQ answering questions regarding robots and how to control them once they visit your site.
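
    As a quick illustration, a minimal robots.txt sits in the root of your web space and might look like this (the directory names here are only placeholders):

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /print/

    The User-agent line says which crawlers the rules apply to (* means all of them) and each Disallow line names a path they should skip; anything not listed stays crawlable.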

    Ensure that your site is accessible through HTML hyperlinks.
    Generally, your site is crawlable if the pages are connected to each other with ordinary HTML links. If certain areas are not linked, you may be excluding older browsers, differently-abled users, and Google. Google can crawl content from a database or other dynamically generated content as long as it can be found by following links. If you have many unlinked pages, you may want to create a jump page from which the crawler can find all of your pages.

    Build your site with a logical link structure.
    A hierarchical link structure is not only beneficial to you, but also to Google. More of your site can be crawled if it is laid out with a clear architecture.

    Don’t…
    Fill your page with lists of keywords, attempt to “cloak” pages, or put up “crawler only” pages.
    If your site contains pages, links or text that you do not intend visitors to see, Google considers them deceptive and may ignore your site.

    Feel obligated to purchase a search optimization service.
    Some companies “guarantee” your site a place near the top of a results page. While legitimate consulting firms can improve your site’s flow and content, others employ deceptive tactics to try to fool search engines. Be careful: if your domain is affiliated with one of these services, it could be permanently banned from our index. We have found search engine optimization software like Web Position Gold works best but, again, use it in moderation.

    Use images to display important names, content or links.
    Our crawler does not recognize text contained in graphics.
    Use ALT tags if the main content and keywords on your page cannot be formatted in regular HTML.

    Provide multiple copies of a page under different URLs.
    Many sites offer text-only or printer-friendly versions of pages that contain the same content as the graphic-enriched version of the page. While Google crawls these pages, duplicates are removed from our index. In order to ensure that we have the desired version of your page, place the other versions in separate directories and use the robots.txt file to block our crawler.

    Article written by a Google employee
