• Using SSI For Auto Updates

    Date: 2011.02.24 | Category: WebDesign | Response: 0

    We all realize the benefits of being able to save time when building sites, so I got to thinking: how can I make my sites look as if they are continually updated, without the need to go in and update them manually? Enter the world of SSI.

    SSI is actually a nifty little tool: not only can you include files from a central location, but you can also include them at specific times of the day, days of the week or even months of the year. Very handy indeed if you are building any type of site that needs updating periodically.

    Once the main burst of work has been completed you can pretty much use the same files over and over again to help you out.

    So, on to the auto-updating SSI. The following SSI code will enable you to update a page or pages based on which day of the month it is. It will check the day the page has been accessed and display the relevant information. This is a handy thing to have should your sponsor be running a promotion over several days: all you need to do is update a selection of SSI files and all of your sites are updated instantly.

    <!--#config timefmt="%d" -->
    <!--#include virtual="/yourdirectory/$DATE_LOCAL.txt" -->

    What you need to do is create 31 text files named 01.txt right the way through to 31.txt, then take the SSI call above and edit the location of the SSI files on your server. You may like to have a central folder named /SSI/ for this purpose, in which case the path would be changed to /SSI/$DATE_LOCAL.txt (paths in an include virtual are relative to your document root, so the domain name itself is not part of the path).

    In the 31 files you created you could have a table ad with eight of your sponsors' links, an article in each one or even just a simple text link; anything that you may want to update can be included in these files.

    As I mentioned above, you can base the rotation on the time, date or even month, whatever you like. To alter how the files are rotated, and ultimately viewed on the web, change the %d in the timefmt field to one of the following (a worked weekday example follows the list):

    %d : Day of the month; requires 31 files named 01.txt to 31.txt
    %w : Day of the week; requires 7 files named 0.txt to 6.txt
    %j : Day of the year; requires 366 files named 001.txt to 366.txt
    %U : Week of the year; requires 54 files named 00.txt to 53.txt
    %m : Month of the year; requires 12 files named 01.txt to 12.txt
    %H : Hour of the day; requires 24 files named 00.txt to 23.txt
    %M : Minute of the hour; requires 60 files named 00.txt to 59.txt
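
    For instance, a day-of-the-week rotation (a sketch assuming the central /SSI/ folder suggested above and seven files named 0.txt through 6.txt) would look like this:

    <!--#config timefmt="%w" -->
    <!--#include virtual="/SSI/$DATE_LOCAL.txt" -->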

    As you can see from the above, there really are no limitations to the uses of updating via SSI. Apart from the relative ease of use and the time saved, should one sponsor not be converting for you, all you need to do to swap sponsors is alter your central set of SSI files, and you have instantly changed sponsors over all of your sites.

    Article written by Lee

  • What On Earth Is CGI?

    Date: 2011.02.24 | Category: WebDesign | Response: 0

    Let’s unlock a little bit of the mystery about something called CGI. If it helps any, CGI means Common Gateway Interface.

    This is a method which is used to swap data between the server (the hardware and software that actually allows you to get to your web site) and a web client (your browser). CGI is actually a set of standards where a program or script (a series of commands) can send data back to the web server where it can be processed.

    Typically, you use standard HTML tags to get data from a person, then pass that data to a CGI routine. The CGI routine then performs some action with the data.

    Some of the more common uses of CGI include:

    Guest books – The CGI routine is responsible for accepting the data, ensuring it is valid, sending an email acknowledgement back to the writer, perhaps sending an email to the webmaster, and creating the guest book entry itself.

    Email Forms – A simple CGI forms routine just formats the data into an email and sends it back to the webmaster. More complicated routines can maintain a database, send an acknowledgement and validate data.

    Mailing List Maintenance – These routines allow visitors to subscribe and unsubscribe from a mailing list. In this case, the CGI routine maintains a database of email addresses, and the better ones send acknowledgements back to the visitor and webmaster.

    A CGI routine can be anything which understands the CGI standard. A popular CGI language is called PERL, which is simple to understand and use (well, compared to other languages). PERL is a scripting language, which means each time a PERL routine is executed the web server must examine the PERL commands to determine what to do. In contrast, a compiled language such as C++ or Visual Basic can be directly executed, which is faster and more efficient.

    Okay, in a nutshell (and greatly simplified), here’s how it works:

    1) You (the webmaster) specify a form tag which includes the name of the CGI routine.

    2) You create HTML tags which retrieve data from your visitors.

    3) Each of the input tags includes a variable name. The data which is retrieved from the visitor (or directly set if the tag includes the "hidden" qualifier) is stored under that variable name.

    4) When the visitor presses the “submit” button, the CGI routine which was specified in the form tag is executed. At this time, the CGI routine “takes control”, meaning the browser essentially is waiting for it to complete.

    5) This CGI routine can get data from variable names. It retrieves the data and does whatever action is required.

    6) When the CGI routine finishes, it returns control back to the browser.
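
    To make the six steps concrete, here is a minimal sketch of both halves. The script name (hello.pl), the field name and the greeting are purely illustrative, and the Perl side assumes the standard CGI.pm module, which most CGI-enabled hosts provide:

    <FORM METHOD="post" ACTION="/cgi-bin/hello.pl">
    Your name: <INPUT TYPE="text" NAME="visitor_name">
    <INPUT TYPE="submit" VALUE="Submit">
    </FORM>

    #!/usr/bin/perl
    # hello.pl - a minimal CGI sketch: read the form data, send a page back
    use strict;
    use CGI;

    my $q = CGI->new;                      # parse the submitted form data
    my $name = $q->param('visitor_name');  # value of the matching input tag
    print $q->header('text/html');         # the Content-type header must come first
    print "<HTML><BODY>Hello, $name!</BODY></HTML>";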

    Some important things to remember about CGI routines:

    You can install CGI routines on your own site if your host allows it; http://www.webair.com is an example of a web host which allows CGI routines. Some web hosts do not allow you to install your own routines but do provide some pre-written ones for you. If these are not sufficient for your needs, you can find a remote hosting service to provide the necessary functions.

    Generally, if you install your own routines they must be installed in the cgi-bin directory of your site. This is a special location which allows scripts and programs to be executed.

    CGI routines work best on Apache-style servers. Windows NT and Windows 2000 do support CGI, but it tends to be slow and problematic.

    If you use a remote hosting service, you must remember that although they appear to be giving you this for free, you are actually paying a price. Usually they want to display advertisements, although some of them actually take visitors away from your site.

    When you write a CGI routine, you have the choice of a scripting language like PERL or a compiled language such as C++ or Visual Basic. Anything which can execute on the web server is acceptable.

    I hope this short introduction to CGI has cleared up some of the mystery.

    Article written by Lee

  • Straight From The Horse's Mouth – Get Googlized

    Date: 2011.02.24 | Category: Search Engine Optimization | Response: 0

    Many webmasters wonder how to ensure their sites will be included in Google’s index of web sites. Although Google crawls more than a billion pages, it’s inevitable some sites will be missed. When Google does miss a site, it’s frequently for one of these reasons:

    * The site is not well connected through multiple links to others on the web.
    * The site launched after Google’s last crawl was completed.
    * The design of the site makes it difficult for Google to effectively crawl its content (excessive frames, tables, etc).

    Google’s intent is to represent the content of the Internet fairly and accurately. To help make that goal a reality, we offer this guide to building a “crawler-friendly” site. There are no guarantees a site will be found by our crawler, but following these guidelines should increase the probability that your site will show up in Google search results.

    Do…
    Provide high-quality content on your page – especially your home page.
    If you follow only one tip from this page, this should be it. Our crawler indexes web pages by analyzing the content of the pages themselves. Google will index your site better if your pages contain useful information. Plus, your site has a better chance of becoming a favorite among web surfers and being linked to by others if the information it contains is relevant and useful.

    Submit your site to the appropriate category in a web directory.
    Listing your site in the Open Directory Project http://www.dmoz.org/ or Yahoo! http://www.yahoo.com/ increases the likelihood it will be seen by robot crawlers and web surfers.

    Pay attention to HTML conventions.

    Make sure that your <TITLE> tags and ALT attributes are accurate and descriptive. Also, check your <A HREF> tags for errors since broken or improperly formatted links can prevent Google from indexing your page.
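
    As a quick illustration (the site name, file name and wording here are invented, not from Google), an accurate title and alt text might read:

    <TITLE>Acme Widgets - Hand-Made Copper Widgets</TITLE>
    <IMG SRC="widget.jpg" ALT="A hand-made copper widget">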

    Make use of the robots.txt file on your web server.
    This file tells crawlers which directories can or cannot be crawled. Make sure it’s current for your site so that you don’t accidentally block our crawler. Visit: http://www.robotstxt.org/wc/faq.html for a FAQ answering questions regarding robots and how to control them once they visit your site.

    Ensure that your site is accessible through HTML hyperlinks.
    Generally, your site is crawlable if the pages are connected to each other with ordinary HTML links. If certain areas are not linked, you may be excluding older browsers, differently-abled users, and Google. Google can crawl content from a database or other dynamically generated content as long as it can be found by following links. If you have many unlinked pages, you may want to create a jump page from which the crawler can find all of your pages.

    Build your site with a logical link structure.
    A hierarchical link structure is not only beneficial to you, but also to Google. More of your site can be crawled if it is laid out with a clear architecture.

    Don’t…
    Fill your page with lists of keywords, attempt to “cloak” pages, or put up “crawler only” pages.
    If your site contains pages, links or text that you do not intend visitors to see, Google considers them deceptive and may ignore your site.

    Feel obligated to purchase a search optimization service.
    Some companies "guarantee" your site a place near the top of a results page. While legitimate consulting firms can improve your site's flow and content, others employ deceptive tactics to try and fool search engines. Be careful: if your domain is affiliated with one of these services, it could be permanently banned from our index. We have found that search engine optimization software like Web Position Gold works best but, again, use it in moderation.

    Use images to display important names, content or links.
    Our crawler does not recognize text contained in graphics.
    Use ALT tags if the main content and key words on your page cannot be formatted in regular HTML.

    Provide multiple copies of a page under different URLs.
    Many sites offer text-only or printer-friendly versions of pages that contain the same content as the graphic-enriched version of the page. While Google crawls these pages, duplicates are removed from our index. In order to ensure that we have the desired version of your page, place the other versions in separate directories and use the robots.txt file to block our crawler.
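
    As an example (the directory name /print is hypothetical; use whatever directory holds your duplicate versions), the robots.txt entries would be:

    User-agent: *
    Disallow: /print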

    Article written by a Google employee

  • The Web Safe Color Palette

    Date: 2011.02.24 | Category: WebDesign | Response: 0

    The “Web Safe” palette is a bit controversial. It is a set of 216 colors that are, supposedly, guaranteed to appear as intended on all graphical displays when used in HTML, CSS, and images embedded in Web pages. Many Web developers believe that sticking to these colors is one of the holiest commandments in the Web design scripture.

    This was mostly a concern when most computers had 8-bit color displays; these days, most people run at 16-bit or 24-bit color. Although these bit depths render the Web Safe palette pointless, dithering and quantization bugs in browsers and operating systems still cause problems on 16-bit displays (16-bit, also known as "High Color" mode or "Thousands of Colors," is generally the problematic one). Extensive testing has led to a new palette, called "Really Safe," whose colors are guaranteed to appear correctly on all displays and in all browsers.

    If you use different colors than these, you might see images and backgrounds of the same color appear at a slightly different tint, so that a “box” will be visible around them if the background extends beyond the image’s edges.

    Below is the table of 'Web Safe' colors; the hex codes shown in uppercase are the 'Really Safe' colors.

    000000 000033 000066 000099 0000cc 0000FF
    003300 003333 003366 003399 0033cc 0033ff
    006600 006633 006666 006699 0066cc 0066ff
    009900 009933 009966 009999 0099cc 0099ff
    00cc00 00cc33 00cc66 00cc99 00cccc 00ccff
    00FF00 00ff33 00FF66 00ff99 00FFCC 00FFFF
    330000 330033 330066 330099 3300cc 3300ff
    333300 333333 333366 333399 3333cc 3333ff
    336600 336633 336666 336699 3366cc 3366ff
    339900 339933 339966 339999 3399cc 3399ff
    33cc00 33cc33 33cc66 33cc99 33cccc 33ccff
    33ff00 33FF33 33FF66 33ff99 33FFCC 33FFFF
    660000 660033 660066 660099 6600cc 6600ff
    663300 663333 663366 663399 6633cc 6633ff
    666600 666633 666666 666699 6666cc 6666ff
    669900 669933 669966 669999 6699cc 6699ff
    66cc00 66cc33 66cc66 66cc99 66cccc 66ccff
    66FF00 66FF33 66ff66 66ff99 66ffcc 66FFFF
    990000 990033 990066 990099 9900cc 9900ff
    993300 993333 993366 993399 9933cc 9933ff
    996600 996633 996666 996699 9966cc 9966ff
    999900 999933 999966 999999 9999cc 9999ff
    99cc00 99cc33 99cc66 99cc99 99cccc 99ccff
    99ff00 99ff33 99ff66 99ff99 99ffcc 99ffff
    cc0000 cc0033 cc0066 cc0099 cc00cc cc00ff
    cc3300 cc3333 cc3366 cc3399 cc33cc cc33ff
    cc6600 cc6633 cc6666 cc6699 cc66cc cc66ff
    cc9900 cc9933 cc9966 cc9999 cc99cc cc99ff
    cccc00 cccc33 cccc66 cccc99 cccccc ccccff
    ccff00 ccff33 CCFF66 ccff99 ccffcc ccffff
    FF0000 FF0033 ff0066 ff0099 ff00cc FF00FF
    ff3300 ff3333 ff3366 ff3399 ff33cc ff33ff
    ff6600 ff6633 ff6666 ff6699 ff66cc ff66ff
    ff9900 ff9933 ff9966 ff9999 ff99cc ff99ff
    ffcc00 ffcc33 ffcc66 ffcc99 ffcccc ffccff
    FFFF00 FFFF33 FFFF66 ffff99 ffffcc FFFFFF
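
    Each channel of a Web Safe color is one of the six values 00, 33, 66, 99, cc or ff, so you can test any hex code programmatically. A small JavaScript sketch (the function name is my own invention, not part of any standard):

    function isWebSafe(hex) {
      // strip a leading '#' and normalize the case
      hex = hex.replace('#', '').toLowerCase();
      var safe = ['00', '33', '66', '99', 'cc', 'ff'];
      // test each of the three two-digit channels in turn
      for (var i = 0; i < 6; i += 2) {
        if (safe.indexOf(hex.substring(i, i + 2)) < 0) return false;
      }
      return true;
    }
    // isWebSafe('33cc66') -> true; isWebSafe('123456') -> false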

    Hopefully you will find a use for the two different color palettes that are now available, and you can begin designing for your surfers regardless of which browser they use.

    Article written by Lee

  • The Adult Internet – More Than Just Porn

    Date: 2011.02.24 | Category: General | Response: 0

    So you have several adult websites up and running and, if you are one of the lucky few in this day and age, you have already started to make money from your adult sponsor programs. However, there are many other ways that webmasters can make money in the adult industry in addition to their regular 'porn site' sponsors, yet many webmasters do not realize these vast opportunities are awaiting them.

    Let's spend a few moments looking at the alternative sources of income that we can use in the adult industry, and also take a look at the basics of each type of affiliate system.

    Email Collection.

    Opt-in email is big business, there is no denying this fact, so why not jump on the opt-in email collecting bandwagon? Most email collection sponsors will offer you anywhere from $0.50 to $2.00 per email collected and, better still, to start making money from your existing traffic base all you need to do is include a small email collection box on your sites.

    Software Programs.

    Many of the larger adult affiliate programs now offer their webmasters a range of software products to generate additional income, and many of these products are good for use not only in adult but also in mainstream. For instance, anonymous browsing software, history deletion software, IE toolbars and even submission programs are all good ways to generate additional income, either on a per-download or per-sale basis. Best of all, you only need to add a new banner to the sites you currently have in order to get a share of this lucrative market.

    Herbal Products.

    Penis pills, breast enlargement, vitamin supplements: you name it, you can sell it on adult traffic. Whether you run a TGP, link list or your own collection of free sites, these herbal products can often sell much better than 'porn sites' and, for the most part, the payout levels are the same as, if not higher than, what your regular 'porn site' trial will make you. Again, to get started with this type of sales all you need to do is add a text link or banner to your existing sites.

    Casino Sites.

    Gambling makes BIG money; just look at Vegas for the example. Millions of people a year visit Nevada to gamble, and millions more simply can't get there for a range of reasons. However, almost every home in the United States has a computer and an internet connection, so why not start using this to your advantage? Many of the larger casino affiliate programs will pay based on the amount of funds credited to a new user's account and, to get started making money from casino sites, it is quite literally as easy as adding a banner or text link to your existing sites.

    Pre Paid Services.

    Such as credit cards and the like. Selling these types of products to your adult surfers not only offers them 'anonymity' on the web when they do make online purchases but, at the same time, when your surfers sign up for one of these pre-paid credit cards you will be making some money in the process. Often, however, you will find that these types of affiliate program do not pay out as much 'long term', as they are pretty much a one-time deal: once the surfer spends money to initially 'charge' their pre-paid card with funds, that is when you make money, and not again. Still, even if you can make $50 by selling one of these cards, it's still an additional $50 you wouldn't have made otherwise.

    Adding To Your Profits.

    As you can see from the above, there is a wide range of products and services you can offer your surfers and, as we all know, the more things we can offer our 'customers' the more chance we have of one of them buying something. Of course, as with anything, the more effort we put into marketing these products and services the more chance we have of making a sale. However, unless we try to sell the surfer something other than porn, how will we know whether they will buy or not?

    Article written by Lee

  • SEO Pyramid Scheme

    Date: 2011.02.24 | Category: Search Engine Optimization | Response: 0

    I thought I would spend a little time letting you guys in on a little something I like to call the SEO Pyramid Scheme.

    Basically, we all know the importance of targeting specific keywords and phrases; however, this isn't necessarily the easiest of things to do... until now.

    Let's take a simple free site as our example. We know we have to have meta tags, descriptions, alt texts and body text on our site, but how do we keep all of this within the theme of our site and, more importantly, how do we make sure we target as much of our niche traffic as needed? Actually, the process itself is a simple one that has been around for many years; however, very few people make good use of it.

    So, we have our free site all ready and waiting to be optimized for the search engines. The first thing we need to do is take a look at our site's content (read as: images) and make a short mental description of it. For example, if we have a teen site, the pictures may be of a 'blonde sexy teen model wearing stockings'.

    That description is the basis for our keyword pyramid.

    We now have to construct our pyramid based on that brief description, so we start to break it down word by word, for example:

    Blonde
    Sexy
    Teen
    Model
    Stockings

    That is our primary layer in the pyramid already completed; not so hard really, was it?

    Now comes our second level in our SEO pyramid:

    Blonde Sexy
    Sexy Teen
    Teen Model
    Model Stockings

    As you can see from this, we now have our secondary layer of the pyramid all worked out. On to our tertiary layer:

    Blonde Sexy Teen
    Sexy Teen Model
    Teen Model Stockings

    Now we have the tertiary layer of our pyramid, we can continue this for a further layer like this:

    Blonde Sexy Teen Model
    Sexy Teen Model Stockings

    Again, we can break this down one more level like this:

    Blonde Sexy Teen Model Stockings

    We now have 5 layers to our pyramid.

    The next stage is to incorporate these layers into both your meta tags and, more importantly, your body text.

    Take each layer in turn and, where possible, include one line from each layer in each portion of your HTML code: the meta tags, alt tags, main body text, hyperlink text and image file names. A sketch of how this might look follows.
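
    For example, the layers could be worked into a page's HTML like this (purely a sketch; the file names and exact wording are illustrative):

    <TITLE>Blonde Sexy Teen Model Stockings</TITLE>
    <META NAME="keywords" CONTENT="blonde, sexy teen, teen model, blonde sexy teen, sexy teen model stockings">
    <META NAME="description" CONTENT="Blonde sexy teen model wearing stockings">
    <IMG SRC="blonde-teen-model.jpg" ALT="Blonde sexy teen model wearing stockings">
    <A HREF="gallery1.html">Sexy teen model stockings gallery</A>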

    By working through each layer of this SEO pyramid at a time you will not only discover keyword rich phrases that you could otherwise be missing out on in the search engines but, you will give your sites a theme making your chances of being listed for the correct search terms even better.

    Article written by Lee

  • JavaScript – Redirecting Foreign Surfers

    Date: 2011.02.24 | Category: Scripts, WebDesign | Response: 0

    At some point or another we are no doubt going to need to redirect some or all of our surfers based on the language they speak. This snippet of JavaScript, when placed on your page, will enable you to do just that, without the need for PHP or other more complex scripting.

    Here is the coding that you need to place between your <head> and </head> tags:

    <SCRIPT LANGUAGE="JavaScript1.2">
    <!-- Begin
    if (navigator.appName == 'Netscape')
    var language = navigator.language;
    else
    var language = navigator.browserLanguage;

    if (language.indexOf('en') > -1) document.location.href = 'english.shtml';
    else if (language.indexOf('nl') > -1) document.location.href = 'dutch.shtml';
    else if (language.indexOf('fr') > -1) document.location.href = 'french.shtml';
    else if (language.indexOf('de') > -1) document.location.href = 'german.shtml';
    else if (language.indexOf('ja') > -1) document.location.href = 'japanese.shtml';
    else if (language.indexOf('it') > -1) document.location.href = 'italian.shtml';
    else if (language.indexOf('pt') > -1) document.location.href = 'portuguese.shtml';
    else if (language.indexOf('es') > -1) document.location.href = 'spanish.shtml';
    else if (language.indexOf('sv') > -1) document.location.href = 'swedish.shtml';
    else if (language.indexOf('zh') > -1) document.location.href = 'chinese.shtml';
    else
    document.location.href = 'english.shtml';
    // End -->
    </script>

    To add additional language redirects to this JavaScript, all you need to do is duplicate the following line, changing the 'zh' language code (and the target file name) to those of the language you wish to redirect:

    else if (language.indexOf('zh') > -1) document.location.href = 'chinese.shtml';
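
    The two-letter codes are standard ISO 639-1 language codes. For example, to add a Russian redirect (assuming you have created a russian.shtml page), the duplicated line would become:

    else if (language.indexOf('ru') > -1) document.location.href = 'russian.shtml';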

    Article written by Lee

  • Adult Webmaster Health

    Date: 2011.02.21 | Category: General | Response: 0

    Sounds to me like you are a webmaster. Most of us realize that working at a computer every day can be, and often is, bad for our health in one way, shape or form. However, how do we alleviate these potential problems with our health?

    Well in this article I will detail some of the things that can often affect the webmasters in our industry and how they can be solved.

    RSIs (Repetitive Stress Injuries) are, as you may have figured out already, the result of constantly doing the same movements over and over again using specific parts of your body. One of the most common of these that webmasters are aware of is CTS (Carpal Tunnel Syndrome), which is a result of typing a lot.

    So, how do you combat the effects of CTS? First and foremost, if you experience any form of pain at all, your first port of call should be the doctor; the pain you are feeling could be an indicator of a bigger problem. If you think your pain might be caused by use of the computer, then an occupational therapist might also be a good person to visit.

    CTS is often attributed to use of your digits and poor hand positioning when you type. One way to solve this problem is to go to Office Depot and purchase a wrist rest; this will ensure your wrist has ample support whilst you are working throughout the day.

    You might also like to try re-positioning your monitor. As a general rule of thumb, having your monitor placed about 20 inches away from your face will usually result in good posture, and that in itself can often be a solution to the potential medical problems. One other thing on your posture: get a good chair. One with a high back may be good; these generally offer you more support and can stop that awkward habit of leaning into your keyboard when you type.

    Eyestrain is another common problem that the webmaster faces; often it leads to things such as excessive headaches, fatigue and blurry vision. The most accepted relief from eyestrain is the use of a screen filter that will reduce the glare your monitor emits.

    There are, of course, some other things you can do to alleviate this problem, such as adjusting your monitor so the top of the screen is no higher than eye level and, as already mentioned, keeping the monitor a safe working distance from you; usually between 18 and 30 inches is recommended by doctors.

    We know computers are machines, but we tend to forget that our own bodies are complex machines too, which should be looked after just as our computers are. We often forget that sitting at the PC building what was meant to be a 10 minute site can turn into an hour's worth of 'online work'. Take some time every now and again to stand up and walk for 5 minutes; even if it is just to the local store to buy some more smokes, you are actually getting some exercise and, even though the tar in your smokes will end up killing you anyway, you'll at least be able to work a little longer without getting any problematic computer-related medical symptoms.

    This article is not meant as an alternative to visiting your physician and, should you think that any of the above are relevant to you then it is recommended that you visit your doctors without delay and follow any advice that they give you.

    Article written by Lee.

  • Robots.txt – Control The Robots That Crawl Your Sites

    Date: 2011.02.24 | Category: Search Engine Optimization, WebDesign | Response: 0

    By writing a structured text file you can indicate to robots that certain parts of your server are off-limits to some or all robots. It is best explained with an example:

    # robots.txt file for general use on web servers.

    User-agent: webcrawler
    Disallow:

    User-agent: googlebot
    Disallow: /

    User-agent: *
    Disallow: /cgi-bin
    Disallow: /logs

    The first line, starting with '#', specifies a comment.

    The first paragraph specifies that the robot called ‘webcrawler’ has nothing disallowed: it may go anywhere.

    The second paragraph indicates that the robot called 'googlebot' has all relative URLs starting with '/' disallowed. Because all relative URLs on a server start with '/', this means the entire site is closed off.

    The third paragraph indicates that all other robots should not visit URLs starting with /cgi-bin or /logs. Note the '*' is a special token meaning "any other User-agent"; you cannot use wildcard patterns or regular expressions in either User-agent or Disallow lines.

    Two common errors:

    Wildcards are not supported: instead of 'Disallow: /tmp/*' just say 'Disallow: /tmp'.
    You shouldn't put more than one path on a Disallow line (this may change in a future version of the spec).

    Ultimately, without robots.txt files on your servers/domains, you are risking a variety of potential problems including unauthorized access to your cgi directory, unauthorized viewing of your site stats and possible spamming of the search engines through accidental crawling of doorway pages.

    One distinct advantage of having a robots.txt file on your server is that, quite simply, you will be able to tell when and where your site has been indexed or potentially indexed. All robots will automatically call for the robots.txt file BEFORE any other page on your server, so, as long as you keep an eye open for any calls of this file, you can see who is knocking at your site for indexing purposes.
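
    For example, on a typical Apache setup (assuming your raw access log is a file named access_log; the exact name and location vary by host), you could list the crawler visits with a one-liner like:

    grep "robots.txt" access_log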

    Below is a robots.txt example that you can copy and paste into a text document to use on your own server:

    <!-- Start Copy Below This Line -->

    User-agent: *
    Disallow: /cgi-bin
    Disallow: /logs

    <!-- End Copy Above This Line -->

    The above will allow all spiders to crawl all of your site except the subdirectories 'cgi-bin' and 'logs', which may be altered to suit any subdirectories you do not wish the spiders to crawl on your server.

    Article written by Lee

  • Undeveloped Domains – Put Them To Use

    Date: 2011.02.24 | Category: Domain Names | Response: 0

    Often when searching for new domain names, I come across what should, in theory, be a golden opportunity, only to find the domain itself has already been registered. Whilst this in itself is annoying, what is even more annoying is that the domain 404s when typed into the browser window.

    The mere fact that someone else, a webmaster no less, has thought about purchasing the same domain as you means there is already value in that domain and, more importantly, you have potentially lost a sale.

    So how can we capitalize on this potential lost traffic from the off-set? That is what we will look at in this brief article.

    The first thing we need to do in order to start making some additional potential profit from our domain is to create a 'generic' holding page until such time as we have the time, or funding, to develop the site we had intended to place on our new domain name.

    This holding page can take many forms depending on the type of traffic you are hoping to target with the domain itself. Ideally, you will want to give the surfer (or webmaster) as much choice on this holding page as you can, so you need to assess the best types of sites to use, the best use of the traffic no matter how small it could be and, more importantly, the best way to maximize your sales potential.

    One good way of doing this is to split the page into three sections: two equal-sized sections at the top portion of the page, and one smaller portion towards the very base of the page, designed almost like a footer.

    In the two top portions you should equally distribute both surfer-orientated and webmaster-orientated links, with the two kinds clearly separated.

    For example, on the left side of the page take all of your top converting paysites and list them by niche. They don't have to have fancy or heavy graphics; text links will suffice for now, as this is only a 'temporary' page.

    On the right-hand side of the page place some of your webmaster referral linking codes with a brief description, remembering that not only surfers could hit this page, but also webmasters themselves.

    On the 'footer' portion of the page, the most important section, you should put your contact details: ideally an email address and, if the domain warrants it, details of how you can be reached by instant messenger. The reason for the email and instant messenger details is a simple one: if a webmaster REALLY wants the domain that you have, he or she might just make you an offer on it and, if they have no way to get in touch with you, you have just lost an offer on a domain that you might not get around to using for months. A bare-bones sketch of the whole layout follows.
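
    Purely as an illustration (the link text, addresses and sponsor names here are placeholders, not recommendations), the three-section layout could be as simple as:

    <TABLE WIDTH="100%">
    <TR>
    <TD WIDTH="50%"><!-- left: surfer links, listed by niche -->
    <A HREF="http://sponsor.example.com/">Top Converting Paysite</A>
    </TD>
    <TD WIDTH="50%"><!-- right: webmaster referral links -->
    <A HREF="http://program.example.com/ref/you">Webmaster Affiliate Program</A>
    </TD>
    </TR>
    <TR>
    <TD COLSPAN="2"><!-- footer: contact details -->
    Interested in this domain? Email you@example.com
    </TD>
    </TR>
    </TABLE>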

    Of course, in addition to utilizing the traffic you have on the domain, you can also use this holding page to generate more traffic, for example by placing a banner or button exchange code on the site, or perhaps a counter. The possibilities for generating traffic to these pages are limitless, depending on how you use the holding page itself.

    Well, that's the basics of domain holding pages explained. Hopefully you will have realized that, no matter what you plan on doing with your new domains, after your host has added them to your server the next thing you should do is create a generic holding page that you can upload into the root of the domain and, who knows, you might end up making some money a little sooner from that unused domain name.

    Article written by Lee
