• Converting Japanese Traffic – The Niche Paysites That Work

    Date: 2011.02.21 | Category: Traffic | Response: 0

    Up to this point we have only looked at European traffic sources and the niches that convert for them. Now we will take a look at Japanese adult surfers specifically and see just what makes them tick when they go looking for porn online.

    Japanese Adult Traffic – Dispelling The Myths.

    Contrary to popular belief, Japanese porn surfers are not all looking for Bukkake or Hentai oriented adult sites; in fact, the results we saw were far from backing this up. Also, many webmasters believe Japanese traffic is worthless; again, our results show this is simply not the case.

    Japanese Adult Traffic – The Niches.

    We were extremely pleased with the results from the Japanese traffic we sent through our hub sites. In fact, we altered the type of sponsors we used on our Japanese hubs so that we could see just how well this type of international adult traffic converted and, after reading this article, I think you will agree the results are worth sending your Japanese adult traffic to sites that do not rely on dialers as the only means of revenue from those surfers.

    Asian.

    We set up a small niche-specific ‘Asian’ hub trap that we could use to filter our Japanese traffic through and, as we expected, very little of the traffic actually purchased memberships to these sites supposedly made specifically for Japanese surfers. Instead, all but 2 sales came from sites outside of this ‘Asian’ specific hub.

    Ebony.

    By far our best converting niche on Japanese traffic. The sales figures we saw, from a join perspective, were enough to warrant altering the rest of our hub site to maximize exposure of the Ebony niche sites we were using, getting them in front of the Japanese surfer as quickly as possible.

    Anime / Hentai / Toon.

    Again another surprise: the sales we saw on these niche paysites were lower overall when compared to sites in completely different niches. What makes this even more surprising is that the vast majority of adult webmasters will actively push their Japanese traffic to sites in this niche.

    Bukkake.

    Finding a Japanese Bukkake sponsor proved impossible for us, so we opted to send the traffic to a US-only paysite and, as expected, we had no sales for this niche on the Japanese traffic flowing through our sites. As mentioned above with the Anime niche, this is often the first type of site adult webmasters will send their Japanese-speaking traffic to and, as we suspected, this is a mistake.

    Teen.

    The teen niche converted really well for us. Not as well as the ebony niche did but, nonetheless, we received a steady amount of signups each month. Towards the end of our three month test period this steady signup rate was also boosted by recurring income from previous months’ signups.

    Gay.

    As with the Bukkake niche paysite, we received absolutely no sales on the Japanese gay paysite we used in our hub site. This was surprising as, even with the other international traffic, we had gotten at least one sale a month but, alas, it seems the vast majority of Japanese surfers do not want to see naked men online.

    Amateur.

    Amateur web cam sites specifically converted for us on our Japanese traffic and, again, this was good because of the rebills at the end of the month. In fact, we still have some rebills continuing now, some 5 months after our test period ended.

    Japanese Adult Surfers – An Overview.

    As expected at the start of the test period, Japanese adult surfers are not primarily interested in Bukkake and Anime sites; in fact, it would appear they were primarily interested in the Ebony paysites we had to offer them. This is interesting in itself because, when offered ‘Asian’ niche paysites, we only achieved 2 sales a month compared to the vast number of sales in the Ebony niche. I think it might be worthwhile re-visiting Japanese traffic at a later date so we can evaluate exactly how their preferences develop over time.

    One other thing we discovered when testing the Japanese traffic was that, as we have been saying for a long time, Japanese surfers do hold credit / debit cards and will use them online if their needs are matched. This is good because if, like us, you use recurring sponsors on this type of traffic, you will see some good long term residual income.

    Article written by Lee

  • Straight From The Horse’s Mouth – Get Googlized

    Date: 2011.02.24 | Category: Search Engine Optimization | Response: 0

    Many webmasters wonder how to ensure their sites will be included in Google’s index of web sites. Although Google crawls more than a billion pages, it’s inevitable some sites will be missed. When Google does miss a site, it’s frequently for one of these reasons:

    * The site is not well connected through multiple links to others on the web.
    * The site launched after Google’s last crawl was completed.
    * The design of the site makes it difficult for Google to effectively crawl its content (excessive frames, tables, etc).

    Google’s intent is to represent the content of the Internet fairly and accurately. To help make that goal a reality, we offer this guide to building a “crawler-friendly” site. There are no guarantees a site will be found by our crawler, but following these guidelines should increase the probability that your site will show up in Google search results.

    Do…
    Provide high-quality content on your page – especially your home page.
    If you follow only one tip from this page, this should be it. Our crawler indexes web pages by analyzing the content of the pages themselves. Google will index your site better if your pages contain useful information. Plus, your site has a better chance of becoming a favorite among web surfers and being linked to by others if the information it contains is relevant and useful.

    Submit your site to the appropriate category in a web directory.
    Listing your site in the Open Directory Project http://www.dmoz.org/ or Yahoo! http://www.yahoo.com/ increases the likelihood it will be seen by robot crawlers and web surfers.

    Pay attention to HTML conventions.

    Make sure that your <TITLE> and <ALT> tags are accurate and descriptive. Also, check your <A HREF> tags for errors since broken or improperly formatted links can prevent Google from indexing your page.
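    As a quick illustration (the site name, file name and URL below are invented for the example), accurate and descriptive tags might look like this:

    <TITLE>Widget World - Hand Made Wooden Widgets</TITLE>
    <IMG SRC="widget.gif" ALT="Photo of a hand made wooden widget">
    <A HREF="http://www.example.com/catalog.html">Browse the widget catalog</A>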

    Make use of the robots.txt file on your web server.
    This file tells crawlers which directories can or cannot be crawled. Make sure it’s current for your site so that you don’t accidentally block our crawler. Visit: http://www.robotstxt.org/wc/faq.html for a FAQ answering questions regarding robots and how to control them once they visit your site.

    Ensure that your site is accessible through HTML hyperlinks.
    Generally, your site is crawlable if the pages are connected to each other with ordinary HTML links. If certain areas are not linked, you may be excluding older browsers, differently-abled users, and Google. Google can crawl content from a database or other dynamically generated content as long as it can be found by following links. If you have many unlinked pages, you may want to create a jump page from which the crawler can find all of your pages.

    Build your site with a logical link structure.
    A hierarchical link structure is not only beneficial to you, but also to Google. More of your site can be crawled if it is laid out with a clear architecture.

    Don’t…
    Fill your page with lists of keywords, attempt to “cloak” pages, or put up “crawler only” pages.
    If your site contains pages, links or text that you do not intend visitors to see, Google considers them deceptive and may ignore your site.

    Feel obligated to purchase a search optimization service.
    Some companies “guarantee” your site a place near the top of a results page. While legitimate consulting firms can improve your site’s flow and content, others employ deceptive tactics to try and fool search engines. Be careful: if your domain is affiliated with one of these services, it could be permanently banned from our index. (We have found search engine optimization software like Web Position Gold works best but, again, use it in moderation.)

    Use images to display important names, content or links.
    Our crawler does not recognize text contained in graphics.
    Use ALT tags if the main content and key words on your page cannot be formatted in regular HTML.

    Provide multiple copies of a page under different URLs.
    Many sites offer text-only or printer-friendly versions of pages that contain the same content as the graphic-enriched version of the page. While Google crawls these pages, duplicates are removed from our index. In order to ensure that we have the desired version of your page, place the other versions in separate directories and use the robots.txt file to block our crawler.
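    For example, assuming your printer-friendly copies live in a directory called /print (the directory name is our own invention), the robots.txt entry would read:

    User-agent: *
    Disallow: /print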

    Article written by a Google employee

  • Favicon.ico – What Does It Do?

    Date: 2011.02.22 | Category: WebDesign | Response: 0

    Favicon.ico is the name of the graphic Internet Explorer 5+ uses in the address bar and when someone views their favorite bookmarks. There should be one beside the address of this page now if you are using IE5+. If you want to see favicon.ico in action among your favorites, bookmark our site now by right clicking and selecting ‘Add to favorites’.

    Internet Explorer looks for this file in the same directory as the HTML page currently being displayed; if it can’t find favicon.ico it will display the default Internet Explorer icon in the address bar. As for viewing of favorites, IE will check its temporary folder to see if favicon.ico is there; again, if it is not located, IE will display the default white background with a blue ‘e’ icon.

    For a webmaster there are three main advantages to using the favicon.ico ‘trick’.

    The first is that it helps to brand your site with a nice little icon that is easy to recognize.

    The second is that it makes your website look more professional.

    The third is that your entry will stand out in surfers’ bookmarks over the others. This is especially good because, if you can get a surfer back to your site, you have another chance at making a sale.

    Many internet users have a multitude of site bookmarks, so you need to use favicon.ico to give you an edge. I highly recommend using it and now I’m going to tell you how.

    First, you will need to create an icon file which is exactly 16 x 16 pixels. If the icon is larger or smaller, IE5+ will just ignore it. As for the colors, 16 is standard. You can use more colors if you want but the more colors you use, the larger the .ico file becomes and the longer it takes to load.

    You now know the standards the favicon.ico file has to meet, so now to actually create it. The easiest way of creating a favicon.ico file is to convert an existing 16 x 16 BMP or GIF graphic with 16 – 32 colors into a .ico file using converter software, making sure to save it as favicon.ico.

    Once you have created your favicon.ico file, all you need to do is upload it to each directory on your server that contains HTML pages. This way, when IE5+ searches for favicon.ico, it will be able to find it regardless of which page the surfer is on.

    That nearly covers all there is to favicon.ico, apart from one thing: what if you want different icons for different parts of your website? Can this be done? The answer is yes, it can. All you have to do is place the following HTML code between the <head> and </head> tags of your web page.

    <LINK REL="SHORTCUT ICON" HREF="differenticon.ico"> (SHORTCUT ICON should be kept in uppercase).

    Now when someone adds a web page with that code to their favorites, IE5+ will not look for favicon.ico but for differenticon.ico; if it is there it will be displayed, if not, the default icon will be shown.
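    For instance, a members area page could carry its own icon like this (the page title and icon filename here are hypothetical):

    <head>
    <title>My Site - Members Area</title>
    <LINK REL="SHORTCUT ICON" HREF="membericon.ico">
    </head>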

    Using favicon.ico or the SHORTCUT ICON code is a nice way to add a unique touch to your site and, of course, it will result in more repeat visitors than if you were not using it – which is always good for any webmaster.

    Article written by Lee

  • Using JavaScript To Auto Scroll Text

    Date: 2011.02.24 | Category: Scripts | Response: 0

    There may come a time when you would like to have some text on a page that is simply too big to fit on a single screen. Of course, you could always create a new document for this text but what if you could make the text actually scroll through the surfer’s browser?

    The following JavaScript will do just that.

    Place the following section of JavaScript coding between your <head> and </head> tags:

    <SCRIPT LANGUAGE="JavaScript">
    <!--

    // Scrolls the window down one pixel row at a time, to a depth of 1200 pixels.
    function scrollit() {
    var I;
    for (I = 1; I <= 1200; I++) window.scroll(0, I); // window.scroll(x, y)
    }

    // -->
    </SCRIPT>

    Along with the following JavaScript coding somewhere in the body of your page:

    <FORM>
    <INPUT type=button value="scroll" onClick="scrollit()">
    </FORM>

    Have a play around with the numbers in the first section of the JavaScript to speed up or slow down the rate of scrolling until you find a speed that is easy on the eye.
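    If you would rather control the speed directly, here is a rough sketch of a timer-based variation; the function names and the step and delay values (2 pixels, 10 milliseconds) are our own choices, not part of the original script:

    <SCRIPT LANGUAGE="JavaScript">
    <!--
    var pos = 0;
    var timer = null;

    // Move down 2 pixels per tick; raise the step size to scroll faster.
    function scrollstep() {
    pos += 2;
    window.scroll(0, pos);
    if (pos >= 1200) clearInterval(timer); // stop at the same depth as before
    }

    function scrollit2() {
    timer = setInterval(scrollstep, 10); // one tick every 10 milliseconds
    }
    // -->
    </SCRIPT>

    Hook scrollit2() up to the same style of button shown above.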

    Article written by Lee

  • Why Can’t I Get Indexed By The Search Engines?

    Date: 2011.02.24 | Category: Search Engine Optimization | Response: 0

    Unfortunately, this is an all too common question. If it makes you feel any better, you’re not the only one frustrated about the length of time it takes to be indexed, or the many pitfalls involved. It often takes anywhere from two days to as much as six months to be listed on a search engine. For example, last month Excite finally updated its index for the first time since last August! Luckily, Excite is the most extreme case lately, but waiting several weeks to a month can also be extremely frustrating, especially when your livelihood depends partly on these search engines.

    The Web Position Submitter report will give you current time estimates for each engine so you’ll know what to expect. However, an engine at any time could choose to delay their indexing beyond the “norm” for maintenance or other reasons. On the flip side, you could get lucky and submit just a couple days before an engine does a complete refresh of their database. Therefore, submission times can never be an exact science since we’re all ultimately at the mercy of the engine.

    If you’ve submitted your site and have waited the estimated time to be indexed and there’s still no listing, what do you do now?

    Here are 16 tips that should help you solve this problem:

    1. First, be sure you’re not already indexed but just don’t know it. Unfortunately, none of the major engines are kind enough to e-mail or notify you as to whether and when you’ve been indexed.

    The method to determine if a page or domain has been indexed varies from one engine to another, and in many cases, it’s difficult to tell for sure. Never assume that you’re not indexed just because you searched for a bunch of keywords and you never came up in the first few pages of results. You could be in there but buried near the bottom.

    In addition, it’s not very practical to check the status of a number of pages on each major engine each week. Fortunately, Web Position has a URL verification feature in the Reporter that makes this process much easier. Each time you run a mission, it will report which URLs exist and do not exist in each engine. If you’re using Web Position and are not finding your URLs after submitting, be sure to see this page for common pitfalls to watch out for:

    http://www.webposition.com/urlnotfoundhelp.htm

    2. Make sure you have uploaded the pages to your site before submitting them. This one seems obvious, but submitting a page that does not exist or submitting with a subtle typo in the URL is a goof we might all make at one time or another. If you’re using Web Position’s Submitter, there’s a checkbox on tab 2 that forces Web Position to verify that all your URLs are valid before submitting them.

    3. If you have information inside frames, that can cause problems with submissions. It’s best if you can create non-framed versions of your pages. You should then submit the non-frames versions of your pages which can of course point to your framed Web site. Alternatively, you can enter your relevant text within the NOFRAMES area of a framed page which most search engine spiders will read.
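    As a rough sketch of the NOFRAMES approach (the file names here are invented), the relevant text and links sit inside the frameset like this:

    <FRAMESET COLS="20%,80%">
    <FRAME SRC="menu.html">
    <FRAME SRC="main.html">
    <NOFRAMES>
    Welcome to our site. <A HREF="main.html">Enter here</A> to browse without frames.
    </NOFRAMES>
    </FRAMESET>

    Spiders that cannot handle frames will read and index the text and the link inside the NOFRAMES block.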

    4. Search engine spiders cannot index sites that require any kind of registration or password. A spider cannot fill out a form of any kind. The same rule applies regarding indexing of content from a searchable database, because the spider cannot fill out a form to query that database. The solution is to create static pages that the engines will be able to find.

    5. Dynamic pages often block spiders. In fact, any URL containing special symbols like a question mark (?) or an ampersand (&) will be ignored by many engines.

    6. Most engines cannot index text that is embedded in graphics. Text that appears in multimedia files (audio and video) cannot be indexed by most engines. Information that is generated by Java applets or in XML coding cannot be indexed by most engines.

    7. If your site has a slow connection or the pages are very complex and take a long time to load, it might time out before the spider can index all the text. For the benefit of your visitors and the search engines, limit your page size to less than 60K. In fact, most Webmasters recommend that your page size plus the size of all your graphics should not exceed 50K-70K. If it does, many people on dial up connections will leave before the page fully loads.

    8. If you submit just your home page, don’t expect a search engine to travel more than one or two links away from the home page or the page that you submitted. Over time they may venture deeper into your site, but don’t count on it. You’ll often need to submit pages individually that appear further down into your site or have no link from the home page.

    9. If your Web site fails to respond when the search engine spider pays a visit, you will not be indexed. Even worse, if you are indexed and they pay a visit when your site is down, you’ll often be removed from their database! Therefore, it pays to have a reliable hosting service that is up 99.5% of the time. However, at some point a spider is going to hit that other 0.5% and end up yanking your pages by mistake. Therefore, it pays to keep a close eye on your listings.

    10. If you have ever used any questionable techniques that might be considered an overt attempt at spamming (i.e., excessive repetition of keywords, same color text as background, or other things that the Web Position Page Critic warns you about), an engine may ignore or reject your submissions. If you’re having trouble getting indexed in the expected amount of time, make sure your site is spam-free.

    11. If your site contains redirects or meta refresh tags, these can sometimes cause the engines trouble when indexing your site. Generally they will index the page that is being redirected TO, but if the engine thinks you are trying to “trick” it by using “cloaking” or IP redirection technology, there’s a chance it may not index the site at all.
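    For reference, a meta refresh tag looks like the following (the destination URL is a placeholder); note that a zero second delay, as shown here, is precisely the pattern engines treat with the most suspicion:

    <meta http-equiv="refresh" content="0;url=http://www.example.com/newpage.html">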

    12. If you’re submitting to a directory site like Yahoo, Open Directory, NBCI.com, LookSmart, or others, then a human being will review your site. They must decide the site is of sufficient “quality” before they will list it. I recommend you read the submission guide on the directory tab of the WebPosition Submitter. It contains tips to improve your chances of obtaining a good listing on these directories.

    13. A number of engines no longer index pages residing on many common free web hosting services. The common complaint from the engines is that they get too many “junk” or low-quality submissions from free web site domains. Therefore, they often choose not to index anyone from those domains or they limit submissions from them. It’s always best to buy your own domain name (very important) and place it on a respected, paid hosting service to avoid being discriminated against.

    14. Some engines have been known to drop pages that cannot be traveled to from the home page. HotBot has been rumored to do this. You may want to consider submitting your home page that links either directly or indirectly to your doorway pages.

    15. Make sure you’re submitting within the recommended limits. Some engines do not like more than a certain number of submissions per day for the same domain. If you exceed the limit, you may find that all your submissions are ignored. Fortunately, WebPosition’s submitter will warn you regarding current limits and recommend you stay within them. Some submission consultants feel it is dangerous to submit more than ONE page a day to an engine for a given Web site. For those who wish to be ultra-conservative in their approach, the Web Position Submitter includes a checkbox to limit submissions to one URL per day per engine.

    16. Last but not least, sometimes the engines just lose submissions at random through technical errors and bugs. Therefore, some people like to resubmit once or twice a month for good measure in case a submission does get lost. Certainly if you’ve followed all the “rules” and are still not listed, re-submit! Sometimes a little persistence is all that’s needed.

    If any of the above scenarios apply to your submission, you should make the necessary adjustments and re-submit. If that still does not work, you should consider e-mailing or calling the search engine and asking them politely why you have not been indexed yet. Sometimes they will reply back with “Sorry, there was a problem with our system and I’ve now made sure you’ll be indexed within the next couple days.” Or, sometimes they’ll tell you why you were not indexed. In other cases, they will ignore your e-mail and you’ll have to keep e-mailing or calling them until they respond. Still, it’s definitely worth the effort to get your site listed with the major engines assuming you also take the time to optimize your pages so you’ll achieve top rankings.

    Article written by Lee

  • What’s A Twink?

    Date: 2011.02.24 | Category: General | Response: 0

    I think you would be surprised at the number of times I have been asked, “What’s a Twink?” Or my personal favorite, “So how do two men have sex?”

    If you know me, you know I am never really bothered by questions – I never mind people’s candor. What does bother me, though, is that the bulk of the people who ask are trying to work the gay market. Unsuccessfully, I might add.

    So, let’s expand our webmaster knowledge. Don’t worry, I am not about to explain how two men have sex (that I will save for my story site LOL)

    I have decided, however, to give you a Gay Glossary of sorts. There are many terms that could be included here but, to start off, we will hit the basics and add to them as time goes on. Keep an eye out for future articles and additions on Gay Wide Webmasters.

    Here is a list of the most common terms on the Gay Adult ‘net:

    • Twink – A buff and lean young man; a 20 something. Age and leanness make the guy a twink. The best example is your typical bar room stripper.
    • Teen – This is the same thing as in EVERY market: hot, young and under twenty – but LEGAL at 18.
    • Hunk/Stud – Beautiful, built and beefcake. The guys commonly posing for calendars and such.
    • Chubs (or chubby) – Just what it sounds like, a larger man. Not just muscular weight, most often heavy set.
    • Bears – This is a HAIRY man. Not a young guy, most often over thirty, sometimes with a husky build (but not always). You will probably see this niche sold as “Real Men”.
    • Cub – This is a YOUNG hairy guy. Sort of a HAIRY TWINK…
    • Daddies – Daddies are men over thirty, who like younger guys. (Just a side note here to clear up some misconceptions, not all gay men want to do young guys!)
    • Sons – the young male counterparts to Daddies. The younger of the two can be a twink, cub or teen – it makes no difference. This term is based on the visual couple. For our non-gay friends, think of this as an example: the businessman who goes away for the weekend with his NIECE. See the comparison? LOL
    • Transvestite – a man who dresses as a woman, whether it is simple underwear or all-out female attire. A lot of webmasters confuse this with Transsexual.
    • Transsexual – a person who has decided to make the complete change and undergo a sex change.
    • Transgender – a more general term for people who live life as the opposite sex. They are in the process of, or have completed, a sex change.

    So now go have some fun. Try to classify your friends and lovers in these terms! I enjoy it when a webmistress comes up to me and says, “I heard your interview and I think my husband is a bear. But he’s a little twink-ish. What do you think?” This just makes my day! I have to laugh at the look on their husbands’ faces when they think I’m going to ask to examine them or something. “Turn your head and cough – OH! You’re a Twink…”

    Along with what works, I have to offer some terms that DO NOT work. If used incorrectly or out of context, then your attempt at marketing to gay men can be dreadful.

    • Faggot – I hate this word, personally. This is a word that is as derogatory to the Gay Community as other words are to a race.
    • Nancy boy – a typically non-US term, used in a derogatory manner.
    • Lil Boys – a derogatory term used primarily in the Southern and Mid-Western US. It’s derived from the negative connotation that gay men are pedophiles.
    • Gay Owned and Operated – Allow me to be blunt: if it is not true, do not say it. This is probably the most OVER used phrase in the gay adult market. So much so, it is relatively meaningless today.

    We all need to remember from time to time that the key to success in any business is knowledge. Whether you are upselling to a sponsor or designing sites, knowing some of the more common terminology in the gay community can only help your ventures in the gay adult market. And remember, do not be afraid to try and NEVER be afraid to ask.

    As always, good luck!

    Article written by Gary-Alan

  • Robots.txt – Control The Robots That Crawl Your Sites

    Date: 2011.02.24 | Category: Search Engine Optimization, WebDesign | Response: 0

    By writing a structured text file you can indicate to robots that certain parts of your server are off-limits to some or all robots. It is best explained with an example:

    # robots.txt file for general use on web servers.

    User-agent: webcrawler
    Disallow:

    User-agent: googlebot
    Disallow: /

    User-agent: *
    Disallow: /cgi-bin
    Disallow: /logs

    The first line, starting with ‘#’, specifies a comment.

    The first paragraph specifies that the robot called ‘webcrawler’ has nothing disallowed: it may go anywhere.

    The second paragraph indicates that the robot called ‘googlebot’ has all relative URLs starting with ‘/’ disallowed. Because all relative URLs on a server start with ‘/’, this means the entire site is closed off.

    The third paragraph indicates that all other robots should not visit URLs starting with /cgi-bin or /logs. Note the ‘*’ is a special token meaning “any other User-agent”; you cannot use wildcard patterns or regular expressions in either User-agent or Disallow lines.

    Two common errors:

    * Wildcards are not supported: instead of ‘Disallow: /tmp/*’ just say ‘Disallow: /tmp’.
    * You shouldn’t put more than one path on a Disallow line (this may change in a future version of the spec).

    Ultimately, without a robots.txt file on your servers/domains you are risking a variety of potential problems, including unauthorized access to your cgi directory, unauthorized viewing of your site stats and possible spamming of the search engines through accidental crawling of doorway pages.

    One distinct advantage of having a robots.txt file on your server is that, quite simply, you will be able to tell when and where your site has been indexed or is about to be. All robots will automatically call for the robots.txt file BEFORE any other page on your server so, as long as you keep an eye open for calls to this file in your logs, you can see who is knocking at your site for indexing purposes.

    Below is a robots.txt example that you can copy and paste into a text document to use on your own server:

    <!--Start Copy Below This Line-->

    User-agent: *
    Disallow: /cgi-bin
    Disallow: /logs

    <!--End Copy Above This Line-->

    The above will allow all spiders to crawl all of your site except the subdirectories ‘cgi-bin’ and ‘logs’, which may be changed to whichever subdirectories you do not wish the spiders to crawl on your server.

    Article written by Lee

  • Google – Manipulate Your Listings For More Traffic

    Date: 2011.02.22 | Category: Search Engine Optimization | Response: 0

    Google seems to be the search engine that everyone talks about almost daily. However, rather than covering search engine optimization generally, or even Google-specific SEO, I wanted to touch on something new that perhaps you have not already thought about.

    Manipulating Google Traffic.

    So your site is already listed in Google but you want to increase the amount of traffic you receive. One method we as a company have been using successfully for the last 18 months is manipulating the display of our clickable links in Google. How are we doing this? Simple: using ASCII character codes in your meta tags.

    ASCII In Your Meta Tags.

    ASCII (American Standard Code for Information Interchange) codes, it has been proven, will, when used in your HTML page Meta Tags, display not only letters and numbers but symbols as well. As a webmaster this can give you a great advantage in how much traffic you can pull from your Google listings, and not only Google; a few of the other search engines also read ASCII codes found in your Meta Tags.

    Putting This To Use Practically.

    A good method of putting this to use would be structuring your Meta Tags as follows.

    <title>Site Name <ascii code here> page title</title>
    <meta name="Description" content="normal page description">
    <meta name="Keywords" content="normal keywords">

    This will produce a search engine listing that not only has your site name and description on it but, in the position where the ASCII code appears, an attention grabbing symbol, ensuring that your site stands out from all the others listed on the same search engine results page as yours.
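    As a concrete, made-up example (the site name and the character we picked are our own), character code 187 (strictly speaking, an HTML character reference) produces the » symbol, so a title tag of:

    <title>Site Name &#187; Free Picture Galleries</title>

    would show the arrow symbol in the middle of your clickable Google listing; any other displayable character code can be swapped in.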

    ASCII Meta Tags – An Overview.

    Hopefully you have seen how adapting your current HTML page Meta Tags by placing an ASCII character code within them can benefit you in gleaning further search engine traffic to your sites and, with this new found knowledge, you may well place both your sites’ traffic and your bottom line profits ahead of other webmasters.

    Article written by Lee

  • Community Forum Scripts

    Date: 2011.02.21 | Category: Scripts | Response: 0

    Whether you have little or much traffic, one thing that will enable you to benefit more from it is giving your surfers the sense that they are part of a ‘community’. Much like the message boards built around adult webmasters have the feel of a community to them, your surfers will end up staying around a lot longer if you can give them a reason to come back to your site over anyone else’s.

    That said, one of the main requirements when starting your own little porn community is a place where all the ‘action’ takes place. What better way than your own message forum?

    However, here we hit our first problem: if you have never thought about starting a forum before, you won’t know where to look for scripts or which one is best. That’s the reason behind this little guide.

    Below you will find a short breakdown of the more commonly used message forum scripts along with a short list of features each one has to offer you when choosing to build your own online community.

    VBulletin http://www.vbulletin.com

    This is actually quite a good forum script; most of the main forums on the web use either VBulletin or PhpBB when it comes to threaded forums. VBulletin uses MySQL and PHP to run the actual forum and set-up can become a little tricky if you want to customize it to your exact needs and color scheme. That said, once you overcome the initial complications in setting the forum script up, it is easy to handle and, with a price tag of only $160.00, it’s an affordable option for many.

    Ultimate Bulletin Board http://www.infopop.com

    Unlike VBulletin, UBB uses Perl for the forum script, which means you can install it on any CGI-enabled host. However, the one main drawback with this script is that it uses flat text files to store all the data, which can sometimes bog down your server with unnecessary file calls. The cost of $199.00, however, can be a bit too expensive for most webmasters.

    Ikon Board http://www.ikonboard.com

    Ikon Board is a nice little threaded forum script and the fact that it is free to download makes it even more so. However, even though it can use MySQL for the backend, or store your data in flat text files, this script just seems a little too ‘basic’ looking for my personal taste. That said, there are many types of site using IkonBoard so the script must be good enough for them to be using it. Then again, maybe it’s the price tag of $0.00 that has made this a popular choice.

    PhpBB http://www.phpbb.com

    PhpBB is probably the most well known free forum script on the web today; it’s easy to customize (within reason) and supports PHP and MySQL functions, yet seems to offer little beyond that. The basics of a forum are there; however, every webmaster and their closest friend seems to be using this script. The whole idea of building up a community is so that your site can stand out from the crowd and, in my humble opinion, the script does what it is supposed to but doesn’t add any redeeming qualities to your site.

    Site Net BBS http://www.focalmedia.net

    Sitenet BBS, formerly known as Netboard, is probably one of the better Perl based forum scripts on the market. Its price tag of $69.00 makes it an affordable choice for almost everyone and the installation process itself is VERY simple to understand. The one drawback that I have found with this forum script, however, is that it stores the data in flat text files, which can actually slow the server down quite a lot, making connections to the forum time out on numerous occasions. However, customization of the script is very easy: using only HTML based templates, you do not need any additional programming skills, and it has a nice interface with a few good features. For the price it is well worth a look. They also offer a freeware version; however, the links on the bottom of the forum become annoying after a while.

    In summary, there are a lot of popular forum scripts available for webmasters to start using. Some are free, others require payment; ALL have a range of different functions available in them.

    Before looking at installing any of the scripts you should always try a demo first to see which one has the features and benefits you would like to offer your community members.

    If you can get your base community built up on a forum they enjoy using, then all the rest of your marketing should pay dividends long term.

    Article written by Lee.

  • Doorway Pages

    Date: 2011.02.22 | Category: Traffic, WebDesign | Response: 0

    A doorway page is built to rank high for a particular keyword or search phrase. When your doorway page is visited by the searcher it simply has a “click me” button which links to your web site. The major search engines accept these pages as long as the end result does provide what the searcher is looking for. If you are discovered to be using doorway pages for irrelevant keywords you can expect to have your entire domain unlisted. For instance, one of our doorway pages is built to rank high for the search term “Adult Content”, which is quite acceptable because when the searcher gets here he can find links to adult content providers from our site. If, however, he couldn’t find links to adult content then we would risk the wrath of the all powerful search engines.

    Once you have selected your keywords and phrases (about 50 would be the norm) you need to build a doorway page for each keyword and, ideally, for each major search engine. All the major engines look for different keyword density in the text, title and description, so you need to do some research by searching on your chosen phrase or keyword and studying the top 10 results at each engine (if you can spot them, study high ranking doorway pages).

    Make notes of how many times the phrase or keyword is used in the title, description and body text. Is it used in header text, etc.? Once you have built a picture of what your doorway page should look like you can build it using your favorite HTML editor. Once you have done one for a particular engine, the other 50 or so pages can be done by simply swapping one keyword for another. After you have done a page for each keyword, move on to the next search engine. Remember that your body text is not important, but make each page unique or it may be considered spamming. It is your keyword or phrase density that you are attempting to get right. The page, when visited by a person, will be recognized for what it is, simply a link to the real content, so don’t worry too much about what it looks like. When a search engine spider visits the page it is only interested in counting the keyword density in your body text.

    As you have probably worked out, there is quite a lot of work involved: 50 phrases or keywords = 50 pages, multiplied by the top 8 search engines = 400 pages. In reality, though, you are only creating 8 doorways, 1 page for each search engine. For all the other pages you simply have to substitute one keyword for the next and alter the text around so as not to finish with 50 identical pages. OK, nearly finished, but the next step is crucial if your doorways are to work.

    Put all your pages in a folder on your server, e.g. mydomain/doorways/. Then create 2 more pages that have a link and a small description to each of your doorways (200 on each) and to each other. These are called corridor pages because the spider travels down them, visiting each page linked off them. Call these 2 pages index and home and place them in the folder with your doorway pages. Then place links to these two pages from your main index page so that a spider will be able to find them and list all your doorways. Submit only your index page and your 2 corridor pages; do not submit your doorways to the engines. A rough sketch of a corridor page follows below.
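    A corridor page needs nothing fancy; the file names and descriptions in this sketch are invented for illustration:

    <html>
    <head><title>Site Guide 1</title></head>
    <body>
    <a href="adult-content.html">Adult Content</a> - picture and link reviews.<br>
    <a href="free-galleries.html">Free Galleries</a> - updated daily.<br>
    <!-- ...one link and short description per doorway page, up to 200... -->
    <a href="home.html">Site Guide 2</a>
    </body>
    </html>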

    Ideally, and if you can afford it, you should set up a new domain and host for your doorways. This isn’t essential but it does have benefits. If you do get over-zealous with your doorways and an engine bans you for spamdexing, at least your real site will be safe. Engines rank sites higher if they have lots of links to them from other domains. And you can put your most important keywords in your new domain name, which will increase the ranking of the doorway pages containing those keywords.

    Now all you have to do is wait for the engines to update their databases. We have had the best results with Google.

    Article written by Lee
