Search Engine Cloaking

Sunday January 22, 2006 JST

Search Engine Cloaker


Do you need Search Engine Cloaker?

Cloaking is a technique used to display different pages to search engine spiders than the ones normal visitors see. The ability is useful because good search engine optimization often requires sacrificing some of a page's visual appeal and making its textual content more search-engine friendly. As a result, a well-optimized page may look unattractive to human visitors.

“Do I need to cloak?” If you’re fighting for extremely competitive keywords, it might be a good idea to consider cloaking once you’re familiar enough with search engine optimization techniques to get the most out of the benefits cloaking can bring.
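At its core, most cloaking comes down to a User-Agent check on each request. The sketch below is an illustration only, not the actual logic of any cloaking product; `BOT_SIGNATURES` is a hypothetical short list of spider names:

```python
# Minimal sketch of user-agent cloaking (illustration only).
# BOT_SIGNATURES is a hypothetical, incomplete list of spider names.
BOT_SIGNATURES = ("googlebot", "slurp", "msnbot")

def select_page(user_agent: str) -> str:
    """Return which version of the page to serve for this User-Agent."""
    ua = user_agent.lower()
    if any(sig in ua for sig in BOT_SIGNATURES):
        return "optimized.html"   # text-heavy page for spiders
    return "visitor.html"         # visually rich page for humans

print(select_page("Googlebot/2.1 (+http://www.google.com/bot.html)"))
# -> optimized.html
print(select_page("Mozilla/4.0 (compatible; MSIE 6.0)"))
# -> visitor.html
```

Real cloaking scripts typically also check the visitor's IP address against known spider IP ranges, since the User-Agent header is trivial to spoof.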

Get your copy of Search Engine Cloaker

Webmaster World - I think I get It

At first, I didn’t get it. While some of us thought he had lost his mind, we knew that Brett Tabke was no dummy. You don’t build the most successful webmaster forum on the web without understanding a thing or two about a thing or two.

As you may already know, he is now writing a blog on his robots.txt file (which is pretty funny). Clearly, he is cloaking to send the real robots.txt file to the search engine bots. Google, Yahoo, and MSN all know this - they don’t care. Getting banned just for cloaking is a myth.

As I see it, the bandwidth from the real search engine bots is obviously an acceptable cost of doing business. It’s all those other bots from scraper sites that you want to get rid of: the ones hitting 4 million pages per day each and pirating your content.

This blog entry about bot behavior and his experiment was interesting:

# After a lot of testing and bot busting, the current robots.txt is what was
# settled on. I felt exposing the code was the best way to explain it all
# (see the actual robots.txt above for the full story).
#
# Testing the bots code and the security code to get it all right took a lot
# of time. In the end, we found:
#
# - A surprising 21 bots that were following all the active list posts on a
# daily basis and downloading that content.
#
# - About 45 trademark and other page monitoring services. The majority of
# those monitoring services obey robots.txt.
#
# - 15 bots would accept cookies.
#
# - 2 more web sites reselling WebmasterWorld content. One in China and one
# in the stans. Both out of legal reach.
#
# Sorry Shak - China will continue to be viewed with suspicion as long as
# it is still the wild-wild-west out there with few legal controls to
# protect content.
#
# - about 30 people who don’t understand the concept that if you look like
# a bot with a spoofed agent name - you are a bot.
#
# - Some of the worst bot running offenders? A few choice SEO’s.
# These are the same folks that bring you click bots and scraper sites.
# I think they have little respect for other people’s content. I also think
# they don’t appreciate just how impactful their actions can be.
#
# - http://www.ojr.org/ojr/stories/051213niles/
#
# Thanks to Yahoo and MSN for the permission to treat your bots as if they
# were a tough steak during the testing and coding phase. Cloaking stuff for
# testing went a long way to being able to figure out the right
# balance of settings.

I don’t think Tabke was banned or hacked, and he wasn’t trying to prove, out of pride, that he could do without search engine traffic. Real search engine bots he can deal with; it’s all the other bots that he wanted to eliminate. It was a strategic business decision to get rid of bandwidth-leeching non-search-engine bots run by people taking his paid content for free, and one that will likely pay off in the long term.

http://seoblackhat.com/2006/01/03/webmaster-world-i-think-i-get-it/

An SEO Ancestor’s View of A-List SEO Lists

I couldn’t help but add my thoughts on the recent commentary about the Search Engine Optimization field’s “A-lists” and “generations” of SEOs by Andy Hagans, Danny Sullivan, Todd Malicoat, Rand Fishkin and Graywolf (who is right on about Mike Grehan).

I didn’t make any lists but, as an SEO Ancestor, I can still spin a tale or two.

When I Was Young, I Walked Five Miles to Search Engine School. Uphill. With No Shoes

When I started teaching myself SEO in 1996, Danny Sullivan, Jill Whalen and Fantomaster were my teachers. I launched Cre8pc.com that same year to begin keeping track of all the search engines and how each of them ranked and indexed web sites. In 1998, I launched a discussion group in what was then “eGroups” called “Cre8pc Website Promotion” because in those days, we didn’t refer to it as SEO.

I became an online teacher by virtue of sharing my passion. Like others in those days, I shared SEO skills and advice in newsgroups, where I’d run into Fantomaster, and in small forums like MarketPosition, where I met Ammon Johns (a.k.a. “Black Knight”) in the late 1990s.

It was Ammon who first offered the glimpse that web site promotion and organic search engine optimization were related to web site marketing; he preferred to use the term “SEM”. He’s always thought of himself as a marketer. Understanding how search engines work just added value to his skills. He was the first niche example but was quickly followed by Ralph (a.k.a. “Fantomaster”) and his foray into cloaking. Jill and Heather had their newsletter, and others, like Detlev, provided months of discussions.

In The Old Days We Worried About Many Search Engines and Directories

Those were the days when SEO was fun as heck to do. There were lots of search engines to get client web sites into, not just one or two like today. In my day, we had 10 strong contenders and lots of minors to play with. Each of them refused to sit still for long. They changed business models constantly. Part of my work was just keeping up with these changes, passing them on to clients and making adjustments to submission campaigns. Because, you see, in the old days, we submitted web sites to search engines and directories for clients.

We tracked progress. We tracked rank. There was no Page Rank score. There were few fees for submission. There were many tools to use, including software and web-based submission software, but hand-submitters like me were in demand because we oozed the thrill of SEO. There were scams everywhere and it became one of my own personal missions to alert web site owners on what to look out for.

I Saw The Signs

There were several signs that I was going to retire from performing SEO services. Pay per click bored me to tears. Pay for inclusion was tolerable for a short time, until every search engine wanted money. Inktomi was the big player then. In 2000–2001, Alta Vista began dropping web pages by the hundreds of thousands, which sent a lot of web site owners into a panic. The other shock was Yahoo! demanding $299 per year to be added to their directory; if your site didn’t meet their guidelines, they didn’t refund the money.

I had, on my old SEO web site, a text-based version of something similar to Bruce Clay’s Search Engine Relationships chart. His was prettier to view. Both of us had to monitor the search engines to death to keep our data current for our web site visitors. He eventually won the battle, because after a while I left the industry, though I continued to consult on SEO when friends asked for help.

The Company I Worked For Trained Me in Usability and Software Testing

The move to usability was purely by accident, but once I had a taste of it, I was hooked.

I launched Cre8asiteforums in 2002 to cover web design, with a huge portion being SEO and search engine discussions. My blog followed shortly afterwards. Sadly, I took down the old Cre8pc web site with all the years of SEO related content it contained. My binders of old search engine information are in storage.

The In-between World Called “Niche”

Today I work with SEOs who have added usability and accessibility services to their own. They like knowing that I understand what they do and that I stay clear of their “side” of projects. They let me do mine. While they focus on search engines, I focus on the people who use web sites once they are found.

There are other people like me, who speak at SEO conferences and have their own businesses, but did not make the official lists of who you should know in SEO.

Christine Churchill works on PPI/PPC and landing pages. Debra O’Neil-Mastaler is the link building expert. Matt Bailey handles statistics and accessibility, Karon Thackston does copywriting, and Scottie Claiborne is fantastic with usability design. They’re all part of the High Rankings team led by Jill Whalen, and all of them are proficient in SEO/SEM as well as their added areas of expertise.

None of them seek fame and glory, but they each participate in the industry in their own way. Christine, for example, was on the original SEMPO Board. That’s pretty “A-list” in my book.

Sometimes people ask me why I still attend SEO/SEM conferences when I can. SEO is still in my blood. I write about SEO. I like to see how the industry has grown. When I started out, it seemed as though there were only 10 of us, and I was one of the quiet ones, plugging away and memorizing every nuance of search engine life. Sitting in a sea of attendees at SES Conferences leaves me in awe of how great the need has grown for what SEO/SEMs do.

It also makes me realize how valuable my own niche is, and will be in the coming years. My approach to web site usability is totally grounded in SEO practices. Much of my advice is complementary to SEO/SEM goals. When someone speaks of needing help with SEO along with usability work, I have a stable of SEO partners to recommend.

In the end, I have had the gift of time on the Internet. Those who have found niches and specialize may not fit into the tidy “A-list” wrapper but they are my kind of mentor and peer anyway.

http://www.cre8pc.com/blog/2006/01/seo-ancestors-view-of-list-seo-lists.html

Bad SEO Habits

Occasionally, I will highlight companies’ web sites that are doing things well, but usually it is a lot more fun to highlight a corporation that is doing things wrong. Wrong might be right sometimes, but here is what Google calls a bad idea:

• Avoid hidden text or hidden links.

• Don’t employ cloaking or sneaky redirects.

• Don’t send automated queries to Google.

• Don’t load pages with irrelevant words.

• Don’t create multiple pages, subdomains, or domains with substantially duplicate content.

• Avoid “doorway” pages created just for search engines, or other “cookie cutter” approaches such as affiliate programs with little or no original content.

The thing to keep in mind is that there are some really, really competitive areas of search on the internet. Generally, that competition only gets fiercer as more money is involved. For this reason, banking and finance terms like “second mortgage” are extremely costly keywords, averaging between five and seven dollars on Yahoo Search Marketing. With acquisition costs that high, you had better spend some money on organic optimization.

Well, Ditech.com did spend some time and perhaps money. The problem is they came up with a search engine no-no in the process. If you do a Google search for home equity loan, they are the 3rd result and 2nd company. Great, right? Well, not really, because they are engaging in some questionable optimization practices. If you disable JavaScript in your browser (I prefer to use the Web Developer Toolbar for Firefox), suddenly you will see a paragraph of text that is not visible when you normally visit the page.

http://www.mymotech.com/?p=40

SPAM – spamdexing story part 4

Search engine spamming - spamdexing

This type of spamming tries to manipulate search engine algorithms. As one element of SEO, webmasters submit URLs to the search engines for indexing. The search engine then sends its bots (also called spiders or crawlers) to crawl the website and read and collect the meta keywords, description, title and contents. Found links are followed, starting a new round of reading and collecting. The collected information is indexed in the engine’s database to make it searchable. After indexing is completed, the documents are ranked to determine their relevancy. On the hardware side, the search engine database is stored across thousands of servers (using clustering, load balancing, redundancy and so on) to keep user searches fast.
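The link-following step of the crawl described above can be sketched with Python's standard html.parser, here run on a static HTML string rather than a live fetch (a real crawler would fetch each collected URL in turn and repeat):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href targets, as a crawler would before queueing new URLs."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page = ('<html><body><a href="/about.html">About</a>'
        '<a href="http://example.com/">Home</a></body></html>')
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # -> ['/about.html', 'http://example.com/']
```

Each collected link would then be fetched, parsed and indexed in the same way, which is exactly the loop spamdexers try to feed with pages built for crawlers rather than people.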

Well, the main purpose of spamdexing is to increase the chance of being placed near the top of search engine results, for example on page 1 of a SERP showing 10 listings per page.

Search engines use a variety of algorithms to determine relevancy ranking. Some of these check whether the search term appears in the META keywords tag, others whether it appears in the body text of a web page. A variety of techniques are used to spamdex, including listing chosen keywords on a page in a small-point font of the same colour as the page background (rendering them invisible to humans but not to search engine web crawlers).

Search engine spammers are generally aware that the content they promote is not very useful or relevant to the ordinary internet surfer. They try to use methods that will make the website appear above more relevant websites in the search engine listings. Unfortunately for them, all known techniques have been recognized by the search engines. If you get caught (otherwise, it is business as usual), your site, or in the worst case your whole domain, will be penalized and de-indexed. Major search engines’ features information can be found here.

Known techniques

Hidden or invisible text

Masquerade the keyword and phrase text by making it the same colour as the background, using a tiny font size, hiding it within the HTML code (such as in noframes sections, ALT attributes and noscript sections), or using hidden HTML and CSS. Keep in mind that ALT text used on images is valid.

This is useful for making a page appear much more relevant and rank better. The hidden content may or may not be relevant at all: for keywords or phrases irrelevant to the site’s theme or content, the technique hides terms that already rank at the top of the SERPs, while for relevant keywords it is used to push the relevancy level higher. Obviously, this technique serves the search engine crawlers, not the visitor.
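A naive detector for the inline-style variants of hidden text described above can be sketched in a few lines. This is only an illustration of the idea, not how real engines detect hidden text (they also compare rendered colours, external CSS, positioning tricks and more):

```python
import re

# Sketch: flag inline styles that hide text, per the patterns above.
# Real detection is far more involved (external CSS, colour matching, etc.).
HIDDEN_STYLE = re.compile(
    r'style\s*=\s*"[^"]*(display\s*:\s*none|visibility\s*:\s*hidden|'
    r'font-size\s*:\s*0)', re.IGNORECASE)

def looks_hidden(html_fragment: str) -> bool:
    """True if the fragment carries an inline style that hides its text."""
    return bool(HIDDEN_STYLE.search(html_fragment))

print(looks_hidden('<div style="display:none">cheap loans cheap loans</div>'))
# -> True
print(looks_hidden('<div style="color:#333">normal paragraph</div>'))
# -> False
```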

Keyword stuffing/spamming

Repeated use of the targeted keywords or phrases to increase their frequency on a page. Early versions of indexing programs simply counted how often a keyword appeared and used that count to determine relevance levels. Modern search engines can analyze a page for keyword stuffing and determine whether the frequency is above a “normal” level.
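The frequency count that early indexers used, and that modern engines compare against a "normal" level, is easy to sketch. The 8% threshold below is a made-up illustration; real engines do not publish their cutoffs:

```python
# Sketch of a keyword-frequency check. THRESHOLD is a hypothetical
# "normal" ceiling, not any real engine's value.
THRESHOLD = 0.08

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that equal `keyword` (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

stuffed = "cheap flights cheap flights cheap flights book cheap flights now"
density = keyword_density(stuffed, "cheap")
print(round(density, 2), density > THRESHOLD)  # -> 0.4 True
```

A density of 40% for a single word is the kind of statistical outlier that even a simple check flags, which is why keyword stuffing stopped working against anything but the earliest indexers.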

Meta tags stuffing

Repeating keywords in the meta tags, such as the keywords, description and title tags, more than once, and using keywords that are unrelated to the site’s content. The meta tags themselves may also be repeated many times.

Hidden links

Putting links where visitors will not see them in order to increase link popularity.

Gateway or doorway pages

Creating low-quality web pages that contain very little content but are stuffed with keywords and phrases. They are designed to rank highly within the search results. A doorway page will generally have little more than “click here to enter” in the middle of it, since its only purpose is to funnel visitors through to the real site.

Link spamming

Link spam takes advantage of Google’s PageRank algorithm, which gives a higher ranking to a website that has more inbound links, especially relevant links from high-PageRank sites. A spammer may create multiple web sites or pages at different domain names and IP blocks that all link to each other, or that all point to one targeted site. Link farms are a related technique: pages created at different sites purely to hold link listings, with most of the links pointing to the same targeted site, creating a lot of inbound links.

Another technique is to take advantage of web applications that display hyperlinks submitted by anonymous posters or editors, such as weblog comments, spam blogs (splogs), forums, wikis and guestbooks. The classic method is using a guestbook to place links that point back to the spammer’s site; imagine the poster doing this in hundreds of guestbooks, all indexed by the search engines. In wikis, the editor puts in links that point back to his or her sites. If you browse web blogs at blogger.com, you will find many blogs full of keyword stuffing and link farming.

With links and anchor texts/keywords, a more recent ‘technique’ is Google bombing. For example, searching for miserable failure on Google brings up the official George W. Bush biography from the US White House web site; in terms of usefulness, the phrase is not relevant to the site. Google has an explanation for this. Another one is the Googlewash, where a small group of webloggers can quickly redefine terms in the eyes of a search engine (Google) and rank high in the SERPs.

Page redirects

Taking the user to another page without his or her intervention, using JavaScript, META refresh tags, CGI scripts or server-side techniques. This technique can also be used to inflate a site’s hit counts.
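Of the redirect mechanisms listed above, the META refresh variant is the easiest to spot in raw HTML, since it needs no script execution. A minimal sketch of such a check:

```python
import re

# Sketch: spot the META refresh redirects described above in raw HTML.
# Server-side and JavaScript redirects need more than a static scan.
META_REFRESH = re.compile(
    r'<meta[^>]+http-equiv\s*=\s*["\']?refresh["\']?[^>]*>', re.IGNORECASE)

def has_meta_refresh(html_page: str) -> bool:
    return bool(META_REFRESH.search(html_page))

page = '<head><meta http-equiv="refresh" content="0;url=http://example.com/"></head>'
print(has_meta_refresh(page))  # -> True
```

A zero-second delay, as in the `content="0;url=..."` example, is the classic spam signature: the visitor never sees the page the search engine indexed.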

Cloaking

Sending to a search engine a version of a web page different from what web surfers see. There are further techniques for cloaking.

Mirror websites

Hosting multiple websites, all with the same content but using different URLs. Some search engines give a higher rank to results where the keyword searched for appears in the URL.

Code swapping

Optimizing a page for top ranking using legitimate or illegitimate methods, then, once a top ranking is achieved, swapping another page in its place.

Referrer log spamming

When someone accesses a web page (the referee) by following a link from another web page (the referrer), the referee is given the address of the referrer by the person’s internet browser. Some websites keep a referrer log showing which pages link to the site. By having a robot access many sites enough times, with a message or specific address given as the referrer, that message or address comes to appear in the referrer logs of all sites that keep them. Since some search engines base the importance of a site on the number of different sites linking to it, referrer log spam can be used to increase the search engine rankings of the spammer’s sites by getting the referrer logs of many sites to link to them. Check your site stats: if you find a referrer URL that has nothing to do with your site or page’s content, believe me, you will be tempted to click the link, which is exactly what the spammer counts on.
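From the defender's side, the stats check suggested above amounts to comparing each logged referrer's host against the domains you actually expect traffic from. A sketch, where `EXPECTED_DOMAINS` is a hypothetical allow-list you would maintain for your own site:

```python
from urllib.parse import urlparse

# Sketch: flag referrers whose host is not one you expect.
# EXPECTED_DOMAINS is a hypothetical allow-list, not a real standard.
EXPECTED_DOMAINS = {"example.com", "www.example.com", "google.com"}

def suspicious_referrers(referrer_urls):
    """Return the referrer URLs whose hostname is outside the allow-list."""
    flagged = []
    for url in referrer_urls:
        host = urlparse(url).hostname or ""
        if host not in EXPECTED_DOMAINS:
            flagged.append(url)
    return flagged

log = ["http://google.com/search?q=widgets",
       "http://cheap-pills-4u.example.org/"]
print(suspicious_referrers(log))  # -> ['http://cheap-pills-4u.example.org/']
```

Anything this flags is a link you should investigate from the log rather than click, since clicking is precisely the response the spammer is fishing for.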

Tag spamming

This technique uses the tagging feature of weblogs: pages or posts from the same or different sites are tagged with the same word, whether the posts are relevant or not, in order to constantly dominate the rankings for that tag. On a service like Technorati this should be self-correcting, except when the posts are constantly refreshed or resubmitted. It can also happen in social bookmarking services such as del.icio.us.

It seems that spamdexing techniques stay quite close to legitimate search engine optimization (SEO) techniques, which do not involve trickery. Besides aiming to improve a site’s ranking and popularity, spamdexing will also boost the site’s traffic.

http://nocolourhatseoblog.tenouk.com/2006/01/08/seo-spam-spammer-search-engine-page-rank-web-traffic.html

Tips To Boost Traffic, Increase Website Visibility, And Let The World Know You Exist

Directories help users locate websites within specific categories. The difference between search engines and directories is that directories feature only vetted websites: sites that meet a certain standard and contain information of substance. One aim of using directories is to boost traffic. To achieve this, choose to be featured in large, organized directories like Yahoo, DMOZ, and LookSmart. These will drive highly targeted (subject-specific) traffic to your site and significantly increase link popularity and Google PageRank. To maximize exposure:

• Find the category that is most relevant to your website. The category and subcategories must fit the subject of your site and its purpose exactly. For example, if your site is a health-related business, it must be featured under health and not business; only then will surfers seeking health-related websites find yours. Do a keyword search relevant to your site on the directory. The directory will return categories relevant to the keywords; choose one that is suited in all aspects.

• Coin a title and description that will boost traffic, not reduce it. The title must be “dead on”: it should include the most important keyword and, if possible, begin with a letter near the start of the alphabet. The 15–25 word description should succinctly summarize the purpose or functions of your site. While being descriptive and informative, try to weave in as many keywords pertaining to your site as possible.

• Purchase keywords from one of the pay-per-click search engines or directories.

• Make the website search engine friendly.

• Pay attention to text and image content. Heavy images reduce search engine visibility.

• Use back links on appropriate directories and trade or business listings. Check out the back links used by competitors.

• Add new content regularly. This encourages visitors and search engine spiders to return. Make sure the content is keyword rich.

• Start and maintain a web log or blog. This can become an active way to boost traffic while simultaneously disseminating information.

• Market your website by printing its address on business cards, paper bags, packaging and so on. Distribute an e-zine or updates as newsletters regularly to customers and business associates.

• Consider paying search engines for improved listings and fast appraisal of your site.

• Direct content towards search engines by adopting cloaking technology.

• Run advertisements on related web sites and mailing lists.

• Adopt a reciprocal link program.

Maximize traffic by monitoring your traffic regularly. Analyze visitor movements and frequencies. Find out what is effective and what is redundant. Use cookies efficiently. Understand technology and use it to the maximum. Keep updated on new innovations and developments. Write down a workable website marketing strategy.

There are simple steps to victory: be different; make sure your website is refreshing and delivers what it promises; create a network of supporters; be honest in your business ventures; deliver on time at affordable costs; be a learner all your life, constantly innovating your website to keep stride with changing times; and carve out an exclusive niche, diversifying in directions that are relevant to your business model.

http://www.allwahm.com/index.php/2006/01/13/tips-to-boost-traffic-increase-website-visibility-and-let-the-world-know-you-exist/

Can’t We All Just Get Along? - The Battle Between SEOs and Web Designers

I have been following an interesting thread over at Search Engine Watch Forums entitled "Do Designers Hate SEO?" where forum member "glengara" began with the question of whether all-Flash sites should be used in the commercial web space. It is certainly an interesting topic and one that has been hotly debated time and time again. The SEO argues for an "optimized" site that search engines can comprehend while the web designer argues for artistic liberty and creativity.

What’s the big deal anyway? What are the potential problems that "all-Flash" sites can impose? I mean, Flash allows for so many benefits - animation, functionality, sizing, aesthetics - not to mention that Flash graphics are typically smaller in file size than their GIF and JPEG counterparts. What could possibly be wrong with that?

One problem is that they are not very search engine friendly. A site developed completely in Flash typically does not have any HTML text associated with it. Search engines cannot yet read the contents of the Flash file and therefore have difficulty understanding the context of the site. Another problem is that some Flash sites are contained within one file at one URL. There is not a variety of sub-pages that can be optimized for search engine visibility: just one page. Are these types of sites good for the commercial web and, if not, how can Flash be used in a way that is acceptable both to the person marketing the site and to the artist designing it?

In some cases an all-Flash site is perfectly fine. I’ll give you an example. I just recently saw the current version of King Kong starring Naomi Watts and Jack Black. The official movie web site is located at www.kingkongmovie.com. After getting past the splash page, you arrive at a single URL in which the entire site is contained within a single Flash file. It is an awesome site which provides movie trailers, cast and crew info, storyline, computer wallpaper & screen savers and even detailed info on all the various creepy creatures that Kong and the cast encounter on the island.

But is it search engine friendly? As far as ranking well on the content of the site, probably not. But who cares? A site such as this is going to draw people because they either saw the movie or are going to see it. They are going to search for something like "King Kong" or "King Kong the movie", with which they will easily find the site. Now, if they search for something like "movies with giant gorillas", they may have trouble finding it, but is that really going to happen? Most likely not.

There are many sites that fit this scenario, whether for a major motion picture, a popular music artist or group, or even a popular brand of product that consumers use. They may not be "search engine friendly" in the sense of being optimized for search engines, but they are still easy to find via search engines simply due to the strength of their brand. These sites are perfectly suitable in the commercial web environment, in my opinion, because they do not necessarily need to be optimized for search engine positioning. As long as their brand is searchable, they will do well.

Now what about a local architect? Is an all-Flash site the best option for them if they want to attract any search engine traffic? Not very likely. Take Architekton, for example, a local Arizona-based architecture firm whose site is not only in Flash but, like the King Kong example, all contained within one single file after you pass the splash page. Their site contains information on the services they provide, projects they have done, their awards, etc., but there is nothing within the site that would help a search engine recognize that they are an architect based in Phoenix, Arizona. This would not be a big deal if they were a well-known brand, as in the case of the King Kong movie. However, they are not, and I imagine they lose out on a lot of potential traffic from people searching for architecture firms in Arizona.

So can a company have their cake and eat it too? In other words, can a site be optimized for search engine visibility and at the same time, enjoy the benefits that Flash can bring? Certainly! There are a few solutions that we will look at. In the thread mentioned at the beginning of this article, forum member "seomike" posts the following comment:

Flash is how I got into SEO. My first programming language was ActionScript. I couldn’t get my Flash sites to rank for squat. I figured there were 3 options:

1. Cloak (way too risky)
2. Build an html version (way too expensive / time consuming)
3. Go hybrid with Flash elements in a table or div/css holding it all together (just right)

Although some forms of cloaking have been accepted or overlooked by search engines, I would say that in general this is much too risky, as cloaking violates most search engines’ guidelines. This leaves the option of either building an html version of your all-Flash site or incorporating Flash elements into an html design. I actually prefer the latter: an html site that includes Flash elements such as animation, navigation, video, order forms, etc. In this manner you get the best of both worlds and do not have to worry about keeping two sites up to date.

Here are a few examples of html sites that use Flash elements: Alpine Engineered Products (Flash header and animated navigation with sound), ClickTracks (Flash animated graphics and product tutorials) and ETEK Services (Flash header and navigation).

What if a site such as the Architekton example above refuses any of these solutions? Can anything be done to help it gain some visibility in the search engines? Well, for starters, one can make sure the title tag is properly optimized and a meta description tag is added. If the site is one Flash file with no sub-pages, then it is pretty much limited to that one page and the very few key phrases it can target. If the site has several pages, then at least there is a larger number of title tags and meta description tags to work with.

Content can also be included in a <noscript> tag. Placing content related to the site or page within an opening and closing <noscript> tag will hide it from end users but leave it visible to search engines. Technically this is "hiding text" and borders on violating search engine guidelines. Many will call it a spam technique, but my personal feeling is that as long as the text is not an obvious attempt to stuff keywords or to place content totally unrelated to the site, it is okay. It is not the best method; html text that users can see would be better. It is more of a band-aid for situations where the site owner simply will not budge on any other solution. I also suspect that while search engines will recognize the text, they do not give it as much weight as html text that is actually visible on the page.
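To see what a text-only crawler would pick up from such a page, you can extract the <noscript> content directly. A minimal sketch (the page string and its wording are made-up examples, not Architekton's actual markup):

```python
import re

# Sketch: pull out the <noscript> content that a text-only crawler
# would see but a JavaScript-enabled browser would hide.
NOSCRIPT = re.compile(r'<noscript>(.*?)</noscript>',
                      re.IGNORECASE | re.DOTALL)

def noscript_text(html_page: str) -> list:
    """Return the text found inside every <noscript> block."""
    return NOSCRIPT.findall(html_page)

page = ('<body><object>flash-movie.swf</object>'
        '<noscript>Phoenix architecture firm portfolio</noscript></body>')
print(noscript_text(page))  # -> ['Phoenix architecture firm portfolio']
```

Running a check like this on your own Flash pages shows exactly which words an engine could associate with the page, which is the whole point of the band-aid described above.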

In summary, Flash can be used in sites, either developing complete sites with Flash or using Flash inserts. If you have a strong brand recognition such as in the King Kong Movie site referenced above, then maybe you do not need to pay much attention to optimizing your site for prime search engine visibility. However if your brand is not well known and you are looking to gather some of the search traffic, then Flash has to be used intelligently.

In the original thread I mentioned above, "Danny Sullivan" chimes in with an excellent comment:

In reality, designers really need to understand that search engines are like a third browser — and in fact a far more popular browser, used by more people than use Firefox. They will spend tons of time making sure a site works for IE or Firefox, even Opera. But no time to make sure that the browsers of search engines are going to be OK with it?

Fellow moderator "Chris D" also provides an excellent example:

Web design is actually closer to architecture, than magazine cover design, in terms of accessibility.

Imagine a world where architects designed buildings just to be cool and edgy - and totally ignored physical accessibility issues…..

As more and more SEOs look beyond search engine rankings and begin to pay attention to web site aesthetics, usability, conversions, and so on, and as more and more designers begin to recognize the power of search engine marketing, I think debates like this will become less likely. Rather, the two types of professionals, both of which are necessary for a successful web site, will come to terms and work together to produce a quality site that is fully marketable online.

http://www.searchengineguide.com/wallace/2006/0116_dw1.html

Common SEO lament: Why is my site not in The Index?



Our firm, SEM Experts, has been taking on a few clients with interesting SEO challenges: older domains that have few but highly relevant backlinks with high PR, and yet are still not in Google's index. It's definitely a common problem, and I wanted to make sure I addressed it in my Search Engine Optimization Blog so that I could help these clients, and the wider world, before we make contact. Here are a few of the most common causes. Not all, mind you, as Google runs on a complex proprietary algorithm, but these are a few we've tested and of which we can be fairly certain:

1. What is the provenance of the domain? If the domain was owned by someone else in the past and was misused, you may have to ditch it.

2. What is the provenance of the IP your site is on? This is one of the harder problems to pin down, but if you're stumped, change your IP. Getting off the shared Class C block entirely is even better.

3. Who else is on your IP? Since more than one domain can reside on a given IP, you need to ensure you have no "unsavory neighbors". Don't be cheap: get your own IP. Be in control of your own site's success or failure. If you don't have a close working relationship with your provider, or can't establish one, get a new provider. Nothing is more dangerous to the success of a website than shoddy or shady hosting practices. Look here for reliable Tomcat, PHP, ASP.NET, or ColdFusion hosting.

4. To whom are you linking? When was the last time you reviewed all of the outbound links on your site? My company offers a monitoring service that ensures any link trades you make remain the same as when you agreed to them in the first place. One technique some disreputable internet spammers use is to obtain a reciprocal link from you with a seemingly reputable site, then switch the page content once they've achieved enough inbound links for their purposes. Another is to make the trade and use a cloaking script that reads the referrer to decide which page to show you when you check up on the link from your own site; all other traffic sees their real, ugly site. The best solution for curtailing this type of link fraud is to hire an SEO professional.
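A minimal sketch of how such a monitoring check might spot naive referrer- or user-agent-based cloaking: fetch the traded page twice, once posing as a search bot and once as an ordinary visitor arriving from your own site, and compare the bodies. The user-agent strings and function names here are illustrative assumptions, not any particular vendor's tool:

```python
import hashlib
import urllib.request

BOT_UA = "Googlebot/2.1 (+http://www.google.com/bot.html)"
VISITOR_UA = "Mozilla/5.0"

def fetch(url: str, user_agent: str, referer: str = "") -> bytes:
    # Request the page while posing as the given client identity.
    headers = {"User-Agent": user_agent}
    if referer:
        headers["Referer"] = referer
    req = urllib.request.Request(url, headers=headers)
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read()

def is_cloaked(bot_body: bytes, visitor_body: bytes) -> bool:
    # Different bodies for the two identities suggest the server is
    # deciding what to serve based on who appears to be asking.
    return hashlib.sha256(bot_body).hexdigest() != hashlib.sha256(visitor_body).hexdigest()
```

Usage would look like `is_cloaked(fetch(url, BOT_UA), fetch(url, VISITOR_UA, referer="http://www.example.com/"))`. Note this only catches cloakers keying off headers; IP-based cloaking, which checks the requester's address against known bot ranges, will not be exposed this way.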

http://www.caseyweed.com/common-seo-lament-why-is-my-site-not-in-the-index.html

What Is Search Engine Spam



Of course everyone knows what e-mail spam is. But what is search engine spam? Basically, search engine spam is whatever the search engines say it is. In short, Google says this: "Trying to deceive (spam) our web crawler by means of hidden text, deceptive cloaking or doorway pages compromises the quality of our results and degrades the search experience for everyone. We think that's a bad thing." If you come across a site that you think is in violation of the search engine guidelines, you can use these handy links to report it: Google Spam Report, Yahoo Spam Report.

If you're reluctant to report search engine spam, or not sure whether the site is spamming the SEs, you can submit what you've found to Engine-Spam.com, a web site that will investigate your spam complaint and, if they agree, file a report for you with the appropriate search engine (Google, AltaVista, Yahoo, MSN, Teoma, Ask).

Engine-Spam.com
http://www.spam-whackers.com/blog/2006/01/19/4/

Making Money Online



Making Money Online can be extremely difficult these days if you don't have the right resources. Making Money on the Internet is the top Home-Based Business today. Getting started is simple. One of the first steps is Building a Website. Building a Website can be quite difficult if it is your first time, but there are many resources out there that can help you do this. I recommend buying the book "Teach Yourself HTML Visually"; I find it quite useful on a day-to-day basis. Registering your Domain Names and arranging Website Hosting would be the next step. I have provided some links to companies that offer great service at extremely low cost for what they have to offer. One of the key secrets to Making Money Online is your Keywords. Keywords are located in meta tags between the head tags of each page. You also need to work your Keywords into your text for more visibility to Search Engine spiders. This is called Search Engine Optimization, and it needs to be done before Search Engine Submission.
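Since the post says keywords live in meta tags between the head tags, here is a small sketch, using only Python's standard-library HTML parser, of pulling those keywords back out of a page for an audit; the class name is made up for illustration:

```python
from html.parser import HTMLParser

class MetaKeywordParser(HTMLParser):
    """Collect the comma-separated values of <meta name="keywords"> tags."""

    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "keywords":
            self.keywords += [k.strip() for k in attrs.get("content", "").split(",")]

page = '<html><head><meta name="keywords" content="make money online, home business"></head></html>'
p = MetaKeywordParser()
p.feed(page)
print(p.keywords)  # prints: ['make money online', 'home business']
```

Running this against your own pages is a quick way to confirm the tags actually contain the phrases you think you are targeting.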

Here is some Free Information On How To Get Started Making Money On The Internet. Nobody I know has ever learned how to Make Quick Money On The Internet; it actually requires work to learn How To Make Money On The Internet. Making Money is not easy in the beginning. It usually takes a couple of weeks before you start seeing a decent income. The Home Business Online Opportunity is the way to go. All the hard work of setting everything up in the beginning is finally starting to pay off: I actually only work a few hours a day, and I am enjoying the finer things in life. There are many Home Based Business Income Opportunities today. Whatever you decide to do and market, I recommend using Google AdSense on all your pages. This is great income if you can generate the traffic, but do not rely on it as your only source. There are many affiliate programs out there that will also generate an income from Make Money Business Opportunities.

Make Quick Money on the Internet? I do not think so, though I am sure it has been done. There are many ways to Make Money Online. How To Get Rich On The Internet? This can happen if you stick with it and keep learning these Business Opportunities From Home. Again, there is more than one Business Online Opportunity. If anyone is reading this post, I am sure you are aware that I am stuffing it with Keywords. I do not like to use Search Engine cloaking; I think it is completely wrong. Many people use these tools to get to the top, but I do not think they are required. There are better strategies. Everyone is trying to Make Money Online. It is all about Money: How To Make Quick Money On The Internet.

Making Money Online is simple if you know what market to go after. Some people say go after ideas that have been overdone, and yes, you can Make Money doing this, but I believe people should go after their own ideas and continuously come up with something new, or something reinvented but better. Making Money Online can be quite challenging, but when you start to see your first checks come in, WOW, it feels good. No more 50+ hour weeks. I work about 20 to 30 hours a week making ten times more than I did working for someone else. Let me tell you, I love it. I could work only 10 hours a week, but I enjoy trying many different ways of Making Money Online. This is the Work Home Business Opportunity of a lifetime.

If someone new is actually reading this post, there are links on the home page that will help you get started. I personally use all of the links. I can rely on these companies at all times; there is no wait time if I need them, and if there is, it is usually not more than 3 minutes, unlike some other companies where the wait runs to hours. Well, I hope whoever is reading this post is learning something. The secret to top ratings, besides sponsored links, is in your Keywords. Making Money Online is simple if you know how to use the infrastructure. Income Make Business Opportunities is the way to go. There are secrets. If you follow the basic steps from my home page at www.MasterLinkSolutions.com, you yourself will learn the 22 Money Making On The Internet. I mistyped that, oh well. If you are interested to learn, I will teach for free. Just leave me a contact resource via the email at the top of this home page. You are free to leave a post or comment here if you want. Making Money Online is rewarding, but it does require work. How To Make Money On The Internet.

http://www.masterlinksolutions.com/craftycat/?p=18
